00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v23.11" build number 634 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3294 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.036 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.037 The recommended git tool is: git 00:00:00.037 using credential 00000000-0000-0000-0000-000000000002 00:00:00.039 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.065 Fetching changes from the remote Git repository 00:00:00.067 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.094 Using shallow fetch with depth 1 00:00:00.094 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.094 > git --version # timeout=10 00:00:00.121 > git --version # 'git version 2.39.2' 00:00:00.121 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.136 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.136 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.808 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.819 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.830 Checking out Revision f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 (FETCH_HEAD) 00:00:02.830 > git config core.sparsecheckout # timeout=10 00:00:02.839 > git read-tree -mu HEAD # timeout=10 00:00:02.855 > git checkout -f f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=5 00:00:02.895 Commit message: "spdk-abi-per-patch: fix check-so-deps-docker-autotest parameters" 00:00:02.895 > git rev-list --no-walk f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=10 00:00:03.004 [Pipeline] Start of Pipeline 00:00:03.017 [Pipeline] library 00:00:03.018 Loading library shm_lib@master 00:00:03.018 Library shm_lib@master is cached. Copying from home. 00:00:03.038 [Pipeline] node 00:00:03.049 Running on WFP19 in /var/jenkins/workspace/crypto-phy-autotest 00:00:03.050 [Pipeline] { 00:00:03.060 [Pipeline] catchError 00:00:03.062 [Pipeline] { 00:00:03.073 [Pipeline] wrap 00:00:03.082 [Pipeline] { 00:00:03.087 [Pipeline] stage 00:00:03.089 [Pipeline] { (Prologue) 00:00:03.306 [Pipeline] sh 00:00:03.587 + logger -p user.info -t JENKINS-CI 00:00:03.604 [Pipeline] echo 00:00:03.605 Node: WFP19 00:00:03.613 [Pipeline] sh 00:00:03.905 [Pipeline] setCustomBuildProperty 00:00:03.919 [Pipeline] echo 00:00:03.921 Cleanup processes 00:00:03.929 [Pipeline] sh 00:00:04.213 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.213 863957 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.247 [Pipeline] sh 00:00:04.523 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.523 ++ grep -v 'sudo pgrep' 00:00:04.523 ++ awk '{print $1}' 00:00:04.523 + sudo kill -9 00:00:04.523 + true 00:00:04.538 [Pipeline] cleanWs 00:00:04.548 [WS-CLEANUP] Deleting project workspace... 00:00:04.548 [WS-CLEANUP] Deferred wipeout is used... 
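The process-cleanup step traced above reduces to a short shell pipeline: list leftover processes still running out of the workspace, drop the pgrep invocation itself, and kill whatever remains. A minimal standalone sketch follows; it is an illustration of the traced commands, not the job's actual helper script, and the trailing "|| true" mirrors the "+ true" in the trace so an empty match does not fail the build.

    # Sketch of the cleanup step above (illustrative, not the pipeline's own script).
    ws=/var/jenkins/workspace/crypto-phy-autotest
    # pgrep -af prints "pid cmdline"; filter out the pgrep command itself, keep the pids.
    pids=$(sudo pgrep -af "$ws/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    # Kill leftovers if any were found; tolerate the no-match case.
    [ -z "$pids" ] || sudo kill -9 $pids || true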
00:00:04.553 [WS-CLEANUP] done 00:00:04.557 [Pipeline] setCustomBuildProperty 00:00:04.570 [Pipeline] sh 00:00:04.846 + sudo git config --global --replace-all safe.directory '*' 00:00:04.925 [Pipeline] httpRequest 00:00:04.953 [Pipeline] echo 00:00:04.955 Sorcerer 10.211.164.101 is alive 00:00:04.965 [Pipeline] httpRequest 00:00:04.969 HttpMethod: GET 00:00:04.970 URL: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:04.970 Sending request to url: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:04.987 Response Code: HTTP/1.1 200 OK 00:00:04.987 Success: Status code 200 is in the accepted range: 200,404 00:00:04.988 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:09.175 [Pipeline] sh 00:00:09.459 + tar --no-same-owner -xf jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz 00:00:09.475 [Pipeline] httpRequest 00:00:09.503 [Pipeline] echo 00:00:09.505 Sorcerer 10.211.164.101 is alive 00:00:09.514 [Pipeline] httpRequest 00:00:09.519 HttpMethod: GET 00:00:09.520 URL: http://10.211.164.101/packages/spdk_d005e023bd514d7d48470775331498120af1a8d8.tar.gz 00:00:09.520 Sending request to url: http://10.211.164.101/packages/spdk_d005e023bd514d7d48470775331498120af1a8d8.tar.gz 00:00:09.539 Response Code: HTTP/1.1 200 OK 00:00:09.539 Success: Status code 200 is in the accepted range: 200,404 00:00:09.540 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_d005e023bd514d7d48470775331498120af1a8d8.tar.gz 00:01:22.806 [Pipeline] sh 00:01:23.085 + tar --no-same-owner -xf spdk_d005e023bd514d7d48470775331498120af1a8d8.tar.gz 00:01:27.289 [Pipeline] sh 00:01:27.571 + git -C spdk log --oneline -n5 00:01:27.571 d005e023b raid: fix empty slot not updated in sb after resize 00:01:27.571 f41dbc235 nvme: always specify CC_CSS_NVM when CAP_CSS_IOCS is not set 00:01:27.571 8ee2672c4 test/bdev: Add test for resized RAID with superblock 00:01:27.571 19f5787c8 raid: skip configured base bdevs in sb examine 00:01:27.571 3b9baa5f8 bdev/raid1: Support resize when increasing the size of base bdevs 00:01:27.590 [Pipeline] withCredentials 00:01:27.602 > git --version # timeout=10 00:01:27.615 > git --version # 'git version 2.39.2' 00:01:27.632 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:27.635 [Pipeline] { 00:01:27.645 [Pipeline] retry 00:01:27.647 [Pipeline] { 00:01:27.665 [Pipeline] sh 00:01:27.947 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:01:27.958 [Pipeline] } 00:01:27.981 [Pipeline] // retry 00:01:27.987 [Pipeline] } 00:01:28.007 [Pipeline] // withCredentials 00:01:28.018 [Pipeline] httpRequest 00:01:28.037 [Pipeline] echo 00:01:28.038 Sorcerer 10.211.164.101 is alive 00:01:28.049 [Pipeline] httpRequest 00:01:28.053 HttpMethod: GET 00:01:28.054 URL: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:28.055 Sending request to url: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:28.062 Response Code: HTTP/1.1 200 OK 00:01:28.063 Success: Status code 200 is in the accepted range: 200,404 00:01:28.063 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:39.648 [Pipeline] sh 00:01:39.929 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:41.841 [Pipeline] sh 00:01:42.122 + git -C dpdk log --oneline -n5 00:01:42.122 
eeb0605f11 version: 23.11.0 00:01:42.122 238778122a doc: update release notes for 23.11 00:01:42.122 46aa6b3cfc doc: fix description of RSS features 00:01:42.122 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:42.122 7e421ae345 devtools: support skipping forbid rule check 00:01:42.132 [Pipeline] } 00:01:42.149 [Pipeline] // stage 00:01:42.159 [Pipeline] stage 00:01:42.161 [Pipeline] { (Prepare) 00:01:42.182 [Pipeline] writeFile 00:01:42.200 [Pipeline] sh 00:01:42.482 + logger -p user.info -t JENKINS-CI 00:01:42.494 [Pipeline] sh 00:01:42.776 + logger -p user.info -t JENKINS-CI 00:01:42.789 [Pipeline] sh 00:01:43.089 + cat autorun-spdk.conf 00:01:43.089 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:43.089 SPDK_TEST_BLOCKDEV=1 00:01:43.089 SPDK_TEST_ISAL=1 00:01:43.089 SPDK_TEST_CRYPTO=1 00:01:43.089 SPDK_TEST_REDUCE=1 00:01:43.089 SPDK_TEST_VBDEV_COMPRESS=1 00:01:43.089 SPDK_RUN_UBSAN=1 00:01:43.089 SPDK_TEST_ACCEL=1 00:01:43.089 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:43.089 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:43.096 RUN_NIGHTLY=1 00:01:43.102 [Pipeline] readFile 00:01:43.125 [Pipeline] withEnv 00:01:43.127 [Pipeline] { 00:01:43.142 [Pipeline] sh 00:01:43.425 + set -ex 00:01:43.426 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:01:43.426 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:43.426 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:43.426 ++ SPDK_TEST_BLOCKDEV=1 00:01:43.426 ++ SPDK_TEST_ISAL=1 00:01:43.426 ++ SPDK_TEST_CRYPTO=1 00:01:43.426 ++ SPDK_TEST_REDUCE=1 00:01:43.426 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:43.426 ++ SPDK_RUN_UBSAN=1 00:01:43.426 ++ SPDK_TEST_ACCEL=1 00:01:43.426 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:43.426 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:43.426 ++ RUN_NIGHTLY=1 00:01:43.426 + case $SPDK_TEST_NVMF_NICS in 00:01:43.426 + DRIVERS= 00:01:43.426 + [[ -n '' ]] 00:01:43.426 + exit 0 00:01:43.435 [Pipeline] } 00:01:43.454 [Pipeline] // withEnv 00:01:43.460 [Pipeline] } 00:01:43.477 [Pipeline] // stage 00:01:43.487 [Pipeline] catchError 00:01:43.489 [Pipeline] { 00:01:43.504 [Pipeline] timeout 00:01:43.505 Timeout set to expire in 1 hr 0 min 00:01:43.507 [Pipeline] { 00:01:43.522 [Pipeline] stage 00:01:43.525 [Pipeline] { (Tests) 00:01:43.541 [Pipeline] sh 00:01:43.823 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:43.823 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:01:43.823 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:43.823 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:43.824 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:43.824 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:43.824 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:43.824 + [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:43.824 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:43.824 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:43.824 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:43.824 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:43.824 + source /etc/os-release 00:01:43.824 ++ NAME='Fedora Linux' 00:01:43.824 ++ VERSION='38 (Cloud Edition)' 00:01:43.824 ++ ID=fedora 00:01:43.824 ++ VERSION_ID=38 00:01:43.824 ++ VERSION_CODENAME= 00:01:43.824 ++ PLATFORM_ID=platform:f38 00:01:43.824 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:43.824 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:43.824 ++ LOGO=fedora-logo-icon 00:01:43.824 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:43.824 ++ HOME_URL=https://fedoraproject.org/ 00:01:43.824 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:43.824 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:43.824 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:43.824 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:43.824 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:43.824 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:43.824 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:43.824 ++ SUPPORT_END=2024-05-14 00:01:43.824 ++ VARIANT='Cloud Edition' 00:01:43.824 ++ VARIANT_ID=cloud 00:01:43.824 + uname -a 00:01:43.824 Linux spdk-wfp-19 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:43.824 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:48.019 Hugepages 00:01:48.019 node hugesize free / total 00:01:48.019 node0 1048576kB 0 / 0 00:01:48.019 node0 2048kB 0 / 0 00:01:48.019 node1 1048576kB 0 / 0 00:01:48.019 node1 2048kB 0 / 0 00:01:48.019 00:01:48.019 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:48.019 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:48.019 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:48.019 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:48.019 + rm -f /tmp/spdk-ld-path 00:01:48.019 + source autorun-spdk.conf 00:01:48.019 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:48.019 ++ SPDK_TEST_BLOCKDEV=1 00:01:48.019 ++ SPDK_TEST_ISAL=1 00:01:48.019 ++ SPDK_TEST_CRYPTO=1 00:01:48.019 ++ SPDK_TEST_REDUCE=1 00:01:48.019 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:48.019 ++ SPDK_RUN_UBSAN=1 00:01:48.019 ++ SPDK_TEST_ACCEL=1 00:01:48.019 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:48.019 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:48.019 ++ RUN_NIGHTLY=1 00:01:48.019 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:48.019 + [[ -n '' ]] 00:01:48.019 + sudo git config --global --add safe.directory 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:48.019 + for M in /var/spdk/build-*-manifest.txt 00:01:48.019 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:48.019 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:48.019 + for M in /var/spdk/build-*-manifest.txt 00:01:48.019 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:48.019 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:48.019 ++ uname 00:01:48.019 + [[ Linux == \L\i\n\u\x ]] 00:01:48.019 + sudo dmesg -T 00:01:48.019 + sudo dmesg --clear 00:01:48.019 + dmesg_pid=865125 00:01:48.019 + [[ Fedora Linux == FreeBSD ]] 00:01:48.019 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:48.019 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:48.019 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:48.019 + [[ -x /usr/src/fio-static/fio ]] 00:01:48.019 + export FIO_BIN=/usr/src/fio-static/fio 00:01:48.019 + FIO_BIN=/usr/src/fio-static/fio 00:01:48.019 + sudo dmesg -Tw 00:01:48.019 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:48.019 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:48.019 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:48.019 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:48.019 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:48.019 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:48.019 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:48.019 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:48.019 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:48.019 Test configuration: 00:01:48.019 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:48.019 SPDK_TEST_BLOCKDEV=1 00:01:48.019 SPDK_TEST_ISAL=1 00:01:48.019 SPDK_TEST_CRYPTO=1 00:01:48.019 SPDK_TEST_REDUCE=1 00:01:48.019 SPDK_TEST_VBDEV_COMPRESS=1 00:01:48.019 SPDK_RUN_UBSAN=1 00:01:48.019 SPDK_TEST_ACCEL=1 00:01:48.019 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:48.019 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:48.019 RUN_NIGHTLY=1 06:18:01 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:48.019 06:18:01 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:48.019 06:18:01 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:48.019 06:18:01 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:48.019 06:18:01 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.019 06:18:01 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.019 06:18:01 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.019 06:18:01 -- paths/export.sh@5 -- $ export PATH 00:01:48.019 06:18:01 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.019 06:18:01 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:48.019 06:18:01 -- common/autobuild_common.sh@447 -- $ date +%s 00:01:48.019 06:18:01 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721881081.XXXXXX 00:01:48.019 06:18:01 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721881081.qBRvgR 00:01:48.019 06:18:01 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:01:48.019 06:18:01 -- common/autobuild_common.sh@453 -- $ '[' -n v23.11 ']' 00:01:48.019 06:18:01 -- common/autobuild_common.sh@454 -- $ dirname /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:48.019 06:18:01 -- common/autobuild_common.sh@454 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk' 00:01:48.019 06:18:01 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:48.019 06:18:01 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:48.019 06:18:01 -- common/autobuild_common.sh@463 -- $ get_config_params 00:01:48.019 06:18:01 -- common/autotest_common.sh@398 -- $ xtrace_disable 00:01:48.019 06:18:01 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.020 06:18:01 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build' 00:01:48.020 06:18:01 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:01:48.020 06:18:01 -- pm/common@17 -- $ local monitor 00:01:48.020 06:18:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:48.020 06:18:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:48.020 06:18:01 -- pm/common@21 -- $ date +%s 00:01:48.020 06:18:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:48.020 06:18:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:48.020 06:18:01 -- pm/common@21 -- $ date +%s 00:01:48.020 06:18:01 -- pm/common@25 -- $ sleep 1 00:01:48.020 06:18:01 -- pm/common@21 -- $ date +%s 00:01:48.020 06:18:01 -- pm/common@21 -- $ date +%s 00:01:48.020 06:18:01 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721881081 00:01:48.020 06:18:01 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721881081 00:01:48.020 06:18:01 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721881081 00:01:48.020 06:18:01 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721881081 00:01:48.020 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721881081_collect-vmstat.pm.log 00:01:48.020 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721881081_collect-cpu-load.pm.log 00:01:48.020 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721881081_collect-cpu-temp.pm.log 00:01:48.020 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721881081_collect-bmc-pm.bmc.pm.log 00:01:48.957 06:18:02 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:01:48.957 06:18:02 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:48.957 06:18:02 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:48.957 06:18:02 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:48.957 06:18:02 -- spdk/autobuild.sh@16 -- $ date -u 00:01:48.957 Thu Jul 25 04:18:02 AM UTC 2024 00:01:48.957 06:18:02 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:48.957 v24.09-pre-318-gd005e023b 00:01:48.957 06:18:02 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:48.957 06:18:02 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:48.957 06:18:02 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:48.957 06:18:02 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:01:48.957 06:18:02 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:48.957 06:18:02 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.957 ************************************ 00:01:48.957 START TEST ubsan 00:01:48.957 ************************************ 00:01:48.957 06:18:02 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:01:48.957 using ubsan 00:01:48.957 00:01:48.957 real 0m0.000s 00:01:48.957 user 0m0.000s 00:01:48.957 sys 0m0.000s 00:01:48.957 06:18:02 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:01:48.957 06:18:02 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:48.957 ************************************ 00:01:48.957 END TEST ubsan 00:01:48.957 ************************************ 00:01:48.957 06:18:02 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:01:48.957 06:18:02 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:48.957 06:18:02 -- common/autobuild_common.sh@439 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:48.957 06:18:02 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:01:48.957 06:18:02 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:48.957 06:18:02 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.957 
************************************ 00:01:48.957 START TEST build_native_dpdk 00:01:48.957 ************************************ 00:01:48.957 06:18:02 build_native_dpdk -- common/autotest_common.sh@1125 -- $ _build_native_dpdk 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:48.957 06:18:02 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/crypto-phy-autotest/dpdk 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/dpdk ]] 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/crypto-phy-autotest/dpdk log --oneline -n 5 00:01:48.958 eeb0605f11 version: 23.11.0 00:01:48.958 238778122a doc: update release notes for 23.11 00:01:48.958 46aa6b3cfc doc: fix description of RSS features 00:01:48.958 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:48.958 7e421ae345 devtools: support skipping forbid rule check 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 1 -eq 1 ]] 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@104 -- $ intel_ipsec_mb_ver=v0.54 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@105 -- $ intel_ipsec_mb_drv=crypto/aesni_mb 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@106 -- $ intel_ipsec_lib= 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@107 -- $ ge 23.11.0 21.11.0 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '>=' 21.11.0 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=>=' 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@361 -- 
$ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 23 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@350 -- $ local d=23 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@352 -- $ echo 23 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=23 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:01:48.958 06:18:02 build_native_dpdk -- scripts/common.sh@364 -- $ return 0 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@112 -- $ intel_ipsec_mb_ver=v1.0 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@113 -- $ intel_ipsec_mb_drv=crypto/ipsec_mb 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@114 -- $ intel_ipsec_lib=lib 00:01:48.958 06:18:02 build_native_dpdk -- common/autobuild_common.sh@116 -- $ git clone --branch v1.0 --depth 1 https://github.com/intel/intel-ipsec-mb.git /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb 00:01:48.958 Cloning into '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb'... 00:01:50.346 Note: switching to 'a1a289dabb23be78d6531de481ba6a417c67b0a5'. 00:01:50.346 00:01:50.346 You are in 'detached HEAD' state. You can look around, make experimental 00:01:50.346 changes and commit them, and you can discard any commits you make in this 00:01:50.346 state without impacting any branches by switching back to a branch. 00:01:50.346 00:01:50.346 If you want to create a new branch to retain commits you create, you may 00:01:50.346 do so (now or later) by using -c with the switch command. 
Example: 00:01:50.346 00:01:50.346 git switch -c 00:01:50.346 00:01:50.346 Or undo this operation with: 00:01:50.346 00:01:50.346 git switch - 00:01:50.346 00:01:50.346 Turn off this advice by setting config variable advice.detachedHead to false 00:01:50.346 00:01:50.606 06:18:03 build_native_dpdk -- common/autobuild_common.sh@117 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb 00:01:50.606 06:18:03 build_native_dpdk -- common/autobuild_common.sh@118 -- $ make -j112 all SHARED=y EXTRA_CFLAGS=-fPIC 00:01:50.606 make -C lib 00:01:50.606 make[1]: Entering directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:01:51.174 mkdir obj 00:01:51.434 nasm -MD obj/aes_keyexp_128.d -MT obj/aes_keyexp_128.o -o obj/aes_keyexp_128.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_keyexp_128.asm 00:01:51.434 nasm -MD obj/aes_keyexp_192.d -MT obj/aes_keyexp_192.o -o obj/aes_keyexp_192.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_keyexp_192.asm 00:01:51.434 nasm -MD obj/aes_keyexp_256.d -MT obj/aes_keyexp_256.o -o obj/aes_keyexp_256.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_keyexp_256.asm 00:01:51.434 nasm -MD obj/aes_cmac_subkey_gen.d -MT obj/aes_cmac_subkey_gen.o -o obj/aes_cmac_subkey_gen.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_cmac_subkey_gen.asm 00:01:51.434 nasm -MD obj/save_xmms.d -MT obj/save_xmms.o -o obj/save_xmms.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/save_xmms.asm 00:01:51.434 nasm -MD obj/clear_regs_mem_fns.d -MT obj/clear_regs_mem_fns.o -o obj/clear_regs_mem_fns.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/clear_regs_mem_fns.asm 00:01:51.434 nasm -MD obj/const.d -MT obj/const.o -o obj/const.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/const.asm 00:01:51.434 nasm -MD obj/aes128_ecbenc_x3.d -MT obj/aes128_ecbenc_x3.o -o obj/aes128_ecbenc_x3.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes128_ecbenc_x3.asm 00:01:51.434 nasm -MD obj/zuc_common.d -MT obj/zuc_common.o -o obj/zuc_common.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/zuc_common.asm 00:01:51.434 nasm -MD obj/wireless_common.d -MT obj/wireless_common.o -o obj/wireless_common.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/wireless_common.asm 00:01:51.434 nasm -MD obj/constant_lookup.d -MT obj/constant_lookup.o -o obj/constant_lookup.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/constant_lookup.asm 00:01:51.435 nasm -MD obj/crc32_refl_const.d -MT obj/crc32_refl_const.o -o obj/crc32_refl_const.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/crc32_refl_const.asm 00:01:51.435 nasm -MD obj/crc32_const.d -MT obj/crc32_const.o -o obj/crc32_const.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/crc32_const.asm 00:01:51.435 nasm -MD obj/poly1305.d -MT obj/poly1305.o -o obj/poly1305.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ 
-DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/poly1305.asm 00:01:51.435 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/chacha20_poly1305.c -o obj/chacha20_poly1305.o 00:01:51.435 ld -r -z ibt -z shstk -o obj/save_xmms.o.tmp obj/save_xmms.o 00:01:51.435 ld -r -z ibt -z shstk -o obj/const.o.tmp obj/const.o 00:01:51.435 nasm -MD obj/aes128_cbc_dec_by4_sse_no_aesni.d -MT obj/aes128_cbc_dec_by4_sse_no_aesni.o -o obj/aes128_cbc_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cbc_dec_by4_sse_no_aesni.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/clear_regs_mem_fns.o.tmp obj/clear_regs_mem_fns.o 00:01:51.435 nasm -MD obj/aes192_cbc_dec_by4_sse_no_aesni.d -MT obj/aes192_cbc_dec_by4_sse_no_aesni.o -o obj/aes192_cbc_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes192_cbc_dec_by4_sse_no_aesni.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/wireless_common.o.tmp obj/wireless_common.o 00:01:51.435 ld -r -z ibt -z shstk -o obj/crc32_refl_const.o.tmp obj/crc32_refl_const.o 00:01:51.435 mv obj/save_xmms.o.tmp obj/save_xmms.o 00:01:51.435 mv obj/const.o.tmp obj/const.o 00:01:51.435 nasm -MD obj/aes256_cbc_dec_by4_sse_no_aesni.d -MT obj/aes256_cbc_dec_by4_sse_no_aesni.o -o obj/aes256_cbc_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cbc_dec_by4_sse_no_aesni.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/crc32_const.o.tmp obj/crc32_const.o 00:01:51.435 mv obj/clear_regs_mem_fns.o.tmp obj/clear_regs_mem_fns.o 00:01:51.435 nasm -MD obj/aes_cbc_enc_128_x4_no_aesni.d -MT obj/aes_cbc_enc_128_x4_no_aesni.o -o obj/aes_cbc_enc_128_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbc_enc_128_x4_no_aesni.asm 00:01:51.435 mv obj/wireless_common.o.tmp obj/wireless_common.o 00:01:51.435 mv obj/crc32_refl_const.o.tmp obj/crc32_refl_const.o 00:01:51.435 nasm -MD obj/aes_cbc_enc_192_x4_no_aesni.d -MT obj/aes_cbc_enc_192_x4_no_aesni.o -o obj/aes_cbc_enc_192_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbc_enc_192_x4_no_aesni.asm 00:01:51.435 nasm -MD obj/aes_cbc_enc_256_x4_no_aesni.d -MT obj/aes_cbc_enc_256_x4_no_aesni.o -o obj/aes_cbc_enc_256_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbc_enc_256_x4_no_aesni.asm 00:01:51.435 mv obj/crc32_const.o.tmp obj/crc32_const.o 00:01:51.435 nasm -MD obj/aes128_cntr_by8_sse_no_aesni.d -MT obj/aes128_cntr_by8_sse_no_aesni.o -o obj/aes128_cntr_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cntr_by8_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes192_cntr_by8_sse_no_aesni.d -MT obj/aes192_cntr_by8_sse_no_aesni.o -o obj/aes192_cntr_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf 
-DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes192_cntr_by8_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes256_cntr_by8_sse_no_aesni.d -MT obj/aes256_cntr_by8_sse_no_aesni.o -o obj/aes256_cntr_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cntr_by8_sse_no_aesni.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/constant_lookup.o.tmp obj/constant_lookup.o 00:01:51.435 nasm -MD obj/aes_ecb_by4_sse_no_aesni.d -MT obj/aes_ecb_by4_sse_no_aesni.o -o obj/aes_ecb_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_ecb_by4_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes128_cntr_ccm_by8_sse_no_aesni.d -MT obj/aes128_cntr_ccm_by8_sse_no_aesni.o -o obj/aes128_cntr_ccm_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cntr_ccm_by8_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes256_cntr_ccm_by8_sse_no_aesni.d -MT obj/aes256_cntr_ccm_by8_sse_no_aesni.o -o obj/aes256_cntr_ccm_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cntr_ccm_by8_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/pon_sse_no_aesni.d -MT obj/pon_sse_no_aesni.o -o obj/pon_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/pon_sse_no_aesni.asm 00:01:51.435 mv obj/constant_lookup.o.tmp obj/constant_lookup.o 00:01:51.435 nasm -MD obj/zuc_sse_no_aesni.d -MT obj/zuc_sse_no_aesni.o -o obj/zuc_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/zuc_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes_cfb_sse_no_aesni.d -MT obj/aes_cfb_sse_no_aesni.o -o obj/aes_cfb_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cfb_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes128_cbc_mac_x4_no_aesni.d -MT obj/aes128_cbc_mac_x4_no_aesni.o -o obj/aes128_cbc_mac_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cbc_mac_x4_no_aesni.asm 00:01:51.435 nasm -MD obj/aes256_cbc_mac_x4_no_aesni.d -MT obj/aes256_cbc_mac_x4_no_aesni.o -o obj/aes256_cbc_mac_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cbc_mac_x4_no_aesni.asm 00:01:51.435 nasm -MD obj/aes_xcbc_mac_128_x4_no_aesni.d -MT obj/aes_xcbc_mac_128_x4_no_aesni.o -o obj/aes_xcbc_mac_128_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_xcbc_mac_128_x4_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_flush_sse_no_aesni.o -o obj/mb_mgr_aes_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes_submit_sse_no_aesni.d -MT obj/mb_mgr_aes_submit_sse_no_aesni.o -o obj/mb_mgr_aes_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_submit_sse_no_aesni.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/poly1305.o.tmp obj/poly1305.o 00:01:51.435 nasm -MD obj/mb_mgr_aes192_flush_sse_no_aesni.d -MT 
obj/mb_mgr_aes192_flush_sse_no_aesni.o -o obj/mb_mgr_aes192_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes192_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes192_submit_sse_no_aesni.d -MT obj/mb_mgr_aes192_submit_sse_no_aesni.o -o obj/mb_mgr_aes192_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes192_submit_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes256_flush_sse_no_aesni.d -MT obj/mb_mgr_aes256_flush_sse_no_aesni.o -o obj/mb_mgr_aes256_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes256_submit_sse_no_aesni.d -MT obj/mb_mgr_aes256_submit_sse_no_aesni.o -o obj/mb_mgr_aes256_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_submit_sse_no_aesni.asm 00:01:51.435 mv obj/poly1305.o.tmp obj/poly1305.o 00:01:51.435 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o -o obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_xcbc_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.d -MT obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o -o obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_xcbc_submit_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_zuc_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o -o obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_zuc_submit_flush_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/ethernet_fcs_sse_no_aesni.d -MT obj/ethernet_fcs_sse_no_aesni.o -o obj/ethernet_fcs_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ 
-DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/ethernet_fcs_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/crc16_x25_sse_no_aesni.d -MT obj/crc16_x25_sse_no_aesni.o -o obj/crc16_x25_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc16_x25_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/aes_cbcs_1_9_enc_128_x4_no_aesni.d -MT obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o -o obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbcs_1_9_enc_128_x4_no_aesni.asm 00:01:51.435 nasm -MD obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.d -MT obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o -o obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cbcs_1_9_dec_by4_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_submit_sse.d -MT obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes128_cbcs_1_9_submit_sse.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes192_submit_sse_no_aesni.o 00:01:51.435 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_flush_sse.d -MT obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes128_cbcs_1_9_flush_sse.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.d -MT obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.asm 00:01:51.435 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.d -MT obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.asm 00:01:51.435 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_flush_sse_no_aesni.o 00:01:51.435 mv obj/mb_mgr_aes192_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes192_submit_sse_no_aesni.o 00:01:51.435 ld -r -z ibt -z shstk -o obj/crc16_x25_sse_no_aesni.o.tmp obj/crc16_x25_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes192_flush_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_submit_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/ethernet_fcs_sse_no_aesni.o.tmp obj/ethernet_fcs_sse_no_aesni.o 00:01:51.436 nasm -MD obj/crc32_refl_by8_sse_no_aesni.d -MT obj/crc32_refl_by8_sse_no_aesni.o -o obj/crc32_refl_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_refl_by8_sse_no_aesni.asm 00:01:51.436 mv obj/mb_mgr_aes_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_flush_sse_no_aesni.o 00:01:51.436 mv obj/crc16_x25_sse_no_aesni.o.tmp obj/crc16_x25_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_submit_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_flush_sse_no_aesni.o 
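The repeated nasm / ld / mv sequence in this part of the log follows one pattern per object file: assemble with dependency generation (-MD/-MT) and the SAFE_DATA/SAFE_PARAM/SAFE_LOOKUP hardening defines, relink the object with -z ibt -z shstk so it carries the CET (indirect-branch tracking / shadow stack) properties, then swap the temporary file over the original. A hedged per-file sketch, with "example.asm" as a placeholder rather than a real intel-ipsec-mb source, is:

    # Illustrative per-object step following the pattern traced above;
    # file names are placeholders, flags are copied from the log.
    src=x86_64/example.asm
    obj=obj/example.o
    nasm -MD "${obj%.o}.d" -MT "$obj" -o "$obj" -Werror -felf64 -Xgnu -gdwarf \
         -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP "$src"
    # Relink in place to add the IBT/SHSTK markers, then replace the original object.
    ld -r -z ibt -z shstk -o "$obj.tmp" "$obj"
    mv "$obj.tmp" "$obj"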
00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes256_submit_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes192_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes192_flush_sse_no_aesni.o 00:01:51.436 mv obj/ethernet_fcs_sse_no_aesni.o.tmp obj/ethernet_fcs_sse_no_aesni.o 00:01:51.436 nasm -MD obj/crc32_by8_sse_no_aesni.d -MT obj/crc32_by8_sse_no_aesni.o -o obj/crc32_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_by8_sse_no_aesni.asm 00:01:51.436 mv obj/mb_mgr_aes256_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes256_submit_sse_no_aesni.o 00:01:51.436 nasm -MD obj/crc32_sctp_sse_no_aesni.d -MT obj/crc32_sctp_sse_no_aesni.o -o obj/crc32_sctp_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_sctp_sse_no_aesni.asm 00:01:51.436 mv obj/mb_mgr_aes256_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_flush_sse_no_aesni.o 00:01:51.436 nasm -MD obj/crc32_lte_sse_no_aesni.d -MT obj/crc32_lte_sse_no_aesni.o -o obj/crc32_lte_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_lte_sse_no_aesni.asm 00:01:51.436 nasm -MD obj/crc32_fp_sse_no_aesni.d -MT obj/crc32_fp_sse_no_aesni.o -o obj/crc32_fp_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_fp_sse_no_aesni.asm 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/crc32_sctp_sse_no_aesni.o.tmp obj/crc32_sctp_sse_no_aesni.o 00:01:51.436 nasm -MD obj/crc32_iuup_sse_no_aesni.d -MT obj/crc32_iuup_sse_no_aesni.o -o obj/crc32_iuup_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_iuup_sse_no_aesni.asm 00:01:51.436 mv obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o 00:01:51.436 nasm -MD obj/crc32_wimax_sse_no_aesni.d -MT obj/crc32_wimax_sse_no_aesni.o -o obj/crc32_wimax_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_wimax_sse_no_aesni.asm 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_cmac_subkey_gen.o.tmp obj/aes_cmac_subkey_gen.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o 00:01:51.436 mv obj/crc32_sctp_sse_no_aesni.o.tmp obj/crc32_sctp_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/crc32_lte_sse_no_aesni.o.tmp obj/crc32_lte_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/crc32_fp_sse_no_aesni.o.tmp obj/crc32_fp_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/crc32_iuup_sse_no_aesni.o.tmp obj/crc32_iuup_sse_no_aesni.o 00:01:51.436 nasm -MD obj/gcm128_sse_no_aesni.d -MT obj/gcm128_sse_no_aesni.o -o obj/gcm128_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ 
-DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/gcm128_sse_no_aesni.asm 00:01:51.436 mv obj/aes_cmac_subkey_gen.o.tmp obj/aes_cmac_subkey_gen.o 00:01:51.436 mv obj/crc32_lte_sse_no_aesni.o.tmp obj/crc32_lte_sse_no_aesni.o 00:01:51.436 mv obj/crc32_fp_sse_no_aesni.o.tmp obj/crc32_fp_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o 00:01:51.436 mv obj/crc32_iuup_sse_no_aesni.o.tmp obj/crc32_iuup_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_keyexp_128.o.tmp obj/aes_keyexp_128.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_keyexp_192.o.tmp obj/aes_keyexp_192.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/crc32_wimax_sse_no_aesni.o.tmp obj/crc32_wimax_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o 00:01:51.436 nasm -MD obj/gcm192_sse_no_aesni.d -MT obj/gcm192_sse_no_aesni.o -o obj/gcm192_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/gcm192_sse_no_aesni.asm 00:01:51.436 mv obj/aes_keyexp_128.o.tmp obj/aes_keyexp_128.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes128_ecbenc_x3.o.tmp obj/aes128_ecbenc_x3.o 00:01:51.436 mv obj/aes_keyexp_192.o.tmp obj/aes_keyexp_192.o 00:01:51.436 mv obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o 00:01:51.436 mv obj/crc32_wimax_sse_no_aesni.o.tmp obj/crc32_wimax_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o 00:01:51.436 nasm -MD obj/gcm256_sse_no_aesni.d -MT obj/gcm256_sse_no_aesni.o -o obj/gcm256_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/gcm256_sse_no_aesni.asm 00:01:51.436 mv obj/aes128_ecbenc_x3.o.tmp obj/aes128_ecbenc_x3.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o 00:01:51.436 mv obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o 00:01:51.436 nasm -MD obj/aes128_cbc_dec_by4_sse.d -MT obj/aes128_cbc_dec_by4_sse.o -o obj/aes128_cbc_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_dec_by4_sse.asm 00:01:51.436 mv obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o 00:01:51.436 nasm -MD obj/aes128_cbc_dec_by8_sse.d -MT obj/aes128_cbc_dec_by8_sse.o -o obj/aes128_cbc_dec_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_dec_by8_sse.asm 00:01:51.436 nasm -MD obj/aes192_cbc_dec_by4_sse.d -MT obj/aes192_cbc_dec_by4_sse.o -o obj/aes192_cbc_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes192_cbc_dec_by4_sse.asm 00:01:51.436 nasm -MD obj/aes192_cbc_dec_by8_sse.d -MT obj/aes192_cbc_dec_by8_sse.o -o obj/aes192_cbc_dec_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes192_cbc_dec_by8_sse.asm 
00:01:51.436 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by4_sse.o.tmp obj/aes128_cbc_dec_by4_sse.o 00:01:51.436 nasm -MD obj/aes256_cbc_dec_by4_sse.d -MT obj/aes256_cbc_dec_by4_sse.o -o obj/aes256_cbc_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_dec_by4_sse.asm 00:01:51.436 nasm -MD obj/aes256_cbc_dec_by8_sse.d -MT obj/aes256_cbc_dec_by8_sse.o -o obj/aes256_cbc_dec_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_dec_by8_sse.asm 00:01:51.436 mv obj/aes128_cbc_dec_by4_sse.o.tmp obj/aes128_cbc_dec_by4_sse.o 00:01:51.436 nasm -MD obj/aes_cbc_enc_128_x4.d -MT obj/aes_cbc_enc_128_x4.o -o obj/aes_cbc_enc_128_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_128_x4.asm 00:01:51.436 nasm -MD obj/aes_cbc_enc_192_x4.d -MT obj/aes_cbc_enc_192_x4.o -o obj/aes_cbc_enc_192_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_192_x4.asm 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_keyexp_256.o.tmp obj/aes_keyexp_256.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by8_sse.o.tmp obj/aes128_cbc_dec_by8_sse.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by4_sse.o.tmp obj/aes192_cbc_dec_by4_sse.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by8_sse.o.tmp obj/aes192_cbc_dec_by8_sse.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by4_sse.o.tmp obj/aes256_cbc_dec_by4_sse.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by8_sse.o.tmp obj/aes256_cbc_dec_by8_sse.o 00:01:51.436 mv obj/aes_keyexp_256.o.tmp obj/aes_keyexp_256.o 00:01:51.436 nasm -MD obj/aes_cbc_enc_256_x4.d -MT obj/aes_cbc_enc_256_x4.o -o obj/aes_cbc_enc_256_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_256_x4.asm 00:01:51.436 mv obj/aes192_cbc_dec_by4_sse.o.tmp obj/aes192_cbc_dec_by4_sse.o 00:01:51.436 mv obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o 00:01:51.436 mv obj/aes128_cbc_dec_by8_sse.o.tmp obj/aes128_cbc_dec_by8_sse.o 00:01:51.436 mv obj/aes192_cbc_dec_by8_sse.o.tmp obj/aes192_cbc_dec_by8_sse.o 00:01:51.436 mv obj/aes256_cbc_dec_by4_sse.o.tmp obj/aes256_cbc_dec_by4_sse.o 00:01:51.436 mv obj/aes256_cbc_dec_by8_sse.o.tmp obj/aes256_cbc_dec_by8_sse.o 00:01:51.436 nasm -MD obj/aes_cbc_enc_128_x8_sse.d -MT obj/aes_cbc_enc_128_x8_sse.o -o obj/aes_cbc_enc_128_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_128_x8_sse.asm 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x4.o.tmp obj/aes_cbc_enc_192_x4.o 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x4.o.tmp obj/aes_cbc_enc_128_x4.o 00:01:51.436 nasm -MD obj/aes_cbc_enc_192_x8_sse.d -MT obj/aes_cbc_enc_192_x8_sse.o -o obj/aes_cbc_enc_192_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_192_x8_sse.asm 00:01:51.436 nasm -MD obj/aes_cbc_enc_256_x8_sse.d -MT obj/aes_cbc_enc_256_x8_sse.o -o obj/aes_cbc_enc_256_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_256_x8_sse.asm 00:01:51.436 nasm -MD obj/pon_sse.d -MT 
obj/pon_sse.o -o obj/pon_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/pon_sse.asm 00:01:51.436 nasm -MD obj/aes128_cntr_by8_sse.d -MT obj/aes128_cntr_by8_sse.o -o obj/aes128_cntr_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cntr_by8_sse.asm 00:01:51.436 nasm -MD obj/aes192_cntr_by8_sse.d -MT obj/aes192_cntr_by8_sse.o -o obj/aes192_cntr_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes192_cntr_by8_sse.asm 00:01:51.436 mv obj/aes_cbc_enc_192_x4.o.tmp obj/aes_cbc_enc_192_x4.o 00:01:51.436 mv obj/aes_cbc_enc_128_x4.o.tmp obj/aes_cbc_enc_128_x4.o 00:01:51.436 nasm -MD obj/aes256_cntr_by8_sse.d -MT obj/aes256_cntr_by8_sse.o -o obj/aes256_cntr_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cntr_by8_sse.asm 00:01:51.436 ld -r -z ibt -z shstk -o obj/aes_cfb_sse_no_aesni.o.tmp obj/aes_cfb_sse_no_aesni.o 00:01:51.436 nasm -MD obj/aes_ecb_by4_sse.d -MT obj/aes_ecb_by4_sse.o -o obj/aes_ecb_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_ecb_by4_sse.asm 00:01:51.436 nasm -MD obj/aes128_cntr_ccm_by8_sse.d -MT obj/aes128_cntr_ccm_by8_sse.o -o obj/aes128_cntr_ccm_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cntr_ccm_by8_sse.asm 00:01:51.437 nasm -MD obj/aes256_cntr_ccm_by8_sse.d -MT obj/aes256_cntr_ccm_by8_sse.o -o obj/aes256_cntr_ccm_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cntr_ccm_by8_sse.asm 00:01:51.437 nasm -MD obj/aes_cfb_sse.d -MT obj/aes_cfb_sse.o -o obj/aes_cfb_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cfb_sse.asm 00:01:51.437 mv obj/aes_cfb_sse_no_aesni.o.tmp obj/aes_cfb_sse_no_aesni.o 00:01:51.437 nasm -MD obj/aes128_cbc_mac_x4.d -MT obj/aes128_cbc_mac_x4.o -o obj/aes128_cbc_mac_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_mac_x4.asm 00:01:51.697 nasm -MD obj/aes256_cbc_mac_x4.d -MT obj/aes256_cbc_mac_x4.o -o obj/aes256_cbc_mac_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_mac_x4.asm 00:01:51.697 nasm -MD obj/aes128_cbc_mac_x8_sse.d -MT obj/aes128_cbc_mac_x8_sse.o -o obj/aes128_cbc_mac_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_mac_x8_sse.asm 00:01:51.697 nasm -MD obj/aes256_cbc_mac_x8_sse.d -MT obj/aes256_cbc_mac_x8_sse.o -o obj/aes256_cbc_mac_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_mac_x8_sse.asm 00:01:51.697 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x4.o.tmp obj/aes_cbc_enc_256_x4.o 00:01:51.697 nasm -MD obj/aes_xcbc_mac_128_x4.d -MT obj/aes_xcbc_mac_128_x4.o -o obj/aes_xcbc_mac_128_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_xcbc_mac_128_x4.asm 00:01:51.697 nasm -MD obj/md5_x4x2_sse.d -MT obj/md5_x4x2_sse.o -o obj/md5_x4x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/md5_x4x2_sse.asm 00:01:51.697 nasm -MD obj/sha1_mult_sse.d -MT obj/sha1_mult_sse.o -o 
obj/sha1_mult_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha1_mult_sse.asm 00:01:51.697 nasm -MD obj/sha1_one_block_sse.d -MT obj/sha1_one_block_sse.o -o obj/sha1_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha1_one_block_sse.asm 00:01:51.697 mv obj/aes_cbc_enc_256_x4.o.tmp obj/aes_cbc_enc_256_x4.o 00:01:51.697 nasm -MD obj/sha224_one_block_sse.d -MT obj/sha224_one_block_sse.o -o obj/sha224_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha224_one_block_sse.asm 00:01:51.697 nasm -MD obj/sha256_one_block_sse.d -MT obj/sha256_one_block_sse.o -o obj/sha256_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha256_one_block_sse.asm 00:01:51.697 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x8_sse.o.tmp obj/aes_cbc_enc_128_x8_sse.o 00:01:51.697 nasm -MD obj/sha384_one_block_sse.d -MT obj/sha384_one_block_sse.o -o obj/sha384_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha384_one_block_sse.asm 00:01:51.697 nasm -MD obj/sha512_one_block_sse.d -MT obj/sha512_one_block_sse.o -o obj/sha512_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha512_one_block_sse.asm 00:01:51.697 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x8_sse.o.tmp obj/aes_cbc_enc_192_x8_sse.o 00:01:51.697 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x8_sse.o.tmp obj/aes_cbc_enc_256_x8_sse.o 00:01:51.697 mv obj/aes_cbc_enc_128_x8_sse.o.tmp obj/aes_cbc_enc_128_x8_sse.o 00:01:51.697 nasm -MD obj/sha512_x2_sse.d -MT obj/sha512_x2_sse.o -o obj/sha512_x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha512_x2_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes_cfb_sse.o.tmp obj/aes_cfb_sse.o 00:01:51.698 mv obj/aes_cbc_enc_256_x8_sse.o.tmp obj/aes_cbc_enc_256_x8_sse.o 00:01:51.698 nasm -MD obj/sha_256_mult_sse.d -MT obj/sha_256_mult_sse.o -o obj/sha_256_mult_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha_256_mult_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x4.o.tmp obj/aes256_cbc_mac_x4.o 00:01:51.698 mv obj/aes_cbc_enc_192_x8_sse.o.tmp obj/aes_cbc_enc_192_x8_sse.o 00:01:51.698 mv obj/aes_cfb_sse.o.tmp obj/aes_cfb_sse.o 00:01:51.698 nasm -MD obj/sha1_ni_x2_sse.d -MT obj/sha1_ni_x2_sse.o -o obj/sha1_ni_x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha1_ni_x2_sse.asm 00:01:51.698 nasm -MD obj/sha256_ni_x2_sse.d -MT obj/sha256_ni_x2_sse.o -o obj/sha256_ni_x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha256_ni_x2_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha512_one_block_sse.o.tmp obj/sha512_one_block_sse.o 00:01:51.698 nasm -MD obj/zuc_sse.d -MT obj/zuc_sse.o -o obj/zuc_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/zuc_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha1_one_block_sse.o.tmp obj/sha1_one_block_sse.o 00:01:51.698 nasm -MD obj/zuc_sse_gfni.d -MT obj/zuc_sse_gfni.o -o obj/zuc_sse_gfni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/zuc_sse_gfni.asm 00:01:51.698 mv 
obj/aes256_cbc_mac_x4.o.tmp obj/aes256_cbc_mac_x4.o 00:01:51.698 mv obj/sha512_one_block_sse.o.tmp obj/sha512_one_block_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_flush_sse.d -MT obj/mb_mgr_aes_flush_sse.o -o obj/mb_mgr_aes_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_flush_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes_ecb_by4_sse.o.tmp obj/aes_ecb_by4_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x8_sse.o.tmp obj/aes128_cbc_mac_x8_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha256_one_block_sse.o.tmp obj/sha256_one_block_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha224_one_block_sse.o.tmp obj/sha224_one_block_sse.o 00:01:51.698 mv obj/sha1_one_block_sse.o.tmp obj/sha1_one_block_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x8_sse.o.tmp obj/aes256_cbc_mac_x8_sse.o 00:01:51.698 mv obj/aes128_cbc_mac_x8_sse.o.tmp obj/aes128_cbc_mac_x8_sse.o 00:01:51.698 mv obj/aes_ecb_by4_sse.o.tmp obj/aes_ecb_by4_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_submit_sse.d -MT obj/mb_mgr_aes_submit_sse.o -o obj/mb_mgr_aes_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_submit_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha384_one_block_sse.o.tmp obj/sha384_one_block_sse.o 00:01:51.698 mv obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes_xcbc_mac_128_x4.o.tmp obj/aes_xcbc_mac_128_x4.o 00:01:51.698 mv obj/sha256_one_block_sse.o.tmp obj/sha256_one_block_sse.o 00:01:51.698 mv obj/aes256_cbc_mac_x8_sse.o.tmp obj/aes256_cbc_mac_x8_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes192_flush_sse.d -MT obj/mb_mgr_aes192_flush_sse.o -o obj/mb_mgr_aes192_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_flush_sse.asm 00:01:51.698 mv obj/sha224_one_block_sse.o.tmp obj/sha224_one_block_sse.o 00:01:51.698 mv obj/sha384_one_block_sse.o.tmp obj/sha384_one_block_sse.o 00:01:51.698 mv obj/aes_xcbc_mac_128_x4.o.tmp obj/aes_xcbc_mac_128_x4.o 00:01:51.698 nasm -MD obj/mb_mgr_aes192_submit_sse.d -MT obj/mb_mgr_aes192_submit_sse.o -o obj/mb_mgr_aes192_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_submit_sse.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes256_flush_sse.d -MT obj/mb_mgr_aes256_flush_sse.o -o obj/mb_mgr_aes256_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_flush_sse.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes256_submit_sse.d -MT obj/mb_mgr_aes256_submit_sse.o -o obj/mb_mgr_aes256_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_submit_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha256_ni_x2_sse.o.tmp obj/sha256_ni_x2_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_flush_sse_x8.d -MT obj/mb_mgr_aes_flush_sse_x8.o -o obj/mb_mgr_aes_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_flush_sse_x8.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes_submit_sse_x8.d -MT obj/mb_mgr_aes_submit_sse_x8.o -o obj/mb_mgr_aes_submit_sse_x8.o 
-Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_submit_sse_x8.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes192_flush_sse_x8.d -MT obj/mb_mgr_aes192_flush_sse_x8.o -o obj/mb_mgr_aes192_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_flush_sse_x8.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha1_ni_x2_sse.o.tmp obj/sha1_ni_x2_sse.o 00:01:51.698 mv obj/sha256_ni_x2_sse.o.tmp obj/sha256_ni_x2_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes192_submit_sse_x8.d -MT obj/mb_mgr_aes192_submit_sse_x8.o -o obj/mb_mgr_aes192_submit_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_submit_sse_x8.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x4.o.tmp obj/aes128_cbc_mac_x4.o 00:01:51.698 nasm -MD obj/mb_mgr_aes256_flush_sse_x8.d -MT obj/mb_mgr_aes256_flush_sse_x8.o -o obj/mb_mgr_aes256_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_flush_sse_x8.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes128_cntr_ccm_by8_sse.o.tmp obj/aes128_cntr_ccm_by8_sse.o 00:01:51.698 mv obj/sha1_ni_x2_sse.o.tmp obj/sha1_ni_x2_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes256_submit_sse_x8.d -MT obj/mb_mgr_aes256_submit_sse_x8.o -o obj/mb_mgr_aes256_submit_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_submit_sse_x8.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_sse.d -MT obj/mb_mgr_aes_cmac_submit_flush_sse.o -o obj/mb_mgr_aes_cmac_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_cmac_submit_flush_sse.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_sse.d -MT obj/mb_mgr_aes256_cmac_submit_flush_sse.o -o obj/mb_mgr_aes256_cmac_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_cmac_submit_flush_sse.asm 00:01:51.698 mv obj/aes128_cbc_mac_x4.o.tmp obj/aes128_cbc_mac_x4.o 00:01:51.698 mv obj/aes128_cntr_ccm_by8_sse.o.tmp obj/aes128_cntr_ccm_by8_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_sse_x8.d -MT obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o -o obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_cmac_submit_flush_sse_x8.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.d -MT obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o -o obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_cmac_submit_flush_sse_x8.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes256_cntr_ccm_by8_sse.o.tmp obj/aes256_cntr_ccm_by8_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_sse.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_ccm_auth_submit_flush_sse.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP 
sse/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.asm 00:01:51.698 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_ccm_auth_submit_flush_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_sse.o.tmp obj/mb_mgr_aes_flush_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_sse.o.tmp obj/mb_mgr_aes_submit_sse.o 00:01:51.698 mv obj/aes256_cntr_ccm_by8_sse.o.tmp obj/aes256_cntr_ccm_by8_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_xcbc_flush_sse.d -MT obj/mb_mgr_aes_xcbc_flush_sse.o -o obj/mb_mgr_aes_xcbc_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_xcbc_flush_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_sse_x8.o.tmp obj/mb_mgr_aes_submit_sse_x8.o 00:01:51.698 nasm -MD obj/mb_mgr_aes_xcbc_submit_sse.d -MT obj/mb_mgr_aes_xcbc_submit_sse.o -o obj/mb_mgr_aes_xcbc_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_xcbc_submit_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_sse.o.tmp obj/mb_mgr_aes192_flush_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes192_cntr_by8_sse.o.tmp obj/aes192_cntr_by8_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_sse.o.tmp obj/mb_mgr_aes256_flush_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x4_no_aesni.o.tmp obj/aes_cbc_enc_128_x4_no_aesni.o 00:01:51.698 mv obj/mb_mgr_aes_flush_sse.o.tmp obj/mb_mgr_aes_flush_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_sse.o.tmp obj/mb_mgr_aes192_submit_sse.o 00:01:51.698 mv obj/mb_mgr_aes_submit_sse.o.tmp obj/mb_mgr_aes_submit_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_sse.o.tmp obj/mb_mgr_aes256_submit_sse.o 00:01:51.698 mv obj/mb_mgr_aes_submit_sse_x8.o.tmp obj/mb_mgr_aes_submit_sse_x8.o 00:01:51.698 mv obj/mb_mgr_aes192_flush_sse.o.tmp obj/mb_mgr_aes192_flush_sse.o 00:01:51.698 mv obj/aes192_cntr_by8_sse.o.tmp obj/aes192_cntr_by8_sse.o 00:01:51.698 mv obj/mb_mgr_aes256_flush_sse.o.tmp obj/mb_mgr_aes256_flush_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_sse_x8.o.tmp obj/mb_mgr_aes256_flush_sse_x8.o 00:01:51.698 mv obj/aes_cbc_enc_128_x4_no_aesni.o.tmp obj/aes_cbc_enc_128_x4_no_aesni.o 00:01:51.698 mv obj/mb_mgr_aes256_submit_sse.o.tmp obj/mb_mgr_aes256_submit_sse.o 00:01:51.698 nasm -MD obj/mb_mgr_hmac_md5_flush_sse.d -MT obj/mb_mgr_hmac_md5_flush_sse.o -o obj/mb_mgr_hmac_md5_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_md5_flush_sse.asm 00:01:51.698 ld -r -z ibt -z shstk -o obj/crc32_by8_sse_no_aesni.o.tmp obj/crc32_by8_sse_no_aesni.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha1_mult_sse.o.tmp obj/sha1_mult_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o 
obj/sha512_x2_sse.o.tmp obj/sha512_x2_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_sse_x8.o.tmp obj/mb_mgr_aes192_flush_sse_x8.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_sse_x8.o.tmp obj/mb_mgr_aes256_submit_sse_x8.o 00:01:51.698 mv obj/mb_mgr_aes192_submit_sse.o.tmp obj/mb_mgr_aes192_submit_sse.o 00:01:51.698 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o 00:01:51.698 mv obj/mb_mgr_aes256_flush_sse_x8.o.tmp obj/mb_mgr_aes256_flush_sse_x8.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/sha_256_mult_sse.o.tmp obj/sha_256_mult_sse.o 00:01:51.698 mv obj/crc32_by8_sse_no_aesni.o.tmp obj/crc32_by8_sse_no_aesni.o 00:01:51.698 mv obj/sha1_mult_sse.o.tmp obj/sha1_mult_sse.o 00:01:51.698 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_sse_x8.o.tmp obj/mb_mgr_aes_flush_sse_x8.o 00:01:51.699 mv obj/sha512_x2_sse.o.tmp obj/sha512_x2_sse.o 00:01:51.699 mv obj/mb_mgr_aes192_flush_sse_x8.o.tmp obj/mb_mgr_aes192_flush_sse_x8.o 00:01:51.699 mv obj/mb_mgr_aes256_submit_sse_x8.o.tmp obj/mb_mgr_aes256_submit_sse_x8.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/crc32_refl_by8_sse_no_aesni.o.tmp obj/crc32_refl_by8_sse_no_aesni.o 00:01:51.699 mv obj/sha_256_mult_sse.o.tmp obj/sha_256_mult_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_md5_submit_sse.d -MT obj/mb_mgr_hmac_md5_submit_sse.o -o obj/mb_mgr_hmac_md5_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_md5_submit_sse.asm 00:01:51.699 mv obj/mb_mgr_aes_flush_sse_x8.o.tmp obj/mb_mgr_aes_flush_sse_x8.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_flush_sse.d -MT obj/mb_mgr_hmac_flush_sse.o -o obj/mb_mgr_hmac_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_flush_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/aes128_cntr_by8_sse.o.tmp obj/aes128_cntr_by8_sse.o 00:01:51.699 mv obj/crc32_refl_by8_sse_no_aesni.o.tmp obj/crc32_refl_by8_sse_no_aesni.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_submit_sse.d -MT obj/mb_mgr_hmac_submit_sse.o -o obj/mb_mgr_hmac_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_submit_sse.asm 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_224_flush_sse.d -MT obj/mb_mgr_hmac_sha_224_flush_sse.o -o obj/mb_mgr_hmac_sha_224_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_flush_sse.asm 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_224_submit_sse.d -MT obj/mb_mgr_hmac_sha_224_submit_sse.o -o obj/mb_mgr_hmac_sha_224_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_submit_sse.asm 00:01:51.699 mv obj/aes128_cntr_by8_sse.o.tmp obj/aes128_cntr_by8_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_256_flush_sse.d -MT obj/mb_mgr_hmac_sha_256_flush_sse.o -o obj/mb_mgr_hmac_sha_256_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_flush_sse.asm 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_256_submit_sse.d -MT obj/mb_mgr_hmac_sha_256_submit_sse.o -o obj/mb_mgr_hmac_sha_256_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_submit_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_sse_x8.o.tmp 
obj/mb_mgr_aes192_submit_sse_x8.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_384_flush_sse.d -MT obj/mb_mgr_hmac_sha_384_flush_sse.o -o obj/mb_mgr_hmac_sha_384_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_384_flush_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/aes256_cntr_by8_sse.o.tmp obj/aes256_cntr_by8_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_384_submit_sse.d -MT obj/mb_mgr_hmac_sha_384_submit_sse.o -o obj/mb_mgr_hmac_sha_384_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_384_submit_sse.asm 00:01:51.699 mv obj/mb_mgr_aes192_submit_sse_x8.o.tmp obj/mb_mgr_aes192_submit_sse_x8.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_512_flush_sse.d -MT obj/mb_mgr_hmac_sha_512_flush_sse.o -o obj/mb_mgr_hmac_sha_512_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_512_flush_sse.asm 00:01:51.699 mv obj/aes256_cntr_by8_sse.o.tmp obj/aes256_cntr_by8_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_512_submit_sse.d -MT obj/mb_mgr_hmac_sha_512_submit_sse.o -o obj/mb_mgr_hmac_sha_512_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_512_submit_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/aes_xcbc_mac_128_x4_no_aesni.o.tmp obj/aes_xcbc_mac_128_x4_no_aesni.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_flush_ni_sse.d -MT obj/mb_mgr_hmac_flush_ni_sse.o -o obj/mb_mgr_hmac_flush_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_flush_ni_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_flush_sse.o.tmp obj/mb_mgr_hmac_md5_flush_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_submit_ni_sse.d -MT obj/mb_mgr_hmac_submit_ni_sse.o -o obj/mb_mgr_hmac_submit_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_submit_ni_sse.asm 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_224_flush_ni_sse.d -MT obj/mb_mgr_hmac_sha_224_flush_ni_sse.o -o obj/mb_mgr_hmac_sha_224_flush_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_flush_ni_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_sse.o.tmp obj/mb_mgr_aes_xcbc_submit_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse.o 00:01:51.699 mv obj/aes_xcbc_mac_128_x4_no_aesni.o.tmp obj/aes_xcbc_mac_128_x4_no_aesni.o 00:01:51.699 mv obj/mb_mgr_hmac_md5_flush_sse.o.tmp obj/mb_mgr_hmac_md5_flush_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_224_submit_ni_sse.d -MT obj/mb_mgr_hmac_sha_224_submit_ni_sse.o -o obj/mb_mgr_hmac_sha_224_submit_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_submit_ni_sse.asm 00:01:51.699 mv obj/mb_mgr_aes_xcbc_submit_sse.o.tmp obj/mb_mgr_aes_xcbc_submit_sse.o 00:01:51.699 mv obj/mb_mgr_aes256_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse.o 00:01:51.699 nasm -MD obj/mb_mgr_hmac_sha_256_flush_ni_sse.d -MT obj/mb_mgr_hmac_sha_256_flush_ni_sse.o -o obj/mb_mgr_hmac_sha_256_flush_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_flush_ni_sse.asm 00:01:51.699 nasm 
-MD obj/mb_mgr_hmac_sha_256_submit_ni_sse.d -MT obj/mb_mgr_hmac_sha_256_submit_ni_sse.o -o obj/mb_mgr_hmac_sha_256_submit_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_submit_ni_sse.asm 00:01:51.699 nasm -MD obj/mb_mgr_zuc_submit_flush_sse.d -MT obj/mb_mgr_zuc_submit_flush_sse.o -o obj/mb_mgr_zuc_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_zuc_submit_flush_sse.asm 00:01:51.699 nasm -MD obj/mb_mgr_zuc_submit_flush_gfni_sse.d -MT obj/mb_mgr_zuc_submit_flush_gfni_sse.o -o obj/mb_mgr_zuc_submit_flush_gfni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_zuc_submit_flush_gfni_sse.asm 00:01:51.699 nasm -MD obj/ethernet_fcs_sse.d -MT obj/ethernet_fcs_sse.o -o obj/ethernet_fcs_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/ethernet_fcs_sse.asm 00:01:51.699 nasm -MD obj/crc16_x25_sse.d -MT obj/crc16_x25_sse.o -o obj/crc16_x25_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc16_x25_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_flush_sse.o.tmp obj/mb_mgr_aes_xcbc_flush_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/pon_sse.o.tmp obj/pon_sse.o 00:01:51.699 nasm -MD obj/crc32_sctp_sse.d -MT obj/crc32_sctp_sse.o -o obj/crc32_sctp_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_sctp_sse.asm 00:01:51.699 nasm -MD obj/aes_cbcs_1_9_enc_128_x4.d -MT obj/aes_cbcs_1_9_enc_128_x4.o -o obj/aes_cbcs_1_9_enc_128_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbcs_1_9_enc_128_x4.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_sse.o.tmp obj/mb_mgr_hmac_flush_sse.o 00:01:51.699 mv obj/pon_sse.o.tmp obj/pon_sse.o 00:01:51.699 nasm -MD obj/aes128_cbcs_1_9_dec_by4_sse.d -MT obj/aes128_cbcs_1_9_dec_by4_sse.o -o obj/aes128_cbcs_1_9_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbcs_1_9_dec_by4_sse.asm 00:01:51.699 nasm -MD obj/crc32_refl_by8_sse.d -MT obj/crc32_refl_by8_sse.o -o obj/crc32_refl_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_refl_by8_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_submit_sse.o.tmp obj/mb_mgr_hmac_md5_submit_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/ethernet_fcs_sse.o.tmp obj/ethernet_fcs_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/crc16_x25_sse.o.tmp obj/crc16_x25_sse.o 00:01:51.699 mv obj/mb_mgr_aes_xcbc_flush_sse.o.tmp obj/mb_mgr_aes_xcbc_flush_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/crc32_sctp_sse.o.tmp obj/crc32_sctp_sse.o 00:01:51.699 mv obj/mb_mgr_hmac_flush_sse.o.tmp obj/mb_mgr_hmac_flush_sse.o 00:01:51.699 mv 
obj/mb_mgr_hmac_md5_submit_sse.o.tmp obj/mb_mgr_hmac_md5_submit_sse.o 00:01:51.699 mv obj/ethernet_fcs_sse.o.tmp obj/ethernet_fcs_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o 00:01:51.699 mv obj/crc16_x25_sse.o.tmp obj/crc16_x25_sse.o 00:01:51.699 mv obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o 00:01:51.699 mv obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o 00:01:51.699 mv obj/mb_mgr_hmac_sha_256_flush_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_sse.o 00:01:51.699 mv obj/crc32_sctp_sse.o.tmp obj/crc32_sctp_sse.o 00:01:51.699 nasm -MD obj/crc32_by8_sse.d -MT obj/crc32_by8_sse.o -o obj/crc32_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_by8_sse.asm 00:01:51.699 mv obj/mb_mgr_hmac_sha_224_flush_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_sse.o 00:01:51.699 mv obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/aes128_cbcs_1_9_dec_by4_sse.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse.o 00:01:51.699 nasm -MD obj/crc32_lte_sse.d -MT obj/crc32_lte_sse.o -o obj/crc32_lte_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_lte_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/crc32_refl_by8_sse.o.tmp obj/crc32_refl_by8_sse.o 00:01:51.699 nasm -MD obj/crc32_fp_sse.d -MT obj/crc32_fp_sse.o -o obj/crc32_fp_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_fp_sse.asm 00:01:51.699 ld -r -z ibt -z shstk -o obj/md5_x4x2_sse.o.tmp obj/md5_x4x2_sse.o 00:01:51.699 mv obj/aes128_cbcs_1_9_dec_by4_sse.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_ni_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_sse.o.tmp obj/mb_mgr_hmac_submit_sse.o 00:01:51.699 nasm -MD obj/crc32_iuup_sse.d -MT obj/crc32_iuup_sse.o -o obj/crc32_iuup_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_iuup_sse.asm 00:01:51.699 mv obj/crc32_refl_by8_sse.o.tmp obj/crc32_refl_by8_sse.o 00:01:51.699 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_sse.o 00:01:51.700 nasm -MD obj/crc32_wimax_sse.d -MT obj/crc32_wimax_sse.o -o obj/crc32_wimax_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_wimax_sse.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/crc32_lte_sse.o.tmp obj/crc32_lte_sse.o 00:01:51.700 mv obj/md5_x4x2_sse.o.tmp obj/md5_x4x2_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_256_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_ni_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_submit_sse.o.tmp obj/mb_mgr_hmac_submit_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_sse.o.tmp obj/mb_mgr_hmac_sha_384_flush_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_ni_sse.o.tmp obj/mb_mgr_hmac_flush_ni_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/crc32_fp_sse.o.tmp 
obj/crc32_fp_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_sse.o.tmp obj/mb_mgr_hmac_sha_512_flush_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_sse.o.tmp obj/mb_mgr_hmac_sha_384_submit_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_224_submit_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_sse.o 00:01:51.700 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o 00:01:51.700 mv obj/crc32_lte_sse.o.tmp obj/crc32_lte_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/crc32_by8_sse.o.tmp obj/crc32_by8_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/crc32_iuup_sse.o.tmp obj/crc32_iuup_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/crc32_wimax_sse.o.tmp obj/crc32_wimax_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_256_submit_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_384_flush_sse.o.tmp obj/mb_mgr_hmac_sha_384_flush_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_flush_ni_sse.o.tmp obj/mb_mgr_hmac_flush_ni_sse.o 00:01:51.700 mv obj/crc32_fp_sse.o.tmp obj/crc32_fp_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_512_flush_sse.o.tmp obj/mb_mgr_hmac_sha_512_flush_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_384_submit_sse.o.tmp obj/mb_mgr_hmac_sha_384_submit_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_ni_sse.o.tmp obj/mb_mgr_hmac_submit_ni_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_ni_sse.o 00:01:51.700 mv obj/crc32_by8_sse.o.tmp obj/crc32_by8_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_ni_sse.o 00:01:51.700 mv obj/crc32_iuup_sse.o.tmp obj/crc32_iuup_sse.o 00:01:51.700 mv obj/crc32_wimax_sse.o.tmp obj/crc32_wimax_sse.o 00:01:51.700 nasm -MD obj/chacha20_sse.d -MT obj/chacha20_sse.o -o obj/chacha20_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/chacha20_sse.asm 00:01:51.700 mv obj/mb_mgr_hmac_submit_ni_sse.o.tmp obj/mb_mgr_hmac_submit_ni_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_sse.o.tmp obj/mb_mgr_hmac_sha_512_submit_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_256_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_ni_sse.o 00:01:51.700 mv obj/mb_mgr_hmac_sha_224_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_ni_sse.o 00:01:51.700 nasm -MD obj/memcpy_sse.d -MT obj/memcpy_sse.o -o obj/memcpy_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/memcpy_sse.asm 00:01:51.700 nasm -MD obj/gcm128_sse.d -MT obj/gcm128_sse.o -o obj/gcm128_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/gcm128_sse.asm 00:01:51.700 mv obj/mb_mgr_hmac_sha_512_submit_sse.o.tmp obj/mb_mgr_hmac_sha_512_submit_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes_cbcs_1_9_enc_128_x4.o.tmp obj/aes_cbcs_1_9_enc_128_x4.o 00:01:51.700 nasm -MD obj/gcm192_sse.d -MT obj/gcm192_sse.o -o obj/gcm192_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/gcm192_sse.asm 00:01:51.700 nasm -MD obj/gcm256_sse.d -MT obj/gcm256_sse.o -o obj/gcm256_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/gcm256_sse.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x4_no_aesni.o.tmp obj/aes128_cbc_mac_x4_no_aesni.o 00:01:51.700 mv obj/aes_cbcs_1_9_enc_128_x4.o.tmp 
obj/aes_cbcs_1_9_enc_128_x4.o 00:01:51.700 nasm -MD obj/aes_cbc_enc_128_x8.d -MT obj/aes_cbc_enc_128_x8.o -o obj/aes_cbc_enc_128_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbc_enc_128_x8.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/memcpy_sse.o.tmp obj/memcpy_sse.o 00:01:51.700 nasm -MD obj/aes_cbc_enc_192_x8.d -MT obj/aes_cbc_enc_192_x8.o -o obj/aes_cbc_enc_192_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbc_enc_192_x8.asm 00:01:51.700 nasm -MD obj/aes_cbc_enc_256_x8.d -MT obj/aes_cbc_enc_256_x8.o -o obj/aes_cbc_enc_256_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbc_enc_256_x8.asm 00:01:51.700 nasm -MD obj/aes128_cbc_dec_by8_avx.d -MT obj/aes128_cbc_dec_by8_avx.o -o obj/aes128_cbc_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cbc_dec_by8_avx.asm 00:01:51.700 mv obj/aes128_cbc_mac_x4_no_aesni.o.tmp obj/aes128_cbc_mac_x4_no_aesni.o 00:01:51.700 mv obj/memcpy_sse.o.tmp obj/memcpy_sse.o 00:01:51.700 nasm -MD obj/aes192_cbc_dec_by8_avx.d -MT obj/aes192_cbc_dec_by8_avx.o -o obj/aes192_cbc_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes192_cbc_dec_by8_avx.asm 00:01:51.700 nasm -MD obj/aes256_cbc_dec_by8_avx.d -MT obj/aes256_cbc_dec_by8_avx.o -o obj/aes256_cbc_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cbc_dec_by8_avx.asm 00:01:51.700 nasm -MD obj/pon_avx.d -MT obj/pon_avx.o -o obj/pon_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/pon_avx.asm 00:01:51.700 nasm -MD obj/aes128_cntr_by8_avx.d -MT obj/aes128_cntr_by8_avx.o -o obj/aes128_cntr_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cntr_by8_avx.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse.o 00:01:51.700 nasm -MD obj/aes192_cntr_by8_avx.d -MT obj/aes192_cntr_by8_avx.o -o obj/aes192_cntr_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes192_cntr_by8_avx.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o 00:01:51.700 nasm -MD obj/aes256_cntr_by8_avx.d -MT obj/aes256_cntr_by8_avx.o -o obj/aes256_cntr_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cntr_by8_avx.asm 00:01:51.700 nasm -MD obj/aes128_cntr_ccm_by8_avx.d -MT obj/aes128_cntr_ccm_by8_avx.o -o obj/aes128_cntr_ccm_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cntr_ccm_by8_avx.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_sse.o.tmp obj/mb_mgr_zuc_submit_flush_sse.o 00:01:51.700 nasm -MD obj/aes256_cntr_ccm_by8_avx.d -MT obj/aes256_cntr_ccm_by8_avx.o -o obj/aes256_cntr_ccm_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cntr_ccm_by8_avx.asm 00:01:51.700 nasm -MD obj/aes_ecb_by4_avx.d -MT 
obj/aes_ecb_by4_avx.o -o obj/aes_ecb_by4_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_ecb_by4_avx.asm 00:01:51.700 nasm -MD obj/aes_cfb_avx.d -MT obj/aes_cfb_avx.o -o obj/aes_cfb_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cfb_avx.asm 00:01:51.700 mv obj/mb_mgr_aes_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_ni_sse.o 00:01:51.700 mv obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o 00:01:51.700 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o 00:01:51.700 mv obj/mb_mgr_zuc_submit_flush_sse.o.tmp obj/mb_mgr_zuc_submit_flush_sse.o 00:01:51.700 nasm -MD obj/aes128_cbc_mac_x8.d -MT obj/aes128_cbc_mac_x8.o -o obj/aes128_cbc_mac_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cbc_mac_x8.asm 00:01:51.700 nasm -MD obj/aes256_cbc_mac_x8.d -MT obj/aes256_cbc_mac_x8.o -o obj/aes256_cbc_mac_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cbc_mac_x8.asm 00:01:51.700 nasm -MD obj/aes_xcbc_mac_128_x8.d -MT obj/aes_xcbc_mac_128_x8.o -o obj/aes_xcbc_mac_128_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_xcbc_mac_128_x8.asm 00:01:51.700 nasm -MD obj/md5_x4x2_avx.d -MT obj/md5_x4x2_avx.o -o obj/md5_x4x2_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/md5_x4x2_avx.asm 00:01:51.700 mv obj/mb_mgr_hmac_sha_224_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_ni_sse.o 00:01:51.700 nasm -MD obj/sha1_mult_avx.d -MT obj/sha1_mult_avx.o -o obj/sha1_mult_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha1_mult_avx.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x8.o.tmp obj/aes_cbc_enc_128_x8.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by8_avx.o.tmp obj/aes128_cbc_dec_by8_avx.o 00:01:51.700 nasm -MD obj/sha1_one_block_avx.d -MT obj/sha1_one_block_avx.o -o obj/sha1_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha1_one_block_avx.asm 00:01:51.700 nasm -MD obj/sha224_one_block_avx.d -MT obj/sha224_one_block_avx.o -o obj/sha224_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha224_one_block_avx.asm 00:01:51.700 mv obj/aes_cbc_enc_128_x8.o.tmp obj/aes_cbc_enc_128_x8.o 00:01:51.700 nasm -MD obj/sha256_one_block_avx.d -MT obj/sha256_one_block_avx.o -o obj/sha256_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha256_one_block_avx.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_gfni_sse.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_sse.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x8.o.tmp obj/aes_cbc_enc_192_x8.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o.tmp obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o 00:01:51.700 mv obj/aes128_cbc_dec_by8_avx.o.tmp obj/aes128_cbc_dec_by8_avx.o 00:01:51.700 nasm -MD obj/sha_256_mult_avx.d -MT obj/sha_256_mult_avx.o -o obj/sha_256_mult_avx.o -Werror -felf64 
-Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha_256_mult_avx.asm 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x8.o.tmp obj/aes_cbc_enc_256_x8.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x4_no_aesni.o.tmp obj/aes_cbc_enc_192_x4_no_aesni.o 00:01:51.700 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by8_avx.o.tmp obj/aes192_cbc_dec_by8_avx.o 00:01:51.700 mv obj/mb_mgr_zuc_submit_flush_gfni_sse.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_sse.o 00:01:51.700 mv obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o.tmp obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o 00:01:51.700 nasm -MD obj/sha384_one_block_avx.d -MT obj/sha384_one_block_avx.o -o obj/sha384_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha384_one_block_avx.asm 00:01:51.700 mv obj/aes_cbc_enc_192_x8.o.tmp obj/aes_cbc_enc_192_x8.o 00:01:51.700 mv obj/aes_cbc_enc_192_x4_no_aesni.o.tmp obj/aes_cbc_enc_192_x4_no_aesni.o 00:01:51.700 mv obj/aes192_cbc_dec_by8_avx.o.tmp obj/aes192_cbc_dec_by8_avx.o 00:01:51.700 nasm -MD obj/sha512_one_block_avx.d -MT obj/sha512_one_block_avx.o -o obj/sha512_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha512_one_block_avx.asm 00:01:51.701 mv obj/aes_cbc_enc_256_x8.o.tmp obj/aes_cbc_enc_256_x8.o 00:01:51.701 nasm -MD obj/sha512_x2_avx.d -MT obj/sha512_x2_avx.o -o obj/sha512_x2_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha512_x2_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes_ecb_by4_avx.o.tmp obj/aes_ecb_by4_avx.o 00:01:51.701 nasm -MD obj/zuc_avx.d -MT obj/zuc_avx.o -o obj/zuc_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/zuc_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x8.o.tmp obj/aes128_cbc_mac_x8.o 00:01:51.701 nasm -MD obj/mb_mgr_aes_flush_avx.d -MT obj/mb_mgr_aes_flush_avx.o -o obj/mb_mgr_aes_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_flush_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by8_avx.o.tmp obj/aes256_cbc_dec_by8_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha256_one_block_avx.o.tmp obj/sha256_one_block_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha1_one_block_avx.o.tmp obj/sha1_one_block_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes_cfb_avx.o.tmp obj/aes_cfb_avx.o 00:01:51.701 mv obj/aes_ecb_by4_avx.o.tmp obj/aes_ecb_by4_avx.o 00:01:51.701 mv obj/aes128_cbc_mac_x8.o.tmp obj/aes128_cbc_mac_x8.o 00:01:51.701 nasm -MD obj/mb_mgr_aes_submit_avx.d -MT obj/mb_mgr_aes_submit_avx.o -o obj/mb_mgr_aes_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_submit_avx.asm 00:01:51.701 mv obj/aes256_cbc_dec_by8_avx.o.tmp obj/aes256_cbc_dec_by8_avx.o 00:01:51.701 mv obj/sha256_one_block_avx.o.tmp obj/sha256_one_block_avx.o 00:01:51.701 mv obj/sha1_one_block_avx.o.tmp obj/sha1_one_block_avx.o 00:01:51.701 mv obj/aes_cfb_avx.o.tmp obj/aes_cfb_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_aes192_flush_avx.d -MT obj/mb_mgr_aes192_flush_avx.o -o obj/mb_mgr_aes192_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes192_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_aes192_submit_avx.d -MT obj/mb_mgr_aes192_submit_avx.o -o obj/mb_mgr_aes192_submit_avx.o 
-Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes192_submit_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_aes256_flush_avx.d -MT obj/mb_mgr_aes256_flush_avx.o -o obj/mb_mgr_aes256_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_flush_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x8.o.tmp obj/aes256_cbc_mac_x8.o 00:01:51.701 nasm -MD obj/mb_mgr_aes256_submit_avx.d -MT obj/mb_mgr_aes256_submit_avx.o -o obj/mb_mgr_aes256_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_submit_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_avx.d -MT obj/mb_mgr_aes_cmac_submit_flush_avx.o -o obj/mb_mgr_aes_cmac_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_cmac_submit_flush_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes256_cntr_ccm_by8_avx.o.tmp obj/aes256_cntr_ccm_by8_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha512_one_block_avx.o.tmp obj/sha512_one_block_avx.o 00:01:51.701 mv obj/aes256_cbc_mac_x8.o.tmp obj/aes256_cbc_mac_x8.o 00:01:51.701 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_avx.d -MT obj/mb_mgr_aes256_cmac_submit_flush_avx.o -o obj/mb_mgr_aes256_cmac_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_cmac_submit_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_avx.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_ccm_auth_submit_flush_avx.asm 00:01:51.701 mv obj/aes256_cntr_ccm_by8_avx.o.tmp obj/aes256_cntr_ccm_by8_avx.o 00:01:51.701 mv obj/sha512_one_block_avx.o.tmp obj/sha512_one_block_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_ccm_auth_submit_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_aes_xcbc_flush_avx.d -MT obj/mb_mgr_aes_xcbc_flush_avx.o -o obj/mb_mgr_aes_xcbc_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_xcbc_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_aes_xcbc_submit_avx.d -MT obj/mb_mgr_aes_xcbc_submit_avx.o -o obj/mb_mgr_aes_xcbc_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_xcbc_submit_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha224_one_block_avx.o.tmp obj/sha224_one_block_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha384_one_block_avx.o.tmp obj/sha384_one_block_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_hmac_md5_flush_avx.d -MT obj/mb_mgr_hmac_md5_flush_avx.o -o obj/mb_mgr_hmac_md5_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_md5_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_hmac_md5_submit_avx.d -MT obj/mb_mgr_hmac_md5_submit_avx.o -o obj/mb_mgr_hmac_md5_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_md5_submit_avx.asm 00:01:51.701 nasm -MD 
obj/mb_mgr_hmac_flush_avx.d -MT obj/mb_mgr_hmac_flush_avx.o -o obj/mb_mgr_hmac_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_flush_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes_xcbc_mac_128_x8.o.tmp obj/aes_xcbc_mac_128_x8.o 00:01:51.701 mv obj/sha384_one_block_avx.o.tmp obj/sha384_one_block_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_hmac_submit_avx.d -MT obj/mb_mgr_hmac_submit_avx.o -o obj/mb_mgr_hmac_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_submit_avx.asm 00:01:51.701 mv obj/sha224_one_block_avx.o.tmp obj/sha224_one_block_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_avx.o.tmp obj/mb_mgr_aes_flush_avx.o 00:01:51.701 mv obj/aes_xcbc_mac_128_x8.o.tmp obj/aes_xcbc_mac_128_x8.o 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_224_flush_avx.d -MT obj/mb_mgr_hmac_sha_224_flush_avx.o -o obj/mb_mgr_hmac_sha_224_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_224_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_224_submit_avx.d -MT obj/mb_mgr_hmac_sha_224_submit_avx.o -o obj/mb_mgr_hmac_sha_224_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_224_submit_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_256_flush_avx.d -MT obj/mb_mgr_hmac_sha_256_flush_avx.o -o obj/mb_mgr_hmac_sha_256_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_256_flush_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes128_cntr_ccm_by8_avx.o.tmp obj/aes128_cntr_ccm_by8_avx.o 00:01:51.701 mv obj/mb_mgr_aes_flush_avx.o.tmp obj/mb_mgr_aes_flush_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_256_submit_avx.d -MT obj/mb_mgr_hmac_sha_256_submit_avx.o -o obj/mb_mgr_hmac_sha_256_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_256_submit_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_384_flush_avx.d -MT obj/mb_mgr_hmac_sha_384_flush_avx.o -o obj/mb_mgr_hmac_sha_384_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_384_flush_avx.asm 00:01:51.701 mv obj/aes128_cntr_ccm_by8_avx.o.tmp obj/aes128_cntr_ccm_by8_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_384_submit_avx.d -MT obj/mb_mgr_hmac_sha_384_submit_avx.o -o obj/mb_mgr_hmac_sha_384_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_384_submit_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_avx.o.tmp obj/mb_mgr_aes192_flush_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_512_flush_avx.d -MT obj/mb_mgr_hmac_sha_512_flush_avx.o -o obj/mb_mgr_hmac_sha_512_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_512_flush_avx.asm 00:01:51.701 nasm -MD obj/mb_mgr_hmac_sha_512_submit_avx.d -MT obj/mb_mgr_hmac_sha_512_submit_avx.o -o obj/mb_mgr_hmac_sha_512_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_512_submit_avx.asm 00:01:51.701 mv obj/mb_mgr_aes192_flush_avx.o.tmp obj/mb_mgr_aes192_flush_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x4_no_aesni.o.tmp 
obj/aes_cbc_enc_256_x4_no_aesni.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_avx.o.tmp obj/mb_mgr_aes256_submit_avx.o 00:01:51.701 nasm -MD obj/mb_mgr_zuc_submit_flush_avx.d -MT obj/mb_mgr_zuc_submit_flush_avx.o -o obj/mb_mgr_zuc_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_zuc_submit_flush_avx.asm 00:01:51.701 nasm -MD obj/ethernet_fcs_avx.d -MT obj/ethernet_fcs_avx.o -o obj/ethernet_fcs_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/ethernet_fcs_avx.asm 00:01:51.701 nasm -MD obj/crc16_x25_avx.d -MT obj/crc16_x25_avx.o -o obj/crc16_x25_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc16_x25_avx.asm 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_avx.o.tmp obj/mb_mgr_aes_submit_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_avx.o.tmp obj/mb_mgr_aes192_submit_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_avx.o.tmp obj/mb_mgr_aes256_flush_avx.o 00:01:51.701 mv obj/aes_cbc_enc_256_x4_no_aesni.o.tmp obj/aes_cbc_enc_256_x4_no_aesni.o 00:01:51.701 mv obj/mb_mgr_aes256_submit_avx.o.tmp obj/mb_mgr_aes256_submit_avx.o 00:01:51.701 nasm -MD obj/aes_cbcs_1_9_enc_128_x8.d -MT obj/aes_cbcs_1_9_enc_128_x8.o -o obj/aes_cbcs_1_9_enc_128_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbcs_1_9_enc_128_x8.asm 00:01:51.701 nasm -MD obj/aes128_cbcs_1_9_dec_by8_avx.d -MT obj/aes128_cbcs_1_9_dec_by8_avx.o -o obj/aes128_cbcs_1_9_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cbcs_1_9_dec_by8_avx.asm 00:01:51.701 mv obj/mb_mgr_aes_submit_avx.o.tmp obj/mb_mgr_aes_submit_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_flush_avx.o.tmp obj/mb_mgr_aes_xcbc_flush_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_avx.o.tmp obj/mb_mgr_hmac_flush_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_flush_avx.o.tmp obj/mb_mgr_hmac_md5_flush_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha512_x2_avx.o.tmp obj/sha512_x2_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/ethernet_fcs_avx.o.tmp obj/ethernet_fcs_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/crc16_x25_avx.o.tmp obj/crc16_x25_avx.o 00:01:51.701 mv obj/mb_mgr_aes192_submit_avx.o.tmp obj/mb_mgr_aes192_submit_avx.o 00:01:51.701 mv obj/mb_mgr_aes256_flush_avx.o.tmp obj/mb_mgr_aes256_flush_avx.o 00:01:51.701 mv obj/mb_mgr_aes_xcbc_flush_avx.o.tmp obj/mb_mgr_aes_xcbc_flush_avx.o 00:01:51.701 ld -r -z ibt -z shstk -o obj/sha1_mult_avx.o.tmp obj/sha1_mult_avx.o 00:01:51.701 mv obj/mb_mgr_hmac_flush_avx.o.tmp obj/mb_mgr_hmac_flush_avx.o 00:01:51.701 mv obj/mb_mgr_hmac_md5_flush_avx.o.tmp obj/mb_mgr_hmac_md5_flush_avx.o 00:01:51.702 ld -r -z ibt -z shstk -o obj/sha_256_mult_avx.o.tmp obj/sha_256_mult_avx.o 00:01:51.702 mv obj/sha512_x2_avx.o.tmp obj/sha512_x2_avx.o 00:01:51.702 mv obj/ethernet_fcs_avx.o.tmp obj/ethernet_fcs_avx.o 00:01:51.702 mv obj/crc16_x25_avx.o.tmp obj/crc16_x25_avx.o 00:01:51.702 mv obj/sha1_mult_avx.o.tmp obj/sha1_mult_avx.o 00:01:51.702 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_submit_avx.d -MT obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o -o obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes128_cbcs_1_9_submit_avx.asm 
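Note: the recurring ld/mv pairs interleaved with the nasm commands appear to be a post-processing pass over each assembled object: a relocatable re-link that stamps the Intel CET property notes, written to a .o.tmp and then renamed over the original so the object is replaced in one step. A minimal sketch under that reading, with a placeholder file name:

# Illustrative CET-marking step matching the ld/mv pairs in this output.
# ld -r produces a relocatable object; -z ibt and -z shstk add the
# GNU property notes for indirect-branch tracking and shadow stack.
ld -r -z ibt -z shstk -o obj/example_sse.o.tmp obj/example_sse.o
mv obj/example_sse.o.tmp obj/example_sse.o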
00:01:51.963 mv obj/sha_256_mult_avx.o.tmp obj/sha_256_mult_avx.o 00:01:51.963 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_flush_avx.d -MT obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o -o obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes128_cbcs_1_9_flush_avx.asm 00:01:51.963 nasm -MD obj/crc32_refl_by8_avx.d -MT obj/crc32_refl_by8_avx.o -o obj/crc32_refl_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_refl_by8_avx.asm 00:01:51.963 nasm -MD obj/crc32_by8_avx.d -MT obj/crc32_by8_avx.o -o obj/crc32_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_by8_avx.asm 00:01:51.963 nasm -MD obj/crc32_sctp_avx.d -MT obj/crc32_sctp_avx.o -o obj/crc32_sctp_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_sctp_avx.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_avx.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx.o 00:01:51.963 nasm -MD obj/crc32_lte_avx.d -MT obj/crc32_lte_avx.o -o obj/crc32_lte_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_lte_avx.asm 00:01:51.963 nasm -MD obj/crc32_fp_avx.d -MT obj/crc32_fp_avx.o -o obj/crc32_fp_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_fp_avx.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes192_cntr_by8_avx.o.tmp obj/aes192_cntr_by8_avx.o 00:01:51.963 nasm -MD obj/crc32_iuup_avx.d -MT obj/crc32_iuup_avx.o -o obj/crc32_iuup_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_iuup_avx.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_avx.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_sha_224_flush_avx.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes128_cntr_by8_avx.o.tmp obj/aes128_cntr_by8_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_avx.o.tmp obj/mb_mgr_aes_xcbc_submit_avx.o 00:01:51.963 mv obj/aes192_cntr_by8_avx.o.tmp obj/aes192_cntr_by8_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/crc32_sctp_avx.o.tmp obj/crc32_sctp_avx.o 00:01:51.963 nasm -MD obj/crc32_wimax_avx.d -MT obj/crc32_wimax_avx.o -o obj/crc32_wimax_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_wimax_avx.asm 00:01:51.963 mv obj/mb_mgr_hmac_sha_256_flush_avx.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx.o 00:01:51.963 nasm -MD obj/chacha20_avx.d -MT obj/chacha20_avx.o -o obj/chacha20_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/chacha20_avx.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x4_no_aesni.o.tmp obj/aes256_cbc_mac_x4_no_aesni.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_submit_avx.o.tmp obj/mb_mgr_hmac_md5_submit_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes128_cbcs_1_9_dec_by8_avx.o.tmp obj/aes128_cbcs_1_9_dec_by8_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/crc32_lte_avx.o.tmp obj/crc32_lte_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/crc32_refl_by8_avx.o.tmp obj/crc32_refl_by8_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o 
obj/crc32_fp_avx.o.tmp obj/crc32_fp_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/crc32_iuup_avx.o.tmp obj/crc32_iuup_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/crc32_by8_avx.o.tmp obj/crc32_by8_avx.o 00:01:51.963 mv obj/aes128_cntr_by8_avx.o.tmp obj/aes128_cntr_by8_avx.o 00:01:51.963 mv obj/mb_mgr_aes_xcbc_submit_avx.o.tmp obj/mb_mgr_aes_xcbc_submit_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_avx.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx.o 00:01:51.963 mv obj/crc32_sctp_avx.o.tmp obj/crc32_sctp_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/crc32_wimax_avx.o.tmp obj/crc32_wimax_avx.o 00:01:51.963 mv obj/aes256_cbc_mac_x4_no_aesni.o.tmp obj/aes256_cbc_mac_x4_no_aesni.o 00:01:51.963 mv obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_md5_submit_avx.o.tmp obj/mb_mgr_hmac_md5_submit_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_avx.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx.o 00:01:51.963 mv obj/aes128_cbcs_1_9_dec_by8_avx.o.tmp obj/aes128_cbcs_1_9_dec_by8_avx.o 00:01:51.963 mv obj/crc32_lte_avx.o.tmp obj/crc32_lte_avx.o 00:01:51.963 mv obj/crc32_refl_by8_avx.o.tmp obj/crc32_refl_by8_avx.o 00:01:51.963 mv obj/crc32_fp_avx.o.tmp obj/crc32_fp_avx.o 00:01:51.963 mv obj/crc32_iuup_avx.o.tmp obj/crc32_iuup_avx.o 00:01:51.963 mv obj/crc32_by8_avx.o.tmp obj/crc32_by8_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_sha_384_flush_avx.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx.o 00:01:51.963 mv obj/crc32_wimax_avx.o.tmp obj/crc32_wimax_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_sha_224_submit_avx.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx.o 00:01:51.963 nasm -MD obj/memcpy_avx.d -MT obj/memcpy_avx.o -o obj/memcpy_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/memcpy_avx.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_avx.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_avx.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx.o 00:01:51.963 mv obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o 00:01:51.963 nasm -MD obj/gcm128_avx_gen2.d -MT obj/gcm128_avx_gen2.o -o obj/gcm128_avx_gen2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/gcm128_avx_gen2.asm 00:01:51.963 nasm -MD obj/gcm192_avx_gen2.d -MT obj/gcm192_avx_gen2.o -o obj/gcm192_avx_gen2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/gcm192_avx_gen2.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes_cmac_submit_flush_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_avx.o.tmp obj/mb_mgr_hmac_submit_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_sha_512_submit_avx.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_sha_512_flush_avx.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx.o 00:01:51.963 mv obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes_cbcs_1_9_enc_128_x8.o.tmp obj/aes_cbcs_1_9_enc_128_x8.o 00:01:51.963 nasm -MD obj/gcm256_avx_gen2.d -MT obj/gcm256_avx_gen2.o -o 
obj/gcm256_avx_gen2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/gcm256_avx_gen2.asm 00:01:51.963 mv obj/mb_mgr_aes_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes_cmac_submit_flush_avx.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/memcpy_avx.o.tmp obj/memcpy_avx.o 00:01:51.963 mv obj/mb_mgr_hmac_submit_avx.o.tmp obj/mb_mgr_hmac_submit_avx.o 00:01:51.963 nasm -MD obj/md5_x8x2_avx2.d -MT obj/md5_x8x2_avx2.o -o obj/md5_x8x2_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/md5_x8x2_avx2.asm 00:01:51.963 nasm -MD obj/sha1_x8_avx2.d -MT obj/sha1_x8_avx2.o -o obj/sha1_x8_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/sha1_x8_avx2.asm 00:01:51.963 mv obj/aes_cbcs_1_9_enc_128_x8.o.tmp obj/aes_cbcs_1_9_enc_128_x8.o 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes256_cntr_by8_avx.o.tmp obj/aes256_cntr_by8_avx.o 00:01:51.963 mv obj/memcpy_avx.o.tmp obj/memcpy_avx.o 00:01:51.963 nasm -MD obj/sha256_oct_avx2.d -MT obj/sha256_oct_avx2.o -o obj/sha256_oct_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/sha256_oct_avx2.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o 00:01:51.963 nasm -MD obj/sha512_x4_avx2.d -MT obj/sha512_x4_avx2.o -o obj/sha512_x4_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/sha512_x4_avx2.asm 00:01:51.963 nasm -MD obj/zuc_avx2.d -MT obj/zuc_avx2.o -o obj/zuc_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/zuc_avx2.asm 00:01:51.963 nasm -MD obj/mb_mgr_hmac_md5_flush_avx2.d -MT obj/mb_mgr_hmac_md5_flush_avx2.o -o obj/mb_mgr_hmac_md5_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_md5_flush_avx2.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_avx.o 00:01:51.963 mv obj/aes256_cntr_by8_avx.o.tmp obj/aes256_cntr_by8_avx.o 00:01:51.963 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o 00:01:51.963 nasm -MD obj/mb_mgr_hmac_md5_submit_avx2.d -MT obj/mb_mgr_hmac_md5_submit_avx2.o -o obj/mb_mgr_hmac_md5_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_md5_submit_avx2.asm 00:01:51.963 mv obj/mb_mgr_aes256_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_avx.o 00:01:51.963 nasm -MD obj/mb_mgr_hmac_flush_avx2.d -MT obj/mb_mgr_hmac_flush_avx2.o -o obj/mb_mgr_hmac_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_flush_avx2.asm 00:01:51.963 nasm -MD obj/mb_mgr_hmac_submit_avx2.d -MT obj/mb_mgr_hmac_submit_avx2.o -o obj/mb_mgr_hmac_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_submit_avx2.asm 00:01:51.963 nasm -MD obj/mb_mgr_hmac_sha_224_flush_avx2.d -MT obj/mb_mgr_hmac_sha_224_flush_avx2.o -o obj/mb_mgr_hmac_sha_224_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_224_flush_avx2.asm 00:01:51.963 nasm -MD obj/mb_mgr_hmac_sha_224_submit_avx2.d -MT 
obj/mb_mgr_hmac_sha_224_submit_avx2.o -o obj/mb_mgr_hmac_sha_224_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_224_submit_avx2.asm 00:01:51.963 nasm -MD obj/mb_mgr_hmac_sha_256_flush_avx2.d -MT obj/mb_mgr_hmac_sha_256_flush_avx2.o -o obj/mb_mgr_hmac_sha_256_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_256_flush_avx2.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbc_dec_by4_sse_no_aesni.o 00:01:51.963 nasm -MD obj/mb_mgr_hmac_sha_256_submit_avx2.d -MT obj/mb_mgr_hmac_sha_256_submit_avx2.o -o obj/mb_mgr_hmac_sha_256_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_256_submit_avx2.asm 00:01:51.963 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_avx.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx.o 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_384_flush_avx2.d -MT obj/mb_mgr_hmac_sha_384_flush_avx2.o -o obj/mb_mgr_hmac_sha_384_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_384_flush_avx2.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_384_submit_avx2.d -MT obj/mb_mgr_hmac_sha_384_submit_avx2.o -o obj/mb_mgr_hmac_sha_384_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_384_submit_avx2.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/md5_x4x2_avx.o.tmp obj/md5_x4x2_avx.o 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_512_flush_avx2.d -MT obj/mb_mgr_hmac_sha_512_flush_avx2.o -o obj/mb_mgr_hmac_sha_512_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_512_flush_avx2.asm 00:01:51.964 mv obj/mb_mgr_hmac_sha_384_submit_avx.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx.o 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_512_submit_avx2.d -MT obj/mb_mgr_hmac_sha_512_submit_avx2.o -o obj/mb_mgr_hmac_sha_512_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_512_submit_avx2.asm 00:01:51.964 mv obj/aes128_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbc_dec_by4_sse_no_aesni.o 00:01:51.964 nasm -MD obj/mb_mgr_zuc_submit_flush_avx2.d -MT obj/mb_mgr_zuc_submit_flush_avx2.o -o obj/mb_mgr_zuc_submit_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_zuc_submit_flush_avx2.asm 00:01:51.964 mv obj/md5_x4x2_avx.o.tmp obj/md5_x4x2_avx.o 00:01:51.964 nasm -MD obj/chacha20_avx2.d -MT obj/chacha20_avx2.o -o obj/chacha20_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/chacha20_avx2.asm 00:01:51.964 nasm -MD obj/gcm128_avx_gen4.d -MT obj/gcm128_avx_gen4.o -o obj/gcm128_avx_gen4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/gcm128_avx_gen4.asm 00:01:51.964 nasm -MD obj/gcm192_avx_gen4.d -MT obj/gcm192_avx_gen4.o -o obj/gcm192_avx_gen4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/gcm192_avx_gen4.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_avx.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx.o 00:01:51.964 nasm -MD obj/gcm256_avx_gen4.d -MT obj/gcm256_avx_gen4.o -o obj/gcm256_avx_gen4.o -Werror 
-felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/gcm256_avx_gen4.asm 00:01:51.964 nasm -MD obj/sha1_x16_avx512.d -MT obj/sha1_x16_avx512.o -o obj/sha1_x16_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/sha1_x16_avx512.asm 00:01:51.964 nasm -MD obj/sha256_x16_avx512.d -MT obj/sha256_x16_avx512.o -o obj/sha256_x16_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/sha256_x16_avx512.asm 00:01:51.964 nasm -MD obj/sha512_x8_avx512.d -MT obj/sha512_x8_avx512.o -o obj/sha512_x8_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/sha512_x8_avx512.asm 00:01:51.964 mv obj/mb_mgr_hmac_sha_256_submit_avx.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx.o 00:01:51.964 nasm -MD obj/des_x16_avx512.d -MT obj/des_x16_avx512.o -o obj/des_x16_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/des_x16_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_flush_avx2.o.tmp obj/mb_mgr_hmac_md5_flush_avx2.o 00:01:51.964 nasm -MD obj/cntr_vaes_avx512.d -MT obj/cntr_vaes_avx512.o -o obj/cntr_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/cntr_vaes_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_avx2.o.tmp obj/mb_mgr_hmac_flush_avx2.o 00:01:51.964 nasm -MD obj/cntr_ccm_vaes_avx512.d -MT obj/cntr_ccm_vaes_avx512.o -o obj/cntr_ccm_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/cntr_ccm_vaes_avx512.asm 00:01:51.964 mv obj/mb_mgr_hmac_md5_flush_avx2.o.tmp obj/mb_mgr_hmac_md5_flush_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_flush_avx2.o.tmp obj/mb_mgr_hmac_flush_avx2.o 00:01:51.964 nasm -MD obj/aes_cbc_dec_vaes_avx512.d -MT obj/aes_cbc_dec_vaes_avx512.o -o obj/aes_cbc_dec_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbc_dec_vaes_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_avx.o.tmp obj/mb_mgr_zuc_submit_flush_avx.o 00:01:51.964 nasm -MD obj/aes_cbc_enc_vaes_avx512.d -MT obj/aes_cbc_enc_vaes_avx512.o -o obj/aes_cbc_enc_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbc_enc_vaes_avx512.asm 00:01:51.964 nasm -MD obj/aes_cbcs_enc_vaes_avx512.d -MT obj/aes_cbcs_enc_vaes_avx512.o -o obj/aes_cbcs_enc_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbcs_enc_vaes_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/sha256_oct_avx2.o.tmp obj/sha256_oct_avx2.o 00:01:51.964 mv obj/mb_mgr_zuc_submit_flush_avx.o.tmp obj/mb_mgr_zuc_submit_flush_avx.o 00:01:51.964 nasm -MD obj/aes_cbcs_dec_vaes_avx512.d -MT obj/aes_cbcs_dec_vaes_avx512.o -o obj/aes_cbcs_dec_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbcs_dec_vaes_avx512.asm 00:01:51.964 nasm -MD obj/aes_docsis_dec_avx512.d -MT obj/aes_docsis_dec_avx512.o -o obj/aes_docsis_dec_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_dec_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_submit_avx2.o.tmp obj/mb_mgr_hmac_md5_submit_avx2.o 00:01:51.964 ld 
-r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_avx2.o.tmp obj/mb_mgr_hmac_submit_avx2.o 00:01:51.964 mv obj/sha256_oct_avx2.o.tmp obj/sha256_oct_avx2.o 00:01:51.964 nasm -MD obj/aes_docsis_enc_avx512.d -MT obj/aes_docsis_enc_avx512.o -o obj/aes_docsis_enc_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_enc_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/pon_avx.o.tmp obj/pon_avx.o 00:01:51.964 mv obj/mb_mgr_hmac_md5_submit_avx2.o.tmp obj/mb_mgr_hmac_md5_submit_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_224_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_submit_avx2.o.tmp obj/mb_mgr_hmac_submit_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_256_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_256_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_384_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx2.o 00:01:51.964 mv obj/pon_avx.o.tmp obj/pon_avx.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_384_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx2.o 00:01:51.964 nasm -MD obj/aes_docsis_dec_vaes_avx512.d -MT obj/aes_docsis_dec_vaes_avx512.o -o obj/aes_docsis_dec_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_dec_vaes_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/sha512_x4_avx2.o.tmp obj/sha512_x4_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_512_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx2.o 00:01:51.964 nasm -MD obj/aes_docsis_enc_vaes_avx512.d -MT obj/aes_docsis_enc_vaes_avx512.o -o obj/aes_docsis_enc_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_enc_vaes_avx512.asm 00:01:51.964 nasm -MD obj/zuc_avx512.d -MT obj/zuc_avx512.o -o obj/zuc_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/zuc_avx512.asm 00:01:51.964 mv obj/sha512_x4_avx2.o.tmp obj/sha512_x4_avx2.o 00:01:51.964 nasm -MD obj/mb_mgr_aes_submit_avx512.d -MT obj/mb_mgr_aes_submit_avx512.o -o obj/mb_mgr_aes_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_submit_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/sha1_x8_avx2.o.tmp obj/sha1_x8_avx2.o 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx2.o 00:01:51.964 nasm -MD obj/mb_mgr_aes_flush_avx512.d -MT obj/mb_mgr_aes_flush_avx512.o -o obj/mb_mgr_aes_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_flush_avx512.asm 00:01:51.964 nasm -MD 
obj/mb_mgr_aes192_submit_avx512.d -MT obj/mb_mgr_aes192_submit_avx512.o -o obj/mb_mgr_aes192_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes192_submit_avx512.asm 00:01:51.964 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx2.o 00:01:51.964 mv obj/sha1_x8_avx2.o.tmp obj/sha1_x8_avx2.o 00:01:51.964 mv obj/mb_mgr_hmac_sha_224_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx2.o 00:01:51.964 nasm -MD obj/mb_mgr_aes192_flush_avx512.d -MT obj/mb_mgr_aes192_flush_avx512.o -o obj/mb_mgr_aes192_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes192_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_aes256_submit_avx512.d -MT obj/mb_mgr_aes256_submit_avx512.o -o obj/mb_mgr_aes256_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_submit_avx512.asm 00:01:51.964 mv obj/mb_mgr_hmac_sha_512_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx2.o 00:01:51.964 nasm -MD obj/mb_mgr_aes256_flush_avx512.d -MT obj/mb_mgr_aes256_flush_avx512.o -o obj/mb_mgr_aes256_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_flush_avx512.d -MT obj/mb_mgr_hmac_flush_avx512.o -o obj/mb_mgr_hmac_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_submit_avx512.d -MT obj/mb_mgr_hmac_submit_avx512.o -o obj/mb_mgr_hmac_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_submit_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_224_flush_avx512.d -MT obj/mb_mgr_hmac_sha_224_flush_avx512.o -o obj/mb_mgr_hmac_sha_224_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_224_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_224_submit_avx512.d -MT obj/mb_mgr_hmac_sha_224_submit_avx512.o -o obj/mb_mgr_hmac_sha_224_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_224_submit_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_256_flush_avx512.d -MT obj/mb_mgr_hmac_sha_256_flush_avx512.o -o obj/mb_mgr_hmac_sha_256_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_256_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_256_submit_avx512.d -MT obj/mb_mgr_hmac_sha_256_submit_avx512.o -o obj/mb_mgr_hmac_sha_256_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_256_submit_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_384_flush_avx512.d -MT obj/mb_mgr_hmac_sha_384_flush_avx512.o -o obj/mb_mgr_hmac_sha_384_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_384_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_384_submit_avx512.d -MT obj/mb_mgr_hmac_sha_384_submit_avx512.o -o obj/mb_mgr_hmac_sha_384_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA 
-DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_384_submit_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_512_flush_avx512.d -MT obj/mb_mgr_hmac_sha_512_flush_avx512.o -o obj/mb_mgr_hmac_sha_512_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_512_flush_avx512.asm 00:01:51.964 nasm -MD obj/mb_mgr_hmac_sha_512_submit_avx512.d -MT obj/mb_mgr_hmac_sha_512_submit_avx512.o -o obj/mb_mgr_hmac_sha_512_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_512_submit_avx512.asm 00:01:51.965 nasm -MD obj/mb_mgr_des_avx512.d -MT obj/mb_mgr_des_avx512.o -o obj/mb_mgr_des_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_des_avx512.asm 00:01:51.965 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_cmac_submit_flush_vaes_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/sha1_x16_avx512.o.tmp obj/sha1_x16_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/aes_cbcs_enc_vaes_avx512.o.tmp obj/aes_cbcs_enc_vaes_avx512.o 00:01:51.965 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.asm 00:01:51.965 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.asm 00:01:51.965 mv obj/sha1_x16_avx512.o.tmp obj/sha1_x16_avx512.o 00:01:51.965 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.asm 00:01:51.965 mv obj/aes_cbcs_enc_vaes_avx512.o.tmp obj/aes_cbcs_enc_vaes_avx512.o 00:01:51.965 nasm -MD obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.asm 00:01:51.965 nasm -MD obj/mb_mgr_zuc_submit_flush_avx512.d -MT obj/mb_mgr_zuc_submit_flush_avx512.o -o obj/mb_mgr_zuc_submit_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_zuc_submit_flush_avx512.asm 00:01:51.965 nasm -MD obj/mb_mgr_zuc_submit_flush_gfni_avx512.d -MT obj/mb_mgr_zuc_submit_flush_gfni_avx512.o -o obj/mb_mgr_zuc_submit_flush_gfni_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_zuc_submit_flush_gfni_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_avx512.o.tmp obj/mb_mgr_aes_submit_avx512.o 00:01:51.965 nasm -MD 
obj/chacha20_avx512.d -MT obj/chacha20_avx512.o -o obj/chacha20_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/chacha20_avx512.asm 00:01:51.965 nasm -MD obj/poly_avx512.d -MT obj/poly_avx512.o -o obj/poly_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/poly_avx512.asm 00:01:51.965 nasm -MD obj/poly_fma_avx512.d -MT obj/poly_fma_avx512.o -o obj/poly_fma_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/poly_fma_avx512.asm 00:01:51.965 nasm -MD obj/ethernet_fcs_avx512.d -MT obj/ethernet_fcs_avx512.o -o obj/ethernet_fcs_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/ethernet_fcs_avx512.asm 00:01:51.965 mv obj/mb_mgr_aes_submit_avx512.o.tmp obj/mb_mgr_aes_submit_avx512.o 00:01:51.965 nasm -MD obj/crc16_x25_avx512.d -MT obj/crc16_x25_avx512.o -o obj/crc16_x25_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc16_x25_avx512.asm 00:01:51.965 nasm -MD obj/crc32_refl_by16_vclmul_avx512.d -MT obj/crc32_refl_by16_vclmul_avx512.o -o obj/crc32_refl_by16_vclmul_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_refl_by16_vclmul_avx512.asm 00:01:51.965 nasm -MD obj/crc32_by16_vclmul_avx512.d -MT obj/crc32_by16_vclmul_avx512.o -o obj/crc32_by16_vclmul_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_by16_vclmul_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o 00:01:51.965 nasm -MD obj/mb_mgr_aes_cbcs_1_9_submit_avx512.d -MT obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o -o obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_cbcs_1_9_submit_avx512.asm 00:01:51.965 nasm -MD obj/mb_mgr_aes_cbcs_1_9_flush_avx512.d -MT obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o -o obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_cbcs_1_9_flush_avx512.asm 00:01:51.965 nasm -MD obj/crc32_sctp_avx512.d -MT obj/crc32_sctp_avx512.o -o obj/crc32_sctp_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_sctp_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_avx512.o.tmp obj/mb_mgr_aes192_submit_avx512.o 00:01:51.965 mv obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o 00:01:51.965 nasm -MD obj/crc32_lte_avx512.d -MT obj/crc32_lte_avx512.o -o obj/crc32_lte_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_lte_avx512.asm 00:01:51.965 nasm -MD obj/crc32_fp_avx512.d -MT obj/crc32_fp_avx512.o -o obj/crc32_fp_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_fp_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/ethernet_fcs_avx512.o.tmp obj/ethernet_fcs_avx512.o 00:01:51.965 nasm -MD obj/crc32_iuup_avx512.d -MT obj/crc32_iuup_avx512.o -o obj/crc32_iuup_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP 
avx512/crc32_iuup_avx512.asm 00:01:51.965 mv obj/mb_mgr_aes192_submit_avx512.o.tmp obj/mb_mgr_aes192_submit_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_sctp_avx512.o.tmp obj/crc32_sctp_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_avx512.o.tmp obj/mb_mgr_hmac_flush_avx512.o 00:01:51.965 nasm -MD obj/crc32_wimax_avx512.d -MT obj/crc32_wimax_avx512.o -o obj/crc32_wimax_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_wimax_avx512.asm 00:01:51.965 mv obj/ethernet_fcs_avx512.o.tmp obj/ethernet_fcs_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes192_cbc_dec_by4_sse_no_aesni.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_avx512.o.tmp obj/mb_mgr_aes256_submit_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_avx512.o.tmp obj/mb_mgr_hmac_submit_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_refl_by16_vclmul_avx512.o.tmp obj/crc32_refl_by16_vclmul_avx512.o 00:01:51.965 mv obj/crc32_sctp_avx512.o.tmp obj/crc32_sctp_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_by16_vclmul_avx512.o.tmp obj/crc32_by16_vclmul_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_lte_avx512.o.tmp obj/crc32_lte_avx512.o 00:01:51.965 mv obj/mb_mgr_hmac_flush_avx512.o.tmp obj/mb_mgr_hmac_flush_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx512.o 00:01:51.965 mv obj/mb_mgr_aes256_submit_avx512.o.tmp obj/mb_mgr_aes256_submit_avx512.o 00:01:51.965 mv obj/crc32_refl_by16_vclmul_avx512.o.tmp obj/crc32_refl_by16_vclmul_avx512.o 00:01:51.965 mv obj/aes192_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes192_cbc_dec_by4_sse_no_aesni.o 00:01:51.965 mv obj/mb_mgr_hmac_submit_avx512.o.tmp obj/mb_mgr_hmac_submit_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_avx512.o.tmp obj/mb_mgr_aes_flush_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_fp_avx512.o.tmp obj/crc32_fp_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_iuup_avx512.o.tmp obj/crc32_iuup_avx512.o 00:01:51.965 mv obj/crc32_by16_vclmul_avx512.o.tmp obj/crc32_by16_vclmul_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc16_x25_avx512.o.tmp obj/crc16_x25_avx512.o 00:01:51.965 mv obj/crc32_lte_avx512.o.tmp obj/crc32_lte_avx512.o 00:01:51.965 mv obj/mb_mgr_hmac_sha_512_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/sha256_x16_avx512.o.tmp obj/sha256_x16_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/crc32_wimax_avx512.o.tmp obj/crc32_wimax_avx512.o 00:01:51.965 mv obj/mb_mgr_aes_flush_avx512.o.tmp obj/mb_mgr_aes_flush_avx512.o 00:01:51.965 mv obj/crc32_fp_avx512.o.tmp obj/crc32_fp_avx512.o 00:01:51.965 mv obj/crc32_iuup_avx512.o.tmp obj/crc32_iuup_avx512.o 00:01:51.965 mv obj/crc16_x25_avx512.o.tmp obj/crc16_x25_avx512.o 00:01:51.965 nasm -MD obj/gcm128_vaes_avx512.d -MT obj/gcm128_vaes_avx512.o -o obj/gcm128_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm128_vaes_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/md5_x8x2_avx2.o.tmp obj/md5_x8x2_avx2.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx512.o 00:01:51.965 mv obj/sha256_x16_avx512.o.tmp obj/sha256_x16_avx512.o 00:01:51.965 mv obj/crc32_wimax_avx512.o.tmp obj/crc32_wimax_avx512.o 00:01:51.965 nasm -MD 
obj/gcm192_vaes_avx512.d -MT obj/gcm192_vaes_avx512.o -o obj/gcm192_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm192_vaes_avx512.asm 00:01:51.965 mv obj/md5_x8x2_avx2.o.tmp obj/md5_x8x2_avx2.o 00:01:51.965 mv obj/mb_mgr_hmac_sha_256_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx512.o 00:01:51.965 nasm -MD obj/gcm256_vaes_avx512.d -MT obj/gcm256_vaes_avx512.o -o obj/gcm256_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm256_vaes_avx512.asm 00:01:51.965 nasm -MD obj/gcm128_avx512.d -MT obj/gcm128_avx512.o -o obj/gcm128_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm128_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx512.o 00:01:51.965 nasm -MD obj/gcm192_avx512.d -MT obj/gcm192_avx512.o -o obj/gcm192_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm192_avx512.asm 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_avx512.o.tmp obj/mb_mgr_aes192_flush_avx512.o 00:01:51.965 nasm -MD obj/gcm256_avx512.d -MT obj/gcm256_avx512.o -o obj/gcm256_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm256_avx512.asm 00:01:51.965 mv obj/mb_mgr_hmac_sha_384_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx512.o 00:01:51.965 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/mb_mgr_avx.c -o obj/mb_mgr_avx.o 00:01:51.965 gcc -MMD -march=haswell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx2/mb_mgr_avx2.c -o obj/mb_mgr_avx2.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx512.o 00:01:51.965 mv obj/mb_mgr_aes192_flush_avx512.o.tmp obj/mb_mgr_aes192_flush_avx512.o 00:01:51.965 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx512.o 00:01:51.965 gcc -MMD -march=broadwell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx512/mb_mgr_avx512.c -o obj/mb_mgr_avx512.o 00:01:51.966 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/mb_mgr_sse.c -o obj/mb_mgr_sse.o 00:01:51.966 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/mb_mgr_sse_no_aesni.c -o obj/mb_mgr_sse_no_aesni.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/alloc.c -o obj/alloc.o 00:01:51.966 mv obj/mb_mgr_hmac_sha_224_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx512.o 00:01:51.966 mv obj/mb_mgr_hmac_sha_224_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx512.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/aes_xcbc_expand_key.c -o obj/aes_xcbc_expand_key.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/md5_one_block.c -o obj/md5_one_block.o 00:01:51.966 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/sha_sse.c -o obj/sha_sse.o 00:01:51.966 ld -r -z ibt -z shstk -o obj/sha512_x8_avx512.o.tmp obj/sha512_x8_avx512.o 00:01:51.966 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx512.o 00:01:51.966 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o 00:01:51.966 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/sha_avx.c -o obj/sha_avx.o 00:01:51.966 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx512.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/des_key.c -o obj/des_key.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/des_basic.c -o obj/des_basic.o 00:01:51.966 mv obj/sha512_x8_avx512.o.tmp obj/sha512_x8_avx512.o 00:01:51.966 mv obj/mb_mgr_hmac_sha_384_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx512.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/version.c -o obj/version.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/cpu_feature.c -o obj/cpu_feature.o 00:01:51.966 mv obj/mb_mgr_hmac_sha_512_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx512.o 00:01:51.966 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/aesni_emu.c -o obj/aesni_emu.o 00:01:51.966 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/kasumi_avx.c -o obj/kasumi_avx.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/kasumi_iv.c -o obj/kasumi_iv.o 00:01:51.966 mv obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o 00:01:51.966 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/kasumi_sse.c -o obj/kasumi_sse.o 00:01:51.966 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/zuc_sse_top.c -o obj/zuc_sse_top.o 00:01:51.966 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/zuc_sse_no_aesni_top.c -o obj/zuc_sse_no_aesni_top.o 00:01:51.966 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx512.o 00:01:51.966 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/zuc_avx_top.c -o obj/zuc_avx_top.o 00:01:51.966 gcc -MMD -march=haswell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx2/zuc_avx2_top.c -o obj/zuc_avx2_top.o 00:01:51.966 gcc -MMD -march=broadwell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx512/zuc_avx512_top.c -o obj/zuc_avx512_top.o 00:01:51.966 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/zuc_iv.c -o obj/zuc_iv.o 00:01:51.966 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o 00:01:51.966 mv obj/mb_mgr_hmac_sha_256_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx512.o 00:01:51.966 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/snow3g_sse.c -o obj/snow3g_sse.o 00:01:51.966 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/snow3g_sse_no_aesni.c -o obj/snow3g_sse_no_aesni.o 00:01:51.967 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/snow3g_avx.c -o obj/snow3g_avx.o 00:01:51.967 gcc -MMD -march=haswell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx2/snow3g_avx2.c -o obj/snow3g_avx2.o 00:01:51.967 mv obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o 00:01:51.967 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/snow3g_tables.c -o obj/snow3g_tables.o 00:01:51.967 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/snow3g_iv.c -o obj/snow3g_iv.o 00:01:51.967 nasm -MD obj/snow_v_sse.d -MT obj/snow_v_sse.o -o obj/snow_v_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/snow_v_sse.asm 00:01:51.967 nasm -MD obj/snow_v_sse_noaesni.d -MT obj/snow_v_sse_noaesni.o -o obj/snow_v_sse_noaesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/snow_v_sse_noaesni.asm 00:01:51.967 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/mb_mgr_auto.c -o obj/mb_mgr_auto.o 00:01:51.967 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/error.c -o obj/error.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_avx512.o.tmp obj/mb_mgr_aes256_flush_avx512.o 00:01:51.967 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/gcm.c -o obj/gcm.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_des_avx512.o.tmp obj/mb_mgr_des_avx512.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_avx512.o 00:01:51.967 mv obj/mb_mgr_aes256_flush_avx512.o.tmp obj/mb_mgr_aes256_flush_avx512.o 00:01:51.967 mv obj/mb_mgr_des_avx512.o.tmp obj/mb_mgr_des_avx512.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_gfni_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_avx512.o 00:01:51.967 mv obj/mb_mgr_zuc_submit_flush_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_avx512.o 00:01:51.967 mv obj/mb_mgr_zuc_submit_flush_gfni_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_avx512.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o 00:01:51.967 mv obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o 00:01:51.967 mv obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o 00:01:51.967 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o 00:01:51.967 mv obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/poly_avx512.o.tmp obj/poly_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/snow_v_sse.o.tmp obj/snow_v_sse.o 00:01:52.226 mv obj/poly_avx512.o.tmp obj/poly_avx512.o 00:01:52.226 mv obj/snow_v_sse.o.tmp obj/snow_v_sse.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes256_cbc_dec_by4_sse_no_aesni.o 00:01:52.226 mv obj/aes256_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes256_cbc_dec_by4_sse_no_aesni.o 00:01:52.226 mv obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/poly_fma_avx512.o.tmp obj/poly_fma_avx512.o 00:01:52.226 mv obj/poly_fma_avx512.o.tmp obj/poly_fma_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/aes_cbcs_dec_vaes_avx512.o.tmp obj/aes_cbcs_dec_vaes_avx512.o 00:01:52.226 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o 00:01:52.226 mv obj/aes_cbcs_dec_vaes_avx512.o.tmp obj/aes_cbcs_dec_vaes_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/snow_v_sse_noaesni.o.tmp obj/snow_v_sse_noaesni.o 00:01:52.226 mv obj/snow_v_sse_noaesni.o.tmp obj/snow_v_sse_noaesni.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/zuc_common.o.tmp obj/zuc_common.o 00:01:52.226 mv obj/zuc_common.o.tmp obj/zuc_common.o 00:01:52.226 ld -r -z ibt -z 
shstk -o obj/aes_docsis_dec_avx512.o.tmp obj/aes_docsis_dec_avx512.o 00:01:52.226 mv obj/aes_docsis_dec_avx512.o.tmp obj/aes_docsis_dec_avx512.o 00:01:52.226 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_avx2.o.tmp obj/mb_mgr_zuc_submit_flush_avx2.o 00:01:52.226 mv obj/mb_mgr_zuc_submit_flush_avx2.o.tmp obj/mb_mgr_zuc_submit_flush_avx2.o 00:01:52.484 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_vaes_avx512.o.tmp obj/aes_cbc_enc_vaes_avx512.o 00:01:52.484 mv obj/aes_cbc_enc_vaes_avx512.o.tmp obj/aes_cbc_enc_vaes_avx512.o 00:01:52.742 ld -r -z ibt -z shstk -o obj/zuc_sse_gfni.o.tmp obj/zuc_sse_gfni.o 00:01:52.742 mv obj/zuc_sse_gfni.o.tmp obj/zuc_sse_gfni.o 00:01:52.742 ld -r -z ibt -z shstk -o obj/chacha20_avx.o.tmp obj/chacha20_avx.o 00:01:52.742 mv obj/chacha20_avx.o.tmp obj/chacha20_avx.o 00:01:52.742 ld -r -z ibt -z shstk -o obj/aes_docsis_enc_vaes_avx512.o.tmp obj/aes_docsis_enc_vaes_avx512.o 00:01:52.742 mv obj/aes_docsis_enc_vaes_avx512.o.tmp obj/aes_docsis_enc_vaes_avx512.o 00:01:52.742 ld -r -z ibt -z shstk -o obj/aes_docsis_enc_avx512.o.tmp obj/aes_docsis_enc_avx512.o 00:01:52.742 mv obj/aes_docsis_enc_avx512.o.tmp obj/aes_docsis_enc_avx512.o 00:01:52.742 ld -r -z ibt -z shstk -o obj/pon_sse_no_aesni.o.tmp obj/pon_sse_no_aesni.o 00:01:52.742 mv obj/pon_sse_no_aesni.o.tmp obj/pon_sse_no_aesni.o 00:01:52.742 ld -r -z ibt -z shstk -o obj/chacha20_avx2.o.tmp obj/chacha20_avx2.o 00:01:52.742 mv obj/chacha20_avx2.o.tmp obj/chacha20_avx2.o 00:01:53.000 ld -r -z ibt -z shstk -o obj/zuc_sse.o.tmp obj/zuc_sse.o 00:01:53.000 mv obj/zuc_sse.o.tmp obj/zuc_sse.o 00:01:53.257 ld -r -z ibt -z shstk -o obj/zuc_sse_no_aesni.o.tmp obj/zuc_sse_no_aesni.o 00:01:53.257 mv obj/zuc_sse_no_aesni.o.tmp obj/zuc_sse_no_aesni.o 00:01:53.257 ld -r -z ibt -z shstk -o obj/aes_cbc_dec_vaes_avx512.o.tmp obj/aes_cbc_dec_vaes_avx512.o 00:01:53.257 mv obj/aes_cbc_dec_vaes_avx512.o.tmp obj/aes_cbc_dec_vaes_avx512.o 00:01:53.257 ld -r -z ibt -z shstk -o obj/aes128_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes128_cntr_ccm_by8_sse_no_aesni.o 00:01:53.257 ld -r -z ibt -z shstk -o obj/zuc_avx.o.tmp obj/zuc_avx.o 00:01:53.257 mv obj/aes128_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes128_cntr_ccm_by8_sse_no_aesni.o 00:01:53.257 mv obj/zuc_avx.o.tmp obj/zuc_avx.o 00:01:53.257 ld -r -z ibt -z shstk -o obj/gcm128_sse.o.tmp obj/gcm128_sse.o 00:01:53.258 mv obj/gcm128_sse.o.tmp obj/gcm128_sse.o 00:01:53.515 ld -r -z ibt -z shstk -o obj/gcm192_sse.o.tmp obj/gcm192_sse.o 00:01:53.516 mv obj/gcm192_sse.o.tmp obj/gcm192_sse.o 00:01:53.516 ld -r -z ibt -z shstk -o obj/gcm256_sse.o.tmp obj/gcm256_sse.o 00:01:53.516 mv obj/gcm256_sse.o.tmp obj/gcm256_sse.o 00:01:53.516 ld -r -z ibt -z shstk -o obj/gcm192_avx_gen2.o.tmp obj/gcm192_avx_gen2.o 00:01:53.516 mv obj/gcm192_avx_gen2.o.tmp obj/gcm192_avx_gen2.o 00:01:53.516 ld -r -z ibt -z shstk -o obj/gcm256_avx_gen2.o.tmp obj/gcm256_avx_gen2.o 00:01:53.516 mv obj/gcm256_avx_gen2.o.tmp obj/gcm256_avx_gen2.o 00:01:53.774 ld -r -z ibt -z shstk -o obj/gcm128_avx_gen2.o.tmp obj/gcm128_avx_gen2.o 00:01:53.774 mv obj/gcm128_avx_gen2.o.tmp obj/gcm128_avx_gen2.o 00:01:54.033 ld -r -z ibt -z shstk -o obj/aes256_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes256_cntr_ccm_by8_sse_no_aesni.o 00:01:54.033 mv obj/aes256_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes256_cntr_ccm_by8_sse_no_aesni.o 00:01:54.033 ld -r -z ibt -z shstk -o obj/gcm128_avx512.o.tmp obj/gcm128_avx512.o 00:01:54.291 mv obj/gcm128_avx512.o.tmp obj/gcm128_avx512.o 00:01:54.291 ld -r -z ibt -z shstk -o obj/aes_docsis_dec_vaes_avx512.o.tmp 
obj/aes_docsis_dec_vaes_avx512.o 00:01:54.291 mv obj/aes_docsis_dec_vaes_avx512.o.tmp obj/aes_docsis_dec_vaes_avx512.o 00:01:54.550 ld -r -z ibt -z shstk -o obj/cntr_ccm_vaes_avx512.o.tmp obj/cntr_ccm_vaes_avx512.o 00:01:54.550 mv obj/cntr_ccm_vaes_avx512.o.tmp obj/cntr_ccm_vaes_avx512.o 00:01:54.550 ld -r -z ibt -z shstk -o obj/chacha20_avx512.o.tmp obj/chacha20_avx512.o 00:01:54.550 mv obj/chacha20_avx512.o.tmp obj/chacha20_avx512.o 00:01:54.550 ld -r -z ibt -z shstk -o obj/gcm192_avx512.o.tmp obj/gcm192_avx512.o 00:01:54.550 mv obj/gcm192_avx512.o.tmp obj/gcm192_avx512.o 00:01:54.550 ld -r -z ibt -z shstk -o obj/gcm256_avx512.o.tmp obj/gcm256_avx512.o 00:01:54.550 mv obj/gcm256_avx512.o.tmp obj/gcm256_avx512.o 00:01:54.550 ld -r -z ibt -z shstk -o obj/zuc_avx512.o.tmp obj/zuc_avx512.o 00:01:54.550 mv obj/zuc_avx512.o.tmp obj/zuc_avx512.o 00:01:55.118 ld -r -z ibt -z shstk -o obj/aes_ecb_by4_sse_no_aesni.o.tmp obj/aes_ecb_by4_sse_no_aesni.o 00:01:55.118 mv obj/aes_ecb_by4_sse_no_aesni.o.tmp obj/aes_ecb_by4_sse_no_aesni.o 00:01:55.686 ld -r -z ibt -z shstk -o obj/gcm256_avx_gen4.o.tmp obj/gcm256_avx_gen4.o 00:01:55.686 mv obj/gcm256_avx_gen4.o.tmp obj/gcm256_avx_gen4.o 00:01:55.686 ld -r -z ibt -z shstk -o obj/gcm128_avx_gen4.o.tmp obj/gcm128_avx_gen4.o 00:01:55.686 ld -r -z ibt -z shstk -o obj/aes128_cntr_by8_sse_no_aesni.o.tmp obj/aes128_cntr_by8_sse_no_aesni.o 00:01:55.686 mv obj/gcm128_avx_gen4.o.tmp obj/gcm128_avx_gen4.o 00:01:55.686 mv obj/aes128_cntr_by8_sse_no_aesni.o.tmp obj/aes128_cntr_by8_sse_no_aesni.o 00:01:56.623 ld -r -z ibt -z shstk -o obj/aes192_cntr_by8_sse_no_aesni.o.tmp obj/aes192_cntr_by8_sse_no_aesni.o 00:01:56.623 mv obj/aes192_cntr_by8_sse_no_aesni.o.tmp obj/aes192_cntr_by8_sse_no_aesni.o 00:01:56.882 ld -r -z ibt -z shstk -o obj/des_x16_avx512.o.tmp obj/des_x16_avx512.o 00:01:56.882 mv obj/des_x16_avx512.o.tmp obj/des_x16_avx512.o 00:01:57.451 ld -r -z ibt -z shstk -o obj/aes256_cntr_by8_sse_no_aesni.o.tmp obj/aes256_cntr_by8_sse_no_aesni.o 00:01:57.451 mv obj/aes256_cntr_by8_sse_no_aesni.o.tmp obj/aes256_cntr_by8_sse_no_aesni.o 00:01:58.412 ld -r -z ibt -z shstk -o obj/chacha20_sse.o.tmp obj/chacha20_sse.o 00:01:58.412 mv obj/chacha20_sse.o.tmp obj/chacha20_sse.o 00:01:59.348 ld -r -z ibt -z shstk -o obj/zuc_avx2.o.tmp obj/zuc_avx2.o 00:01:59.348 mv obj/zuc_avx2.o.tmp obj/zuc_avx2.o 00:02:00.726 ld -r -z ibt -z shstk -o obj/gcm192_avx_gen4.o.tmp obj/gcm192_avx_gen4.o 00:02:00.726 mv obj/gcm192_avx_gen4.o.tmp obj/gcm192_avx_gen4.o 00:02:01.663 ld -r -z ibt -z shstk -o obj/gcm256_vaes_avx512.o.tmp obj/gcm256_vaes_avx512.o 00:02:01.663 mv obj/gcm256_vaes_avx512.o.tmp obj/gcm256_vaes_avx512.o 00:02:02.600 ld -r -z ibt -z shstk -o obj/gcm128_vaes_avx512.o.tmp obj/gcm128_vaes_avx512.o 00:02:02.600 mv obj/gcm128_vaes_avx512.o.tmp obj/gcm128_vaes_avx512.o 00:02:03.169 ld -r -z ibt -z shstk -o obj/gcm192_vaes_avx512.o.tmp obj/gcm192_vaes_avx512.o 00:02:03.169 mv obj/gcm192_vaes_avx512.o.tmp obj/gcm192_vaes_avx512.o 00:02:13.182 ld -r -z ibt -z shstk -o obj/cntr_vaes_avx512.o.tmp obj/cntr_vaes_avx512.o 00:02:13.182 mv obj/cntr_vaes_avx512.o.tmp obj/cntr_vaes_avx512.o 00:03:20.914 ld -r -z ibt -z shstk -o obj/gcm128_sse_no_aesni.o.tmp obj/gcm128_sse_no_aesni.o 00:03:20.914 mv obj/gcm128_sse_no_aesni.o.tmp obj/gcm128_sse_no_aesni.o 00:03:23.449 ld -r -z ibt -z shstk -o obj/gcm192_sse_no_aesni.o.tmp obj/gcm192_sse_no_aesni.o 00:03:23.449 mv obj/gcm192_sse_no_aesni.o.tmp obj/gcm192_sse_no_aesni.o 00:03:38.341 ld -r -z ibt -z shstk -o obj/gcm256_sse_no_aesni.o.tmp 
obj/gcm256_sse_no_aesni.o 00:03:38.342 mv obj/gcm256_sse_no_aesni.o.tmp obj/gcm256_sse_no_aesni.o 00:03:38.342 gcc -shared -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -Wl,-soname,libIPSec_MB.so.1 -o libIPSec_MB.so.1.0.0 obj/aes_keyexp_128.o obj/aes_keyexp_192.o obj/aes_keyexp_256.o obj/aes_cmac_subkey_gen.o obj/save_xmms.o obj/clear_regs_mem_fns.o obj/const.o obj/aes128_ecbenc_x3.o obj/zuc_common.o obj/wireless_common.o obj/constant_lookup.o obj/crc32_refl_const.o obj/crc32_const.o obj/poly1305.o obj/chacha20_poly1305.o obj/aes128_cbc_dec_by4_sse_no_aesni.o obj/aes192_cbc_dec_by4_sse_no_aesni.o obj/aes256_cbc_dec_by4_sse_no_aesni.o obj/aes_cbc_enc_128_x4_no_aesni.o obj/aes_cbc_enc_192_x4_no_aesni.o obj/aes_cbc_enc_256_x4_no_aesni.o obj/aes128_cntr_by8_sse_no_aesni.o obj/aes192_cntr_by8_sse_no_aesni.o obj/aes256_cntr_by8_sse_no_aesni.o obj/aes_ecb_by4_sse_no_aesni.o obj/aes128_cntr_ccm_by8_sse_no_aesni.o obj/aes256_cntr_ccm_by8_sse_no_aesni.o obj/pon_sse_no_aesni.o obj/zuc_sse_no_aesni.o obj/aes_cfb_sse_no_aesni.o obj/aes128_cbc_mac_x4_no_aesni.o obj/aes256_cbc_mac_x4_no_aesni.o obj/aes_xcbc_mac_128_x4_no_aesni.o obj/mb_mgr_aes_flush_sse_no_aesni.o obj/mb_mgr_aes_submit_sse_no_aesni.o obj/mb_mgr_aes192_flush_sse_no_aesni.o obj/mb_mgr_aes192_submit_sse_no_aesni.o obj/mb_mgr_aes256_flush_sse_no_aesni.o obj/mb_mgr_aes256_submit_sse_no_aesni.o obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o obj/ethernet_fcs_sse_no_aesni.o obj/crc16_x25_sse_no_aesni.o obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o obj/crc32_refl_by8_sse_no_aesni.o obj/crc32_by8_sse_no_aesni.o obj/crc32_sctp_sse_no_aesni.o obj/crc32_lte_sse_no_aesni.o obj/crc32_fp_sse_no_aesni.o obj/crc32_iuup_sse_no_aesni.o obj/crc32_wimax_sse_no_aesni.o obj/gcm128_sse_no_aesni.o obj/gcm192_sse_no_aesni.o obj/gcm256_sse_no_aesni.o obj/aes128_cbc_dec_by4_sse.o obj/aes128_cbc_dec_by8_sse.o obj/aes192_cbc_dec_by4_sse.o obj/aes192_cbc_dec_by8_sse.o obj/aes256_cbc_dec_by4_sse.o obj/aes256_cbc_dec_by8_sse.o obj/aes_cbc_enc_128_x4.o obj/aes_cbc_enc_192_x4.o obj/aes_cbc_enc_256_x4.o obj/aes_cbc_enc_128_x8_sse.o obj/aes_cbc_enc_192_x8_sse.o obj/aes_cbc_enc_256_x8_sse.o obj/pon_sse.o obj/aes128_cntr_by8_sse.o obj/aes192_cntr_by8_sse.o obj/aes256_cntr_by8_sse.o obj/aes_ecb_by4_sse.o obj/aes128_cntr_ccm_by8_sse.o obj/aes256_cntr_ccm_by8_sse.o obj/aes_cfb_sse.o obj/aes128_cbc_mac_x4.o obj/aes256_cbc_mac_x4.o obj/aes128_cbc_mac_x8_sse.o obj/aes256_cbc_mac_x8_sse.o obj/aes_xcbc_mac_128_x4.o obj/md5_x4x2_sse.o obj/sha1_mult_sse.o obj/sha1_one_block_sse.o obj/sha224_one_block_sse.o obj/sha256_one_block_sse.o obj/sha384_one_block_sse.o obj/sha512_one_block_sse.o obj/sha512_x2_sse.o obj/sha_256_mult_sse.o obj/sha1_ni_x2_sse.o obj/sha256_ni_x2_sse.o obj/zuc_sse.o obj/zuc_sse_gfni.o obj/mb_mgr_aes_flush_sse.o obj/mb_mgr_aes_submit_sse.o obj/mb_mgr_aes192_flush_sse.o obj/mb_mgr_aes192_submit_sse.o obj/mb_mgr_aes256_flush_sse.o obj/mb_mgr_aes256_submit_sse.o obj/mb_mgr_aes_flush_sse_x8.o 
obj/mb_mgr_aes_submit_sse_x8.o obj/mb_mgr_aes192_flush_sse_x8.o obj/mb_mgr_aes192_submit_sse_x8.o obj/mb_mgr_aes256_flush_sse_x8.o obj/mb_mgr_aes256_submit_sse_x8.o obj/mb_mgr_aes_cmac_submit_flush_sse.o obj/mb_mgr_aes256_cmac_submit_flush_sse.o obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o obj/mb_mgr_aes_xcbc_flush_sse.o obj/mb_mgr_aes_xcbc_submit_sse.o obj/mb_mgr_hmac_md5_flush_sse.o obj/mb_mgr_hmac_md5_submit_sse.o obj/mb_mgr_hmac_flush_sse.o obj/mb_mgr_hmac_submit_sse.o obj/mb_mgr_hmac_sha_224_flush_sse.o obj/mb_mgr_hmac_sha_224_submit_sse.o obj/mb_mgr_hmac_sha_256_flush_sse.o obj/mb_mgr_hmac_sha_256_submit_sse.o obj/mb_mgr_hmac_sha_384_flush_sse.o obj/mb_mgr_hmac_sha_384_submit_sse.o obj/mb_mgr_hmac_sha_512_flush_sse.o obj/mb_mgr_hmac_sha_512_submit_sse.o obj/mb_mgr_hmac_flush_ni_sse.o obj/mb_mgr_hmac_submit_ni_sse.o obj/mb_mgr_hmac_sha_224_flush_ni_sse.o obj/mb_mgr_hmac_sha_224_submit_ni_sse.o obj/mb_mgr_hmac_sha_256_flush_ni_sse.o obj/mb_mgr_hmac_sha_256_submit_ni_sse.o obj/mb_mgr_zuc_submit_flush_sse.o obj/mb_mgr_zuc_submit_flush_gfni_sse.o obj/ethernet_fcs_sse.o obj/crc16_x25_sse.o obj/crc32_sctp_sse.o obj/aes_cbcs_1_9_enc_128_x4.o obj/aes128_cbcs_1_9_dec_by4_sse.o obj/crc32_refl_by8_sse.o obj/crc32_by8_sse.o obj/crc32_lte_sse.o obj/crc32_fp_sse.o obj/crc32_iuup_sse.o obj/crc32_wimax_sse.o obj/chacha20_sse.o obj/memcpy_sse.o obj/gcm128_sse.o obj/gcm192_sse.o obj/gcm256_sse.o obj/aes_cbc_enc_128_x8.o obj/aes_cbc_enc_192_x8.o obj/aes_cbc_enc_256_x8.o obj/aes128_cbc_dec_by8_avx.o obj/aes192_cbc_dec_by8_avx.o obj/aes256_cbc_dec_by8_avx.o obj/pon_avx.o obj/aes128_cntr_by8_avx.o obj/aes192_cntr_by8_avx.o obj/aes256_cntr_by8_avx.o obj/aes128_cntr_ccm_by8_avx.o obj/aes256_cntr_ccm_by8_avx.o obj/aes_ecb_by4_avx.o obj/aes_cfb_avx.o obj/aes128_cbc_mac_x8.o obj/aes256_cbc_mac_x8.o obj/aes_xcbc_mac_128_x8.o obj/md5_x4x2_avx.o obj/sha1_mult_avx.o obj/sha1_one_block_avx.o obj/sha224_one_block_avx.o obj/sha256_one_block_avx.o obj/sha_256_mult_avx.o obj/sha384_one_block_avx.o obj/sha512_one_block_avx.o obj/sha512_x2_avx.o obj/zuc_avx.o obj/mb_mgr_aes_flush_avx.o obj/mb_mgr_aes_submit_avx.o obj/mb_mgr_aes192_flush_avx.o obj/mb_mgr_aes192_submit_avx.o obj/mb_mgr_aes256_flush_avx.o obj/mb_mgr_aes256_submit_avx.o obj/mb_mgr_aes_cmac_submit_flush_avx.o obj/mb_mgr_aes256_cmac_submit_flush_avx.o obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o obj/mb_mgr_aes_xcbc_flush_avx.o obj/mb_mgr_aes_xcbc_submit_avx.o obj/mb_mgr_hmac_md5_flush_avx.o obj/mb_mgr_hmac_md5_submit_avx.o obj/mb_mgr_hmac_flush_avx.o obj/mb_mgr_hmac_submit_avx.o obj/mb_mgr_hmac_sha_224_flush_avx.o obj/mb_mgr_hmac_sha_224_submit_avx.o obj/mb_mgr_hmac_sha_256_flush_avx.o obj/mb_mgr_hmac_sha_256_submit_avx.o obj/mb_mgr_hmac_sha_384_flush_avx.o obj/mb_mgr_hmac_sha_384_submit_avx.o obj/mb_mgr_hmac_sha_512_flush_avx.o obj/mb_mgr_hmac_sha_512_submit_avx.o obj/mb_mgr_zuc_submit_flush_avx.o obj/ethernet_fcs_avx.o obj/crc16_x25_avx.o obj/aes_cbcs_1_9_enc_128_x8.o obj/aes128_cbcs_1_9_dec_by8_avx.o obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o obj/crc32_refl_by8_avx.o obj/crc32_by8_avx.o obj/crc32_sctp_avx.o obj/crc32_lte_avx.o obj/crc32_fp_avx.o obj/crc32_iuup_avx.o obj/crc32_wimax_avx.o obj/chacha20_avx.o obj/memcpy_avx.o obj/gcm128_avx_gen2.o 
obj/gcm192_avx_gen2.o obj/gcm256_avx_gen2.o obj/md5_x8x2_avx2.o obj/sha1_x8_avx2.o obj/sha256_oct_avx2.o obj/sha512_x4_avx2.o obj/zuc_avx2.o obj/mb_mgr_hmac_md5_flush_avx2.o obj/mb_mgr_hmac_md5_submit_avx2.o obj/mb_mgr_hmac_flush_avx2.o obj/mb_mgr_hmac_submit_avx2.o obj/mb_mgr_hmac_sha_224_flush_avx2.o obj/mb_mgr_hmac_sha_224_submit_avx2.o obj/mb_mgr_hmac_sha_256_flush_avx2.o obj/mb_mgr_hmac_sha_256_submit_avx2.o obj/mb_mgr_hmac_sha_384_flush_avx2.o obj/mb_mgr_hmac_sha_384_submit_avx2.o obj/mb_mgr_hmac_sha_512_flush_avx2.o obj/mb_mgr_hmac_sha_512_submit_avx2.o obj/mb_mgr_zuc_submit_flush_avx2.o obj/chacha20_avx2.o obj/gcm128_avx_gen4.o obj/gcm192_avx_gen4.o obj/gcm256_avx_gen4.o obj/sha1_x16_avx512.o obj/sha256_x16_avx512.o obj/sha512_x8_avx512.o obj/des_x16_avx512.o obj/cntr_vaes_avx512.o obj/cntr_ccm_vaes_avx512.o obj/aes_cbc_dec_vaes_avx512.o obj/aes_cbc_enc_vaes_avx512.o obj/aes_cbcs_enc_vaes_avx512.o obj/aes_cbcs_dec_vaes_avx512.o obj/aes_docsis_dec_avx512.o obj/aes_docsis_enc_avx512.o obj/aes_docsis_dec_vaes_avx512.o obj/aes_docsis_enc_vaes_avx512.o obj/zuc_avx512.o obj/mb_mgr_aes_submit_avx512.o obj/mb_mgr_aes_flush_avx512.o obj/mb_mgr_aes192_submit_avx512.o obj/mb_mgr_aes192_flush_avx512.o obj/mb_mgr_aes256_submit_avx512.o obj/mb_mgr_aes256_flush_avx512.o obj/mb_mgr_hmac_flush_avx512.o obj/mb_mgr_hmac_submit_avx512.o obj/mb_mgr_hmac_sha_224_flush_avx512.o obj/mb_mgr_hmac_sha_224_submit_avx512.o obj/mb_mgr_hmac_sha_256_flush_avx512.o obj/mb_mgr_hmac_sha_256_submit_avx512.o obj/mb_mgr_hmac_sha_384_flush_avx512.o obj/mb_mgr_hmac_sha_384_submit_avx512.o obj/mb_mgr_hmac_sha_512_flush_avx512.o obj/mb_mgr_hmac_sha_512_submit_avx512.o obj/mb_mgr_des_avx512.o obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o obj/mb_mgr_zuc_submit_flush_avx512.o obj/mb_mgr_zuc_submit_flush_gfni_avx512.o obj/chacha20_avx512.o obj/poly_avx512.o obj/poly_fma_avx512.o obj/ethernet_fcs_avx512.o obj/crc16_x25_avx512.o obj/crc32_refl_by16_vclmul_avx512.o obj/crc32_by16_vclmul_avx512.o obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o obj/crc32_sctp_avx512.o obj/crc32_lte_avx512.o obj/crc32_fp_avx512.o obj/crc32_iuup_avx512.o obj/crc32_wimax_avx512.o obj/gcm128_vaes_avx512.o obj/gcm192_vaes_avx512.o obj/gcm256_vaes_avx512.o obj/gcm128_avx512.o obj/gcm192_avx512.o obj/gcm256_avx512.o obj/mb_mgr_avx.o obj/mb_mgr_avx2.o obj/mb_mgr_avx512.o obj/mb_mgr_sse.o obj/mb_mgr_sse_no_aesni.o obj/alloc.o obj/aes_xcbc_expand_key.o obj/md5_one_block.o obj/sha_sse.o obj/sha_avx.o obj/des_key.o obj/des_basic.o obj/version.o obj/cpu_feature.o obj/aesni_emu.o obj/kasumi_avx.o obj/kasumi_iv.o obj/kasumi_sse.o obj/zuc_sse_top.o obj/zuc_sse_no_aesni_top.o obj/zuc_avx_top.o obj/zuc_avx2_top.o obj/zuc_avx512_top.o obj/zuc_iv.o obj/snow3g_sse.o obj/snow3g_sse_no_aesni.o obj/snow3g_avx.o obj/snow3g_avx2.o obj/snow3g_tables.o obj/snow3g_iv.o obj/snow_v_sse.o obj/snow_v_sse_noaesni.o obj/mb_mgr_auto.o obj/error.o obj/gcm.o -lc 00:03:38.342 ln -f -s libIPSec_MB.so.1.0.0 ./libIPSec_MB.so.1 00:03:38.342 ln -f -s libIPSec_MB.so.1 ./libIPSec_MB.so 00:03:38.342 make[1]: Leaving directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:03:38.342 make -C test 00:03:38.342 make[1]: Entering directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/test' 00:03:38.342 gcc -MMD -DLINUX 
-D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o main.o main.c 00:03:38.342 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o gcm_test.o gcm_test.c 00:03:38.342 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ctr_test.o ctr_test.c 00:03:38.342 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o customop_test.o customop_test.c 00:03:38.342 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o des_test.o des_test.c 00:03:38.342 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ccm_test.o ccm_test.c 00:03:38.342 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o cmac_test.o cmac_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow 
-fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o utils.o utils.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hmac_sha1_test.o hmac_sha1_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hmac_sha256_sha512_test.o hmac_sha256_sha512_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hmac_md5_test.o hmac_md5_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o aes_test.o aes_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o sha_test.o sha_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o chained_test.o chained_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o api_test.o api_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef 
-Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o pon_test.o pon_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ecb_test.o ecb_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o zuc_test.o zuc_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o kasumi_test.o kasumi_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o snow3g_test.o snow3g_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o direct_api_test.o direct_api_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o clear_mem_test.o clear_mem_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include 
-I../lib -O3 -c -o hec_test.o hec_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o xcbc_test.o xcbc_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o aes_cbcs_test.o aes_cbcs_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o crc_test.o crc_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o chacha_test.o chacha_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o poly1305_test.o poly1305_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o chacha20_poly1305_test.o chacha20_poly1305_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o null_test.o null_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code 
-Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o snow_v_test.o snow_v_test.c 00:03:38.343 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ipsec_xvalid.o ipsec_xvalid.c 00:03:38.343 nasm -MD misc.d -MT misc.o -o misc.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ misc.asm 00:03:38.343 ld -r -z ibt -z shstk -o misc.o.tmp misc.o 00:03:38.343 mv misc.o.tmp misc.o 00:03:38.343 utils.c:166:32: warning: argument 2 of type ‘uint8_t[6]’ {aka ‘unsigned char[6]’} with mismatched bound [-Warray-parameter=] 00:03:38.343 166 | uint8_t arch_support[IMB_ARCH_NUM], 00:03:38.343 | ~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.344 In file included from utils.c:35: 00:03:38.344 utils.h:39:54: note: previously declared as ‘uint8_t *’ {aka ‘unsigned char *’} 00:03:38.344 39 | int update_flags_and_archs(const char *arg, uint8_t *arch_support, 00:03:38.344 | ~~~~~~~~~^~~~~~~~~~~~ 00:03:38.344 utils.c:207:21: warning: argument 1 of type ‘uint8_t[6]’ {aka ‘unsigned char[6]’} with mismatched bound [-Warray-parameter=] 00:03:38.344 207 | detect_arch(uint8_t arch_support[IMB_ARCH_NUM]) 00:03:38.344 | ~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.344 utils.h:41:26: note: previously declared as ‘uint8_t *’ {aka ‘unsigned char *’} 00:03:38.344 41 | int detect_arch(uint8_t *arch_support); 00:03:38.344 | ~~~~~~~~~^~~~~~~~~~~~ 00:03:38.344 In file included from null_test.c:33: 00:03:38.344 null_test.c: In function ‘test_null_hash’: 00:03:38.344 ../lib/intel-ipsec-mb.h:1235:10: warning: ‘cipher_key’ may be used uninitialized [-Wmaybe-uninitialized] 00:03:38.344 1235 | ((_mgr)->keyexp_128((_raw), (_enc), (_dec))) 00:03:38.344 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.344 null_test.c:62:9: note: in expansion of macro ‘IMB_AES_KEYEXP_128’ 00:03:38.344 62 | IMB_AES_KEYEXP_128(mb_mgr, cipher_key, expkey, dust); 00:03:38.344 | ^~~~~~~~~~~~~~~~~~ 00:03:38.344 ../lib/intel-ipsec-mb.h:1235:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, void *, void *)’ 00:03:38.344 1235 | ((_mgr)->keyexp_128((_raw), (_enc), (_dec))) 00:03:38.344 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.344 null_test.c:62:9: note: in expansion of macro ‘IMB_AES_KEYEXP_128’ 00:03:38.344 62 | IMB_AES_KEYEXP_128(mb_mgr, cipher_key, expkey, dust); 00:03:38.344 | ^~~~~~~~~~~~~~~~~~ 00:03:38.344 null_test.c:47:33: note: ‘cipher_key’ declared here 00:03:38.344 47 | DECLARE_ALIGNED(uint8_t cipher_key[16], 16); 00:03:38.344 | ^~~~~~~~~~ 00:03:38.344 ../lib/intel-ipsec-mb.h:51:9: note: in definition of macro ‘DECLARE_ALIGNED’ 00:03:38.344 51 | decl __attribute__((aligned(alignval))) 00:03:38.344 | ^~~~ 00:03:38.344 gcc -fPIE -z noexecstack -z relro -z now -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -L../lib main.o gcm_test.o ctr_test.o customop_test.o des_test.o ccm_test.o cmac_test.o utils.o hmac_sha1_test.o hmac_sha256_sha512_test.o hmac_md5_test.o aes_test.o sha_test.o chained_test.o api_test.o pon_test.o ecb_test.o zuc_test.o kasumi_test.o 
snow3g_test.o direct_api_test.o clear_mem_test.o hec_test.o xcbc_test.o aes_cbcs_test.o crc_test.o chacha_test.o poly1305_test.o chacha20_poly1305_test.o null_test.o snow_v_test.o -lIPSec_MB -o ipsec_MB_testapp 00:03:38.344 gcc -fPIE -z noexecstack -z relro -z now -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -L../lib ipsec_xvalid.o utils.o misc.o -lIPSec_MB -o ipsec_xvalid_test 00:03:38.344 make[1]: Leaving directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/test' 00:03:38.344 make -C perf 00:03:38.344 make[1]: Entering directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/perf' 00:03:38.344 gcc -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -pthread -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -fPIE -fstack-protector -D_FORTIFY_SOURCE=2 -c -o ipsec_perf.o ipsec_perf.c 00:03:38.344 gcc -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -pthread -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -fPIE -fstack-protector -D_FORTIFY_SOURCE=2 -c -o msr.o msr.c 00:03:38.344 nasm -MD misc.d -MT misc.o -o misc.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ misc.asm 00:03:38.344 ld -r -z ibt -z shstk -o misc.o.tmp misc.o 00:03:38.344 mv misc.o.tmp misc.o 00:03:38.913 In file included from ipsec_perf.c:59: 00:03:38.913 ipsec_perf.c: In function ‘do_test_gcm’: 00:03:38.913 ../lib/intel-ipsec-mb.h:1382:10: warning: ‘key’ may be used uninitialized [-Wmaybe-uninitialized] 00:03:38.913 1382 | ((_mgr)->gcm128_pre((_key_in), (_key_exp))) 00:03:38.913 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.913 ipsec_perf.c:1937:17: note: in expansion of macro ‘IMB_AES128_GCM_PRE’ 00:03:38.913 1937 | IMB_AES128_GCM_PRE(mb_mgr, key, &gdata_key); 00:03:38.913 | ^~~~~~~~~~~~~~~~~~ 00:03:38.913 ../lib/intel-ipsec-mb.h:1382:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, struct gcm_key_data *)’ 00:03:38.913 1382 | ((_mgr)->gcm128_pre((_key_in), (_key_exp))) 00:03:38.913 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.913 ipsec_perf.c:1937:17: note: in expansion of macro ‘IMB_AES128_GCM_PRE’ 00:03:38.913 1937 | IMB_AES128_GCM_PRE(mb_mgr, key, &gdata_key); 00:03:38.913 | ^~~~~~~~~~~~~~~~~~ 00:03:38.913 ../lib/intel-ipsec-mb.h:1384:10: warning: ‘key’ may be used uninitialized [-Wmaybe-uninitialized] 00:03:38.913 1384 | ((_mgr)->gcm192_pre((_key_in), (_key_exp))) 00:03:38.913 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.913 ipsec_perf.c:1940:17: note: in expansion of macro ‘IMB_AES192_GCM_PRE’ 00:03:38.913 1940 | IMB_AES192_GCM_PRE(mb_mgr, key, &gdata_key); 00:03:38.913 | ^~~~~~~~~~~~~~~~~~ 00:03:38.913 ../lib/intel-ipsec-mb.h:1384:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, struct gcm_key_data *)’ 00:03:38.913 1384 | ((_mgr)->gcm192_pre((_key_in), (_key_exp))) 00:03:38.913 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.913 ipsec_perf.c:1940:17: note: in 
expansion of macro ‘IMB_AES192_GCM_PRE’ 00:03:38.913 1940 | IMB_AES192_GCM_PRE(mb_mgr, key, &gdata_key); 00:03:38.913 | ^~~~~~~~~~~~~~~~~~ 00:03:38.913 ../lib/intel-ipsec-mb.h:1386:10: warning: ‘key’ may be used uninitialized [-Wmaybe-uninitialized] 00:03:38.913 1386 | ((_mgr)->gcm256_pre((_key_in), (_key_exp))) 00:03:38.913 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.913 ipsec_perf.c:1944:17: note: in expansion of macro ‘IMB_AES256_GCM_PRE’ 00:03:38.913 1944 | IMB_AES256_GCM_PRE(mb_mgr, key, &gdata_key); 00:03:38.913 | ^~~~~~~~~~~~~~~~~~ 00:03:38.913 ../lib/intel-ipsec-mb.h:1386:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, struct gcm_key_data *)’ 00:03:38.913 1386 | ((_mgr)->gcm256_pre((_key_in), (_key_exp))) 00:03:38.913 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:03:38.913 ipsec_perf.c:1944:17: note: in expansion of macro ‘IMB_AES256_GCM_PRE’ 00:03:38.913 1944 | IMB_AES256_GCM_PRE(mb_mgr, key, &gdata_key); 00:03:38.913 | ^~~~~~~~~~~~~~~~~~ 00:03:39.173 gcc -fPIE -z noexecstack -z relro -z now -pthread -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -L../lib ipsec_perf.o msr.o misc.o -lIPSec_MB -o ipsec_perf 00:03:39.173 make[1]: Leaving directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/perf' 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@119 -- $ DPDK_DRIVERS+=("crypto") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@120 -- $ DPDK_DRIVERS+=("$intel_ipsec_mb_drv") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@121 -- $ DPDK_DRIVERS+=("crypto/qat") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@122 -- $ DPDK_DRIVERS+=("compress/qat") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@123 -- $ DPDK_DRIVERS+=("common/qat") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@125 -- $ ge 23.11.0 21.11.0 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '>=' 21.11.0 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=>=' 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 23 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@350 -- $ local d=23 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@352 -- $ echo 23 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=23 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:03:39.173 06:19:52 build_native_dpdk -- scripts/common.sh@364 -- $ return 0 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@128 -- $ DPDK_DRIVERS+=("bus/auxiliary") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@129 -- $ DPDK_DRIVERS+=("common/mlx5") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@130 -- $ DPDK_DRIVERS+=("common/mlx5/linux") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@131 -- $ DPDK_DRIVERS+=("crypto/mlx5") 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@132 -- $ mlx5_libs_added=y 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@134 -- $ dpdk_cflags+=' -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@135 -- $ dpdk_ldflags+=' -L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@136 -- $ export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@136 -- $ LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 1 -eq 1 ]] 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@140 -- $ isal_dir=/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l 00:03:39.173 06:19:52 build_native_dpdk -- common/autobuild_common.sh@141 -- $ git clone --branch v2.29.0 --depth 1 https://github.com/intel/isa-l.git /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l 00:03:39.173 Cloning into '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l'... 00:03:40.111 Note: switching to '806b55ee578efd8158962b90121a4568eb1ecb66'. 00:03:40.111 00:03:40.111 You are in 'detached HEAD' state. You can look around, make experimental 00:03:40.111 changes and commit them, and you can discard any commits you make in this 00:03:40.111 state without impacting any branches by switching back to a branch. 
00:03:40.111 00:03:40.111 If you want to create a new branch to retain commits you create, you may 00:03:40.111 do so (now or later) by using -c with the switch command. Example: 00:03:40.111 00:03:40.111 git switch -c 00:03:40.111 00:03:40.111 Or undo this operation with: 00:03:40.111 00:03:40.111 git switch - 00:03:40.111 00:03:40.111 Turn off this advice by setting config variable advice.detachedHead to false 00:03:40.111 00:03:40.111 06:19:53 build_native_dpdk -- common/autobuild_common.sh@143 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l 00:03:40.111 06:19:53 build_native_dpdk -- common/autobuild_common.sh@144 -- $ ./autogen.sh 00:03:43.402 libtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR, 'build-aux'. 00:03:43.402 libtoolize: linking file 'build-aux/ltmain.sh' 00:03:43.661 libtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to configure.ac, 00:03:43.661 libtoolize: and rerunning libtoolize and aclocal. 00:03:43.661 libtoolize: Consider adding '-I m4' to ACLOCAL_AMFLAGS in Makefile.am. 00:03:44.595 configure.ac:53: warning: The macro `AC_PROG_CC_STDC' is obsolete. 00:03:44.595 configure.ac:53: You should run autoupdate. 00:03:44.595 ./lib/autoconf/c.m4:1666: AC_PROG_CC_STDC is expanded from... 00:03:44.595 configure.ac:53: the top level 00:03:45.531 configure.ac:23: installing 'build-aux/compile' 00:03:45.531 configure.ac:25: installing 'build-aux/config.guess' 00:03:45.531 configure.ac:25: installing 'build-aux/config.sub' 00:03:45.531 configure.ac:12: installing 'build-aux/install-sh' 00:03:45.531 configure.ac:12: installing 'build-aux/missing' 00:03:45.531 Makefile.am: installing 'build-aux/depcomp' 00:03:45.531 parallel-tests: installing 'build-aux/test-driver' 00:03:45.788 00:03:45.788 ---------------------------------------------------------------- 00:03:45.788 Initialized build system. For a common configuration please run: 00:03:45.788 ---------------------------------------------------------------- 00:03:45.788 00:03:45.788 ./configure --prefix=/usr --libdir=/usr/lib64 00:03:45.788 00:03:45.788 06:19:59 build_native_dpdk -- common/autobuild_common.sh@145 -- $ ./configure 'CFLAGS=-fPIC -g -O2' --enable-shared=yes --prefix=/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build 00:03:46.047 checking for a BSD-compatible install... /usr/bin/install -c 00:03:46.047 checking whether build environment is sane... yes 00:03:46.047 checking for a race-free mkdir -p... /usr/bin/mkdir -p 00:03:46.047 checking for gawk... gawk 00:03:46.047 checking whether make sets $(MAKE)... yes 00:03:46.047 checking whether make supports nested variables... yes 00:03:46.047 checking how to create a pax tar archive... gnutar 00:03:46.047 checking whether make supports the include directive... yes (GNU style) 00:03:46.047 checking for gcc... gcc 00:03:46.306 checking whether the C compiler works... yes 00:03:46.306 checking for C compiler default output file name... a.out 00:03:46.306 checking for suffix of executables... 00:03:46.565 checking whether we are cross compiling... no 00:03:46.565 checking for suffix of object files... o 00:03:46.565 checking whether the compiler supports GNU C... yes 00:03:46.565 checking whether gcc accepts -g... yes 00:03:46.824 checking for gcc option to enable C11 features... none needed 00:03:46.824 checking whether gcc understands -c and -o together... yes 00:03:46.824 checking dependency style of gcc... gcc3 00:03:47.083 checking dependency style of gcc... gcc3 00:03:47.083 checking build system type... 
x86_64-pc-linux-gnu 00:03:47.083 checking host system type... x86_64-pc-linux-gnu 00:03:47.083 checking for stdio.h... yes 00:03:47.083 checking for stdlib.h... yes 00:03:47.363 checking for string.h... yes 00:03:47.363 checking for inttypes.h... yes 00:03:47.363 checking for stdint.h... yes 00:03:47.363 checking for strings.h... yes 00:03:47.622 checking for sys/stat.h... yes 00:03:47.622 checking for sys/types.h... yes 00:03:47.622 checking for unistd.h... yes 00:03:47.622 checking for wchar.h... yes 00:03:47.880 checking for minix/config.h... no 00:03:47.880 checking whether it is safe to define __EXTENSIONS__... yes 00:03:47.880 checking whether _XOPEN_SOURCE should be defined... no 00:03:47.880 checking whether make supports nested variables... (cached) yes 00:03:47.880 checking how to print strings... printf 00:03:47.880 checking for a sed that does not truncate output... /usr/bin/sed 00:03:47.880 checking for grep that handles long lines and -e... /usr/bin/grep 00:03:47.880 checking for egrep... /usr/bin/grep -E 00:03:47.880 checking for fgrep... /usr/bin/grep -F 00:03:47.880 checking for ld used by gcc... /usr/bin/ld 00:03:47.880 checking if the linker (/usr/bin/ld) is GNU ld... yes 00:03:47.880 checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B 00:03:48.138 checking the name lister (/usr/bin/nm -B) interface... BSD nm 00:03:48.138 checking whether ln -s works... yes 00:03:48.138 checking the maximum length of command line arguments... 1572864 00:03:48.138 checking how to convert x86_64-pc-linux-gnu file names to x86_64-pc-linux-gnu format... func_convert_file_noop 00:03:48.138 checking how to convert x86_64-pc-linux-gnu file names to toolchain format... func_convert_file_noop 00:03:48.138 checking for /usr/bin/ld option to reload object files... -r 00:03:48.138 checking for file... file 00:03:48.138 checking for objdump... objdump 00:03:48.139 checking how to recognize dependent libraries... pass_all 00:03:48.139 checking for dlltool... no 00:03:48.139 checking how to associate runtime and link libraries... printf %s\n 00:03:48.139 checking for ar... ar 00:03:48.139 checking for archiver @FILE support... @ 00:03:48.139 checking for strip... strip 00:03:48.139 checking for ranlib... ranlib 00:03:48.397 checking command to parse /usr/bin/nm -B output from gcc object... ok 00:03:48.397 checking for sysroot... no 00:03:48.397 checking for a working dd... /usr/bin/dd 00:03:48.397 checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1 00:03:48.397 checking for mt... no 00:03:48.397 checking if : is a manifest tool... no 00:03:48.654 checking for dlfcn.h... yes 00:03:48.654 checking for objdir... .libs 00:03:48.912 checking if gcc supports -fno-rtti -fno-exceptions... no 00:03:48.912 checking for gcc option to produce PIC... -fPIC -DPIC 00:03:48.912 checking if gcc PIC flag -fPIC -DPIC works... yes 00:03:48.912 checking if gcc static flag -static works... yes 00:03:49.171 checking if gcc supports -c -o file.o... yes 00:03:49.171 checking if gcc supports -c -o file.o... (cached) yes 00:03:49.171 checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes 00:03:49.171 checking whether -lc should be explicitly linked in... no 00:03:49.429 checking dynamic linker characteristics... GNU/Linux ld.so 00:03:49.429 checking how to hardcode library paths into programs... immediate 00:03:49.429 checking whether stripping libraries is possible... yes 00:03:49.429 checking if libtool supports shared libraries... 
yes 00:03:49.429 checking whether to build shared libraries... yes 00:03:49.429 checking whether to build static libraries... yes 00:03:49.429 checking for a sed that does not truncate output... (cached) /usr/bin/sed 00:03:49.429 checking for yasm... yes 00:03:49.429 checking for modern yasm... yes 00:03:49.429 checking for optional yasm AVX512 support... no 00:03:49.429 checking for nasm... yes 00:03:49.429 checking for modern nasm... yes 00:03:49.429 checking for optional nasm AVX512 support... yes 00:03:49.429 checking for additional nasm AVX512 support... yes 00:03:49.429 Using nasm args target "linux" "-f elf64" 00:03:49.429 checking for limits.h... yes 00:03:49.429 checking for stdint.h... (cached) yes 00:03:49.429 checking for stdlib.h... (cached) yes 00:03:49.429 checking for string.h... (cached) yes 00:03:49.746 checking for inline... inline 00:03:49.746 checking for size_t... yes 00:03:49.746 checking for uint16_t... yes 00:03:49.746 checking for uint32_t... yes 00:03:50.005 checking for uint64_t... yes 00:03:50.005 checking for uint8_t... yes 00:03:50.005 checking for GNU libc compatible malloc... yes 00:03:50.264 checking for memmove... yes 00:03:50.264 checking for memset... yes 00:03:50.522 checking for getopt... yes 00:03:50.522 checking that generated files are newer than configure... done 00:03:50.522 configure: creating ./config.status 00:03:51.899 config.status: creating Makefile 00:03:51.899 config.status: creating libisal.pc 00:03:51.899 config.status: executing depfiles commands 00:03:53.804 config.status: executing libtool commands 00:03:53.804 00:03:53.804 isa-l 2.29.0 00:03:53.804 ===== 00:03:53.804 00:03:53.804 prefix: /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build 00:03:53.804 sysconfdir: ${prefix}/etc 00:03:53.804 libdir: ${exec_prefix}/lib 00:03:53.804 includedir: ${prefix}/include 00:03:53.804 00:03:53.804 compiler: gcc 00:03:53.804 cflags: -fPIC -g -O2 00:03:53.804 ldflags: 00:03:53.804 00:03:53.804 debug: no 00:03:53.804 00:03:53.804 06:20:07 build_native_dpdk -- common/autobuild_common.sh@146 -- $ ln -s /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/include /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/isa-l 00:03:53.804 06:20:07 build_native_dpdk -- common/autobuild_common.sh@147 -- $ make -j112 all 00:03:53.804 Building isa-l.h 00:03:53.804 make --no-print-directory all-am 00:03:53.804 CC erasure_code/ec_highlevel_func.lo 00:03:53.804 MKTMP erasure_code/gf_vect_mul_sse.s 00:03:53.804 MKTMP erasure_code/gf_vect_mul_avx.s 00:03:53.804 MKTMP erasure_code/gf_vect_dot_prod_sse.s 00:03:53.804 MKTMP erasure_code/gf_vect_dot_prod_avx.s 00:03:53.804 MKTMP erasure_code/gf_2vect_dot_prod_sse.s 00:03:53.804 MKTMP erasure_code/gf_vect_dot_prod_avx2.s 00:03:53.804 MKTMP erasure_code/gf_3vect_dot_prod_sse.s 00:03:53.804 MKTMP erasure_code/gf_4vect_dot_prod_sse.s 00:03:53.804 MKTMP erasure_code/gf_5vect_dot_prod_sse.s 00:03:53.804 MKTMP erasure_code/gf_6vect_dot_prod_sse.s 00:03:53.804 MKTMP erasure_code/gf_2vect_dot_prod_avx.s 00:03:53.804 MKTMP erasure_code/gf_3vect_dot_prod_avx.s 00:03:53.804 MKTMP erasure_code/gf_4vect_dot_prod_avx.s 00:03:53.804 MKTMP erasure_code/gf_5vect_dot_prod_avx.s 00:03:53.804 MKTMP erasure_code/gf_6vect_dot_prod_avx.s 00:03:53.804 MKTMP erasure_code/gf_2vect_dot_prod_avx2.s 00:03:53.804 MKTMP erasure_code/gf_3vect_dot_prod_avx2.s 00:03:53.804 MKTMP erasure_code/gf_4vect_dot_prod_avx2.s 00:03:53.804 MKTMP erasure_code/gf_5vect_dot_prod_avx2.s 00:03:53.804 MKTMP erasure_code/gf_6vect_dot_prod_avx2.s 00:03:53.804 
MKTMP erasure_code/gf_vect_mad_sse.s 00:03:53.804 MKTMP erasure_code/gf_3vect_mad_sse.s 00:03:53.804 MKTMP erasure_code/gf_2vect_mad_sse.s 00:03:53.804 MKTMP erasure_code/gf_4vect_mad_sse.s 00:03:53.804 MKTMP erasure_code/gf_5vect_mad_sse.s 00:03:53.804 MKTMP erasure_code/gf_6vect_mad_sse.s 00:03:53.804 MKTMP erasure_code/gf_vect_mad_avx.s 00:03:54.065 MKTMP erasure_code/gf_2vect_mad_avx.s 00:03:54.065 MKTMP erasure_code/gf_3vect_mad_avx.s 00:03:54.065 MKTMP erasure_code/gf_4vect_mad_avx.s 00:03:54.065 MKTMP erasure_code/gf_5vect_mad_avx.s 00:03:54.065 MKTMP erasure_code/gf_6vect_mad_avx.s 00:03:54.065 MKTMP erasure_code/gf_vect_mad_avx2.s 00:03:54.065 MKTMP erasure_code/gf_2vect_mad_avx2.s 00:03:54.065 MKTMP erasure_code/gf_3vect_mad_avx2.s 00:03:54.065 MKTMP erasure_code/gf_4vect_mad_avx2.s 00:03:54.065 MKTMP erasure_code/gf_5vect_mad_avx2.s 00:03:54.065 MKTMP erasure_code/gf_6vect_mad_avx2.s 00:03:54.065 MKTMP erasure_code/ec_multibinary.s 00:03:54.065 MKTMP erasure_code/gf_vect_dot_prod_avx512.s 00:03:54.065 MKTMP erasure_code/gf_2vect_dot_prod_avx512.s 00:03:54.065 MKTMP erasure_code/gf_3vect_dot_prod_avx512.s 00:03:54.065 MKTMP erasure_code/gf_4vect_dot_prod_avx512.s 00:03:54.065 MKTMP erasure_code/gf_5vect_dot_prod_avx512.s 00:03:54.065 MKTMP erasure_code/gf_6vect_dot_prod_avx512.s 00:03:54.065 MKTMP erasure_code/gf_vect_mad_avx512.s 00:03:54.065 MKTMP erasure_code/gf_2vect_mad_avx512.s 00:03:54.065 MKTMP erasure_code/gf_3vect_mad_avx512.s 00:03:54.065 MKTMP erasure_code/gf_5vect_mad_avx512.s 00:03:54.065 MKTMP erasure_code/gf_4vect_mad_avx512.s 00:03:54.065 MKTMP erasure_code/gf_6vect_mad_avx512.s 00:03:54.065 MKTMP raid/xor_gen_sse.s 00:03:54.065 MKTMP raid/pq_check_sse.s 00:03:54.065 MKTMP raid/pq_gen_sse.s 00:03:54.065 MKTMP raid/xor_check_sse.s 00:03:54.065 MKTMP raid/pq_gen_avx.s 00:03:54.065 MKTMP raid/pq_gen_avx2.s 00:03:54.065 MKTMP raid/xor_gen_avx.s 00:03:54.065 MKTMP raid/xor_gen_avx512.s 00:03:54.065 MKTMP raid/pq_gen_avx512.s 00:03:54.065 MKTMP crc/crc16_t10dif_01.s 00:03:54.065 MKTMP raid/raid_multibinary.s 00:03:54.065 MKTMP crc/crc16_t10dif_by4.s 00:03:54.065 MKTMP crc/crc16_t10dif_02.s 00:03:54.065 MKTMP crc/crc16_t10dif_copy_by4.s 00:03:54.065 MKTMP crc/crc16_t10dif_by16_10.s 00:03:54.065 MKTMP crc/crc16_t10dif_copy_by4_02.s 00:03:54.065 MKTMP crc/crc32_ieee_02.s 00:03:54.065 MKTMP crc/crc32_ieee_01.s 00:03:54.065 MKTMP crc/crc32_ieee_by4.s 00:03:54.065 MKTMP crc/crc32_iscsi_01.s 00:03:54.065 MKTMP crc/crc32_ieee_by16_10.s 00:03:54.065 MKTMP crc/crc_multibinary.s 00:03:54.065 MKTMP crc/crc32_iscsi_00.s 00:03:54.065 MKTMP crc/crc64_multibinary.s 00:03:54.065 MKTMP crc/crc64_ecma_refl_by8.s 00:03:54.065 MKTMP crc/crc64_ecma_refl_by16_10.s 00:03:54.065 MKTMP crc/crc64_iso_refl_by8.s 00:03:54.065 MKTMP crc/crc64_ecma_norm_by8.s 00:03:54.065 MKTMP crc/crc64_ecma_norm_by16_10.s 00:03:54.065 MKTMP crc/crc64_iso_refl_by16_10.s 00:03:54.065 MKTMP crc/crc64_iso_norm_by8.s 00:03:54.065 MKTMP crc/crc64_jones_refl_by8.s 00:03:54.065 MKTMP crc/crc64_iso_norm_by16_10.s 00:03:54.065 MKTMP crc/crc64_jones_refl_by16_10.s 00:03:54.065 MKTMP crc/crc64_jones_norm_by8.s 00:03:54.065 MKTMP crc/crc64_jones_norm_by16_10.s 00:03:54.065 MKTMP crc/crc32_gzip_refl_by8.s 00:03:54.065 MKTMP crc/crc32_gzip_refl_by8_02.s 00:03:54.065 MKTMP crc/crc32_gzip_refl_by16_10.s 00:03:54.065 MKTMP igzip/igzip_body.s 00:03:54.065 MKTMP igzip/igzip_finish.s 00:03:54.065 MKTMP igzip/igzip_icf_body_h1_gr_bt.s 00:03:54.065 MKTMP igzip/igzip_icf_finish.s 00:03:54.065 MKTMP igzip/rfc1951_lookup.s 00:03:54.065 
MKTMP igzip/adler32_avx2_4.s 00:03:54.065 MKTMP igzip/adler32_sse.s 00:03:54.065 MKTMP igzip/igzip_multibinary.s 00:03:54.065 MKTMP igzip/igzip_update_histogram_04.s 00:03:54.065 MKTMP igzip/igzip_update_histogram_01.s 00:03:54.065 MKTMP igzip/igzip_decode_block_stateless_01.s 00:03:54.065 MKTMP igzip/igzip_decode_block_stateless_04.s 00:03:54.065 MKTMP igzip/igzip_inflate_multibinary.s 00:03:54.065 MKTMP igzip/encode_df_04.s 00:03:54.065 MKTMP igzip/encode_df_06.s 00:03:54.065 MKTMP igzip/proc_heap.s 00:03:54.065 MKTMP igzip/igzip_deflate_hash.s 00:03:54.065 MKTMP igzip/igzip_gen_icf_map_lh1_06.s 00:03:54.065 MKTMP igzip/igzip_gen_icf_map_lh1_04.s 00:03:54.065 MKTMP igzip/igzip_set_long_icf_fg_04.s 00:03:54.065 MKTMP igzip/igzip_set_long_icf_fg_06.s 00:03:54.065 MKTMP mem/mem_zero_detect_sse.s 00:03:54.065 MKTMP mem/mem_zero_detect_avx.s 00:03:54.065 MKTMP mem/mem_multibinary.s 00:03:54.065 CC programs/igzip_cli.o 00:03:54.065 CC erasure_code/ec_base.lo 00:03:54.065 CC crc/crc64_base.lo 00:03:54.065 CC raid/raid_base.lo 00:03:54.065 CC crc/crc_base.lo 00:03:54.065 CC igzip/igzip.lo 00:03:54.065 CC igzip/hufftables_c.lo 00:03:54.065 CC igzip/igzip_base.lo 00:03:54.065 CC igzip/igzip_icf_base.lo 00:03:54.065 CC igzip/adler32_base.lo 00:03:54.065 CC igzip/flatten_ll.lo 00:03:54.065 CC igzip/encode_df.lo 00:03:54.065 CC igzip/igzip_icf_body.lo 00:03:54.065 CC igzip/igzip_inflate.lo 00:03:54.065 CC igzip/huff_codes.lo 00:03:54.065 CC mem/mem_zero_detect_base.lo 00:03:54.065 CCAS erasure_code/gf_vect_mul_sse.lo 00:03:54.065 CCAS erasure_code/gf_vect_mul_avx.lo 00:03:54.065 CCAS erasure_code/gf_vect_dot_prod_sse.lo 00:03:54.065 CCAS erasure_code/gf_vect_dot_prod_avx.lo 00:03:54.065 CCAS erasure_code/gf_vect_dot_prod_avx2.lo 00:03:54.065 CCAS erasure_code/gf_2vect_dot_prod_sse.lo 00:03:54.065 CCAS erasure_code/gf_3vect_dot_prod_sse.lo 00:03:54.065 CCAS erasure_code/gf_4vect_dot_prod_sse.lo 00:03:54.065 CCAS erasure_code/gf_5vect_dot_prod_sse.lo 00:03:54.065 CCAS erasure_code/gf_6vect_dot_prod_sse.lo 00:03:54.065 CCAS erasure_code/gf_2vect_dot_prod_avx.lo 00:03:54.065 CCAS erasure_code/gf_3vect_dot_prod_avx.lo 00:03:54.065 CCAS erasure_code/gf_4vect_dot_prod_avx.lo 00:03:54.065 CCAS erasure_code/gf_5vect_dot_prod_avx.lo 00:03:54.065 CCAS erasure_code/gf_6vect_dot_prod_avx.lo 00:03:54.065 CCAS erasure_code/gf_2vect_dot_prod_avx2.lo 00:03:54.065 CCAS erasure_code/gf_3vect_dot_prod_avx2.lo 00:03:54.065 CCAS erasure_code/gf_4vect_dot_prod_avx2.lo 00:03:54.065 CCAS erasure_code/gf_5vect_dot_prod_avx2.lo 00:03:54.065 CCAS erasure_code/gf_6vect_dot_prod_avx2.lo 00:03:54.065 CCAS erasure_code/gf_vect_mad_sse.lo 00:03:54.065 CCAS erasure_code/gf_2vect_mad_sse.lo 00:03:54.065 CCAS erasure_code/gf_3vect_mad_sse.lo 00:03:54.065 CCAS erasure_code/gf_5vect_mad_sse.lo 00:03:54.065 CCAS erasure_code/gf_4vect_mad_sse.lo 00:03:54.065 CCAS erasure_code/gf_6vect_mad_sse.lo 00:03:54.065 CCAS erasure_code/gf_vect_mad_avx.lo 00:03:54.065 CCAS erasure_code/gf_2vect_mad_avx.lo 00:03:54.065 CCAS erasure_code/gf_3vect_mad_avx.lo 00:03:54.065 CCAS erasure_code/gf_4vect_mad_avx.lo 00:03:54.065 CCAS erasure_code/gf_5vect_mad_avx.lo 00:03:54.065 CCAS erasure_code/gf_6vect_mad_avx.lo 00:03:54.065 CCAS erasure_code/gf_vect_mad_avx2.lo 00:03:54.065 CCAS erasure_code/gf_2vect_mad_avx2.lo 00:03:54.065 CCAS erasure_code/gf_3vect_mad_avx2.lo 00:03:54.065 CCAS erasure_code/gf_4vect_mad_avx2.lo 00:03:54.065 CCAS erasure_code/gf_5vect_mad_avx2.lo 00:03:54.065 CCAS erasure_code/gf_6vect_mad_avx2.lo 00:03:54.066 CCAS 
erasure_code/ec_multibinary.lo 00:03:54.066 CCAS erasure_code/gf_vect_dot_prod_avx512.lo 00:03:54.066 CCAS erasure_code/gf_2vect_dot_prod_avx512.lo 00:03:54.066 CCAS erasure_code/gf_3vect_dot_prod_avx512.lo 00:03:54.066 CCAS erasure_code/gf_4vect_dot_prod_avx512.lo 00:03:54.066 CCAS erasure_code/gf_5vect_dot_prod_avx512.lo 00:03:54.066 CCAS erasure_code/gf_vect_mad_avx512.lo 00:03:54.066 CCAS erasure_code/gf_3vect_mad_avx512.lo 00:03:54.066 CCAS erasure_code/gf_4vect_mad_avx512.lo 00:03:54.066 CCAS erasure_code/gf_2vect_mad_avx512.lo 00:03:54.066 CCAS erasure_code/gf_6vect_dot_prod_avx512.lo 00:03:54.066 CCAS erasure_code/gf_5vect_mad_avx512.lo 00:03:54.066 CCAS erasure_code/gf_6vect_mad_avx512.lo 00:03:54.066 CCAS raid/pq_gen_sse.lo 00:03:54.066 CCAS raid/xor_gen_sse.lo 00:03:54.066 CCAS raid/xor_check_sse.lo 00:03:54.066 CCAS raid/pq_check_sse.lo 00:03:54.066 CCAS raid/pq_gen_avx.lo 00:03:54.066 CCAS raid/xor_gen_avx.lo 00:03:54.066 CCAS raid/pq_gen_avx2.lo 00:03:54.066 CCAS raid/pq_gen_avx512.lo 00:03:54.066 CCAS raid/raid_multibinary.lo 00:03:54.066 CCAS crc/crc16_t10dif_01.lo 00:03:54.066 CCAS raid/xor_gen_avx512.lo 00:03:54.066 CCAS crc/crc16_t10dif_by4.lo 00:03:54.066 CCAS crc/crc16_t10dif_02.lo 00:03:54.066 CCAS crc/crc16_t10dif_by16_10.lo 00:03:54.066 CCAS crc/crc16_t10dif_copy_by4.lo 00:03:54.066 CCAS crc/crc16_t10dif_copy_by4_02.lo 00:03:54.066 CCAS crc/crc32_ieee_01.lo 00:03:54.066 CCAS crc/crc32_ieee_02.lo 00:03:54.066 CCAS crc/crc32_ieee_by4.lo 00:03:54.066 CCAS crc/crc32_ieee_by16_10.lo 00:03:54.066 CCAS crc/crc32_iscsi_01.lo 00:03:54.066 CCAS crc/crc32_iscsi_00.lo 00:03:54.066 CCAS crc/crc_multibinary.lo 00:03:54.066 CCAS crc/crc64_multibinary.lo 00:03:54.066 CCAS crc/crc64_ecma_refl_by8.lo 00:03:54.066 CCAS crc/crc64_ecma_refl_by16_10.lo 00:03:54.066 CCAS crc/crc64_ecma_norm_by8.lo 00:03:54.066 CCAS crc/crc64_ecma_norm_by16_10.lo 00:03:54.066 CCAS crc/crc64_iso_refl_by8.lo 00:03:54.066 CCAS crc/crc64_iso_refl_by16_10.lo 00:03:54.066 CCAS crc/crc64_iso_norm_by8.lo 00:03:54.066 CCAS crc/crc64_iso_norm_by16_10.lo 00:03:54.066 CCAS crc/crc64_jones_refl_by8.lo 00:03:54.066 CCAS crc/crc64_jones_refl_by16_10.lo 00:03:54.066 CCAS crc/crc64_jones_norm_by8.lo 00:03:54.066 CCAS crc/crc64_jones_norm_by16_10.lo 00:03:54.066 CCAS crc/crc32_gzip_refl_by8.lo 00:03:54.066 CCAS crc/crc32_gzip_refl_by8_02.lo 00:03:54.066 CCAS crc/crc32_gzip_refl_by16_10.lo 00:03:54.325 CCAS igzip/igzip_body.lo 00:03:54.325 CCAS igzip/igzip_finish.lo 00:03:54.325 CCAS igzip/igzip_icf_body_h1_gr_bt.lo 00:03:54.325 CCAS igzip/igzip_icf_finish.lo 00:03:54.325 CCAS igzip/rfc1951_lookup.lo 00:03:54.325 CCAS igzip/adler32_sse.lo 00:03:54.325 CCAS igzip/adler32_avx2_4.lo 00:03:54.325 CCAS igzip/igzip_multibinary.lo 00:03:54.325 CCAS igzip/igzip_update_histogram_01.lo 00:03:54.325 CCAS igzip/igzip_update_histogram_04.lo 00:03:54.325 CCAS igzip/igzip_decode_block_stateless_04.lo 00:03:54.325 CCAS igzip/igzip_decode_block_stateless_01.lo 00:03:54.325 CCAS igzip/igzip_inflate_multibinary.lo 00:03:54.325 CCAS igzip/encode_df_04.lo 00:03:54.325 CCAS igzip/encode_df_06.lo 00:03:54.325 CCAS igzip/proc_heap.lo 00:03:54.325 CCAS igzip/igzip_deflate_hash.lo 00:03:54.325 CCAS igzip/igzip_gen_icf_map_lh1_06.lo 00:03:54.325 CCAS igzip/igzip_gen_icf_map_lh1_04.lo 00:03:54.325 CCAS igzip/igzip_set_long_icf_fg_04.lo 00:03:54.325 CCAS igzip/igzip_set_long_icf_fg_06.lo 00:03:54.325 CCAS mem/mem_zero_detect_avx.lo 00:03:54.325 CCAS mem/mem_zero_detect_sse.lo 00:03:54.325 CCAS mem/mem_multibinary.lo 00:03:58.509 CCLD libisal.la 
00:03:58.509 CCLD programs/igzip 00:03:58.767 rm erasure_code/gf_5vect_dot_prod_avx512.s erasure_code/gf_3vect_mad_avx.s erasure_code/gf_5vect_dot_prod_avx2.s erasure_code/gf_6vect_dot_prod_avx.s crc/crc16_t10dif_01.s crc/crc32_iscsi_00.s erasure_code/gf_5vect_dot_prod_avx.s igzip/encode_df_04.s erasure_code/gf_6vect_mad_sse.s erasure_code/gf_4vect_dot_prod_sse.s erasure_code/gf_5vect_mad_avx512.s crc/crc16_t10dif_copy_by4.s erasure_code/gf_5vect_mad_avx2.s erasure_code/gf_vect_mad_avx2.s igzip/proc_heap.s erasure_code/gf_3vect_dot_prod_sse.s igzip/igzip_set_long_icf_fg_06.s crc/crc64_jones_refl_by8.s erasure_code/gf_vect_dot_prod_avx2.s igzip/encode_df_06.s crc/crc_multibinary.s erasure_code/gf_4vect_mad_avx512.s erasure_code/gf_2vect_mad_avx2.s erasure_code/gf_4vect_mad_avx.s igzip/igzip_set_long_icf_fg_04.s crc/crc64_iso_refl_by8.s crc/crc16_t10dif_by16_10.s erasure_code/gf_2vect_dot_prod_avx2.s igzip/igzip_gen_icf_map_lh1_04.s raid/xor_check_sse.s erasure_code/gf_5vect_mad_avx.s raid/pq_gen_sse.s erasure_code/gf_vect_mad_avx.s erasure_code/gf_5vect_dot_prod_sse.s erasure_code/ec_multibinary.s crc/crc64_iso_norm_by16_10.s igzip/rfc1951_lookup.s raid/pq_gen_avx2.s erasure_code/gf_6vect_mad_avx.s crc/crc32_gzip_refl_by8.s igzip/igzip_gen_icf_map_lh1_06.s erasure_code/gf_3vect_dot_prod_avx2.s erasure_code/gf_2vect_mad_avx512.s igzip/igzip_update_histogram_04.s crc/crc64_ecma_norm_by16_10.s crc/crc32_ieee_by4.s erasure_code/gf_4vect_dot_prod_avx.s crc/crc16_t10dif_02.s erasure_code/gf_2vect_mad_sse.s raid/xor_gen_sse.s erasure_code/gf_5vect_mad_sse.s erasure_code/gf_3vect_dot_prod_avx512.s erasure_code/gf_3vect_mad_avx512.s raid/pq_gen_avx.s erasure_code/gf_2vect_dot_prod_sse.s igzip/igzip_multibinary.s igzip/igzip_deflate_hash.s erasure_code/gf_vect_mad_avx512.s raid/pq_gen_avx512.s igzip/adler32_sse.s crc/crc32_iscsi_01.s crc/crc16_t10dif_by4.s erasure_code/gf_6vect_dot_prod_avx2.s crc/crc32_gzip_refl_by16_10.s raid/xor_gen_avx512.s erasure_code/gf_vect_dot_prod_avx.s igzip/igzip_icf_finish.s erasure_code/gf_vect_mad_sse.s erasure_code/gf_vect_mul_sse.s erasure_code/gf_6vect_mad_avx512.s igzip/igzip_decode_block_stateless_04.s erasure_code/gf_6vect_mad_avx2.s crc/crc64_ecma_refl_by16_10.s raid/xor_gen_avx.s erasure_code/gf_6vect_dot_prod_avx512.s erasure_code/gf_2vect_mad_avx.s erasure_code/gf_2vect_dot_prod_avx512.s crc/crc32_ieee_by16_10.s crc/crc64_iso_refl_by16_10.s erasure_code/gf_3vect_mad_sse.s raid/pq_check_sse.s erasure_code/gf_2vect_dot_prod_avx.s mem/mem_zero_detect_avx.s crc/crc32_ieee_01.s crc/crc64_jones_refl_by16_10.s crc/crc64_multibinary.s mem/mem_multibinary.s raid/raid_multibinary.s erasure_code/gf_3vect_dot_prod_avx.s crc/crc32_ieee_02.s mem/mem_zero_detect_sse.s igzip/igzip_decode_block_stateless_01.s erasure_code/gf_4vect_dot_prod_avx2.s crc/crc32_gzip_refl_by8_02.s igzip/igzip_finish.s erasure_code/gf_4vect_mad_avx2.s crc/crc16_t10dif_copy_by4_02.s erasure_code/gf_vect_dot_prod_sse.s erasure_code/gf_3vect_mad_avx2.s erasure_code/gf_vect_mul_avx.s igzip/adler32_avx2_4.s erasure_code/gf_4vect_mad_sse.s igzip/igzip_inflate_multibinary.s crc/crc64_ecma_norm_by8.s igzip/igzip_body.s erasure_code/gf_6vect_dot_prod_sse.s crc/crc64_jones_norm_by16_10.s crc/crc64_iso_norm_by8.s crc/crc64_jones_norm_by8.s erasure_code/gf_4vect_dot_prod_avx512.s crc/crc64_ecma_refl_by8.s igzip/igzip_update_histogram_01.s igzip/igzip_icf_body_h1_gr_bt.s erasure_code/gf_vect_dot_prod_avx512.s 00:03:58.767 06:20:12 build_native_dpdk -- common/autobuild_common.sh@148 -- $ make install 00:03:58.767 
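[Editor's note] The isa-l phase above (autoconf-configured, libtool-driven "make all" that assembles the nasm/yasm .s sources, followed by "make install" into a workspace-local prefix) can be approximated by hand outside the autobuild script. A minimal sketch only, assuming the same checkout and the prefix/cflags reported in the configure summary above; the autogen.sh step and exact flags are illustrative assumptions, not a transcript of autobuild_common.sh:

  cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l
  ./autogen.sh                                   # regenerate ./configure (autotools project); assumed step
  ./configure --prefix="$PWD/build" CFLAGS='-fPIC -g -O2'
  make -j"$(nproc)" all                          # builds the CC/CCAS objects and links libisal.la
  make install                                   # installs libisal.{a,so}, isa-l.h and libisal.pc under $PWD/build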
make --no-print-directory install-am 00:03:58.767 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib' 00:03:58.767 /bin/sh ./libtool --mode=install /usr/bin/install -c libisal.la '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib' 00:03:59.026 libtool: install: /usr/bin/install -c .libs/libisal.so.2.0.29 /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.so.2.0.29 00:03:59.026 libtool: install: (cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib && { ln -s -f libisal.so.2.0.29 libisal.so.2 || { rm -f libisal.so.2 && ln -s libisal.so.2.0.29 libisal.so.2; }; }) 00:03:59.026 libtool: install: (cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib && { ln -s -f libisal.so.2.0.29 libisal.so || { rm -f libisal.so && ln -s libisal.so.2.0.29 libisal.so; }; }) 00:03:59.026 libtool: install: /usr/bin/install -c .libs/libisal.lai /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.la 00:03:59.026 libtool: install: /usr/bin/install -c .libs/libisal.a /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.a 00:03:59.026 libtool: install: chmod 644 /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.a 00:03:59.026 libtool: install: ranlib /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.a 00:03:59.026 libtool: finish: PATH="/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin:/sbin" ldconfig -n /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:03:59.026 ---------------------------------------------------------------------- 00:03:59.026 Libraries have been installed in: 00:03:59.026 /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:03:59.026 00:03:59.026 If you ever happen to want to link against installed libraries 00:03:59.026 in a given directory, LIBDIR, you must either use libtool, and 00:03:59.026 specify the full pathname of the library, or use the '-LLIBDIR' 00:03:59.026 flag during linking and do at least one of the following: 00:03:59.026 - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable 00:03:59.026 during execution 00:03:59.026 - add LIBDIR to the 'LD_RUN_PATH' environment variable 00:03:59.026 during linking 00:03:59.026 - use the '-Wl,-rpath -Wl,LIBDIR' linker flag 00:03:59.026 - have your system administrator add LIBDIR to '/etc/ld.so.conf' 00:03:59.026 00:03:59.026 See any operating system documentation about shared libraries for 00:03:59.026 more information, such as the ld(1) and ld.so(8) manual pages. 
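[Editor's note] The libtool notice above is generic guidance for anything that later links against the freshly installed libisal; this job takes the LD_LIBRARY_PATH route via the export that appears further down in the log. As a minimal illustration of the two alternatives the notice describes, with LIBDIR set to this job's install directory and app.c standing in for a hypothetical consumer of the library (both the program and the compile lines are illustrative, not part of the build):

  LIBDIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib
  # option 1: leave the path out of the binary and supply it to the dynamic loader at run time
  gcc app.c -I"${LIBDIR%/lib}/include" -L"$LIBDIR" -lisal -o app
  LD_LIBRARY_PATH="$LIBDIR" ./app
  # option 2: embed the search path into the binary at link time, so no environment is needed
  gcc app.c -I"${LIBDIR%/lib}/include" -L"$LIBDIR" -Wl,-rpath,"$LIBDIR" -lisal -o app
  ./app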
00:03:59.026 ---------------------------------------------------------------------- 00:03:59.026 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/bin' 00:03:59.026 /bin/sh ./libtool --mode=install /usr/bin/install -c programs/igzip '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/bin' 00:03:59.285 libtool: install: /usr/bin/install -c programs/.libs/igzip /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/bin/igzip 00:03:59.285 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/share/man/man1' 00:03:59.285 /usr/bin/install -c -m 644 programs/igzip.1 '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/share/man/man1' 00:03:59.285 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include' 00:03:59.285 /usr/bin/install -c -m 644 isa-l.h '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include/.' 00:03:59.285 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig' 00:03:59.285 /usr/bin/install -c -m 644 libisal.pc '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig' 00:03:59.285 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include/isa-l' 00:03:59.285 /usr/bin/install -c -m 644 include/test.h include/types.h include/crc.h include/crc64.h include/erasure_code.h include/gf_vect_mul.h include/igzip_lib.h include/mem_routines.h include/raid.h '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include/isa-l' 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@149 -- $ DPDK_DRIVERS+=("compress") 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@150 -- $ DPDK_DRIVERS+=("compress/isal") 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@151 -- $ DPDK_DRIVERS+=("compress/qat") 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@152 -- $ DPDK_DRIVERS+=("common/qat") 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@153 -- $ ge 23.11.0 21.02.0 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '>=' 21.02.0 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=>=' 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@350 -- $ local d=23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@352 -- $ echo 23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@364 -- $ return 0 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@156 -- $ test y = n 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@161 -- $ DPDK_DRIVERS+=("compress/mlx5") 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@163 -- $ export PKG_CONFIG_PATH=:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@163 -- $ PKG_CONFIG_PATH=:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@164 -- $ export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@164 -- $ LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/dpdk 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:03:59.285 06:20:12 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@370 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=<' 00:03:59.285 06:20:12 
build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@342 -- $ : 1 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@350 -- $ local d=23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@352 -- $ echo 23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=23 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:03:59.285 06:20:12 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:03:59.543 06:20:12 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:03:59.543 06:20:12 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:03:59.543 06:20:12 build_native_dpdk -- scripts/common.sh@364 -- $ return 1 00:03:59.543 06:20:12 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:03:59.543 patching file config/rte_config.h 00:03:59.543 Hunk #1 succeeded at 60 (offset 1 line). 00:03:59.543 06:20:12 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 24.07.0 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@370 -- $ cmp_versions 23.11.0 '<' 24.07.0 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=<' 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@342 -- $ : 1 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 23 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@350 -- $ local d=23 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@352 -- $ echo 23 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=23 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 24 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@350 -- $ local d=24 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@352 -- $ echo 24 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=24 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@365 -- $ (( ver1[v] < ver2[v] )) 00:03:59.544 06:20:12 build_native_dpdk -- scripts/common.sh@365 -- $ return 0 00:03:59.544 06:20:12 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:03:59.544 patching file lib/pcapng/rte_pcapng.c 00:03:59.544 06:20:12 build_native_dpdk -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false 00:03:59.544 06:20:12 build_native_dpdk -- common/autobuild_common.sh@181 -- $ uname -s 00:03:59.544 06:20:12 build_native_dpdk -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']' 00:03:59.544 06:20:12 build_native_dpdk -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base crypto crypto/ipsec_mb crypto/qat compress/qat common/qat bus/auxiliary common/mlx5 common/mlx5/linux crypto/mlx5 compress compress/isal compress/qat common/qat compress/mlx5 00:03:59.544 06:20:12 build_native_dpdk -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false '-Dc_link_args= -L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,crypto,crypto/ipsec_mb,crypto/qat,compress/qat,common/qat,bus/auxiliary,common/mlx5,common/mlx5/linux,crypto/mlx5,compress,compress/isal,compress/qat,common/qat,compress/mlx5, 00:04:04.814 The Meson build system 00:04:04.814 Version: 1.3.1 00:04:04.814 Source dir: /var/jenkins/workspace/crypto-phy-autotest/dpdk 00:04:04.814 Build dir: /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp 00:04:04.814 Build type: native build 00:04:04.814 Program cat found: YES (/usr/bin/cat) 00:04:04.814 Project name: DPDK 00:04:04.814 Project version: 23.11.0 00:04:04.814 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:04:04.814 C linker for the host machine: gcc ld.bfd 2.39-16 00:04:04.814 Host machine cpu family: x86_64 00:04:04.814 Host machine cpu: x86_64 00:04:04.814 Message: ## Building in Developer Mode ## 00:04:04.814 Program pkg-config found: YES (/usr/bin/pkg-config) 00:04:04.814 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:04:04.814 Program options-ibverbs-static.sh found: YES 
(/var/jenkins/workspace/crypto-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:04:04.814 Program python3 found: YES (/usr/bin/python3) 00:04:04.814 Program cat found: YES (/usr/bin/cat) 00:04:04.814 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 00:04:04.814 Compiler for C supports arguments -march=native: YES 00:04:04.814 Checking for size of "void *" : 8 00:04:04.814 Checking for size of "void *" : 8 (cached) 00:04:04.814 Library m found: YES 00:04:04.814 Library numa found: YES 00:04:04.814 Has header "numaif.h" : YES 00:04:04.814 Library fdt found: NO 00:04:04.814 Library execinfo found: NO 00:04:04.814 Has header "execinfo.h" : YES 00:04:04.814 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:04:04.814 Run-time dependency libarchive found: NO (tried pkgconfig) 00:04:04.814 Run-time dependency libbsd found: NO (tried pkgconfig) 00:04:04.814 Run-time dependency jansson found: NO (tried pkgconfig) 00:04:04.814 Run-time dependency openssl found: YES 3.0.9 00:04:04.814 Run-time dependency libpcap found: YES 1.10.4 00:04:04.814 Has header "pcap.h" with dependency libpcap: YES 00:04:04.814 Compiler for C supports arguments -Wcast-qual: YES 00:04:04.814 Compiler for C supports arguments -Wdeprecated: YES 00:04:04.814 Compiler for C supports arguments -Wformat: YES 00:04:04.814 Compiler for C supports arguments -Wformat-nonliteral: NO 00:04:04.814 Compiler for C supports arguments -Wformat-security: NO 00:04:04.814 Compiler for C supports arguments -Wmissing-declarations: YES 00:04:04.814 Compiler for C supports arguments -Wmissing-prototypes: YES 00:04:04.814 Compiler for C supports arguments -Wnested-externs: YES 00:04:04.814 Compiler for C supports arguments -Wold-style-definition: YES 00:04:04.814 Compiler for C supports arguments -Wpointer-arith: YES 00:04:04.814 Compiler for C supports arguments -Wsign-compare: YES 00:04:04.814 Compiler for C supports arguments -Wstrict-prototypes: YES 00:04:04.814 Compiler for C supports arguments -Wundef: YES 00:04:04.814 Compiler for C supports arguments -Wwrite-strings: YES 00:04:04.814 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:04:04.814 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:04:04.814 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:04:04.814 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:04:04.814 Program objdump found: YES (/usr/bin/objdump) 00:04:04.814 Compiler for C supports arguments -mavx512f: YES 00:04:04.814 Checking if "AVX512 checking" compiles: YES 00:04:04.814 Fetching value of define "__SSE4_2__" : 1 00:04:04.814 Fetching value of define "__AES__" : 1 00:04:04.814 Fetching value of define "__AVX__" : 1 00:04:04.814 Fetching value of define "__AVX2__" : 1 00:04:04.814 Fetching value of define "__AVX512BW__" : 1 00:04:04.814 Fetching value of define "__AVX512CD__" : 1 00:04:04.814 Fetching value of define "__AVX512DQ__" : 1 00:04:04.814 Fetching value of define "__AVX512F__" : 1 00:04:04.814 Fetching value of define "__AVX512VL__" : 1 00:04:04.814 Fetching value of define "__PCLMUL__" : 1 00:04:04.814 Fetching value of define "__RDRND__" : 1 00:04:04.814 Fetching value of define "__RDSEED__" : 1 00:04:04.814 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:04:04.814 Fetching value of define "__znver1__" : (undefined) 00:04:04.814 Fetching value of define "__znver2__" : (undefined) 00:04:04.814 Fetching value of define "__znver3__" : (undefined) 
00:04:04.814 Fetching value of define "__znver4__" : (undefined) 00:04:04.814 Compiler for C supports arguments -Wno-format-truncation: YES 00:04:04.814 Message: lib/log: Defining dependency "log" 00:04:04.814 Message: lib/kvargs: Defining dependency "kvargs" 00:04:04.814 Message: lib/telemetry: Defining dependency "telemetry" 00:04:04.814 Checking for function "getentropy" : NO 00:04:04.814 Message: lib/eal: Defining dependency "eal" 00:04:04.814 Message: lib/ring: Defining dependency "ring" 00:04:04.814 Message: lib/rcu: Defining dependency "rcu" 00:04:04.814 Message: lib/mempool: Defining dependency "mempool" 00:04:04.814 Message: lib/mbuf: Defining dependency "mbuf" 00:04:04.814 Fetching value of define "__PCLMUL__" : 1 (cached) 00:04:04.814 Fetching value of define "__AVX512F__" : 1 (cached) 00:04:04.814 Fetching value of define "__AVX512BW__" : 1 (cached) 00:04:04.814 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:04:04.814 Fetching value of define "__AVX512VL__" : 1 (cached) 00:04:04.814 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:04:04.814 Compiler for C supports arguments -mpclmul: YES 00:04:04.814 Compiler for C supports arguments -maes: YES 00:04:04.814 Compiler for C supports arguments -mavx512f: YES (cached) 00:04:04.814 Compiler for C supports arguments -mavx512bw: YES 00:04:04.814 Compiler for C supports arguments -mavx512dq: YES 00:04:04.814 Compiler for C supports arguments -mavx512vl: YES 00:04:04.814 Compiler for C supports arguments -mvpclmulqdq: YES 00:04:04.814 Compiler for C supports arguments -mavx2: YES 00:04:04.814 Compiler for C supports arguments -mavx: YES 00:04:04.814 Message: lib/net: Defining dependency "net" 00:04:04.814 Message: lib/meter: Defining dependency "meter" 00:04:04.814 Message: lib/ethdev: Defining dependency "ethdev" 00:04:04.815 Message: lib/pci: Defining dependency "pci" 00:04:04.815 Message: lib/cmdline: Defining dependency "cmdline" 00:04:04.815 Message: lib/metrics: Defining dependency "metrics" 00:04:04.815 Message: lib/hash: Defining dependency "hash" 00:04:04.815 Message: lib/timer: Defining dependency "timer" 00:04:04.815 Fetching value of define "__AVX512F__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512VL__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512CD__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512BW__" : 1 (cached) 00:04:04.815 Message: lib/acl: Defining dependency "acl" 00:04:04.815 Message: lib/bbdev: Defining dependency "bbdev" 00:04:04.815 Message: lib/bitratestats: Defining dependency "bitratestats" 00:04:04.815 Run-time dependency libelf found: YES 0.190 00:04:04.815 Message: lib/bpf: Defining dependency "bpf" 00:04:04.815 Message: lib/cfgfile: Defining dependency "cfgfile" 00:04:04.815 Message: lib/compressdev: Defining dependency "compressdev" 00:04:04.815 Message: lib/cryptodev: Defining dependency "cryptodev" 00:04:04.815 Message: lib/distributor: Defining dependency "distributor" 00:04:04.815 Message: lib/dmadev: Defining dependency "dmadev" 00:04:04.815 Message: lib/efd: Defining dependency "efd" 00:04:04.815 Message: lib/eventdev: Defining dependency "eventdev" 00:04:04.815 Message: lib/dispatcher: Defining dependency "dispatcher" 00:04:04.815 Message: lib/gpudev: Defining dependency "gpudev" 00:04:04.815 Message: lib/gro: Defining dependency "gro" 00:04:04.815 Message: lib/gso: Defining dependency "gso" 00:04:04.815 Message: lib/ip_frag: Defining dependency "ip_frag" 00:04:04.815 Message: lib/jobstats: Defining dependency "jobstats" 
00:04:04.815 Message: lib/latencystats: Defining dependency "latencystats" 00:04:04.815 Message: lib/lpm: Defining dependency "lpm" 00:04:04.815 Fetching value of define "__AVX512F__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512IFMA__" : (undefined) 00:04:04.815 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:04:04.815 Message: lib/member: Defining dependency "member" 00:04:04.815 Message: lib/pcapng: Defining dependency "pcapng" 00:04:04.815 Compiler for C supports arguments -Wno-cast-qual: YES 00:04:04.815 Message: lib/power: Defining dependency "power" 00:04:04.815 Message: lib/rawdev: Defining dependency "rawdev" 00:04:04.815 Message: lib/regexdev: Defining dependency "regexdev" 00:04:04.815 Message: lib/mldev: Defining dependency "mldev" 00:04:04.815 Message: lib/rib: Defining dependency "rib" 00:04:04.815 Message: lib/reorder: Defining dependency "reorder" 00:04:04.815 Message: lib/sched: Defining dependency "sched" 00:04:04.815 Message: lib/security: Defining dependency "security" 00:04:04.815 Message: lib/stack: Defining dependency "stack" 00:04:04.815 Has header "linux/userfaultfd.h" : YES 00:04:04.815 Has header "linux/vduse.h" : YES 00:04:04.815 Message: lib/vhost: Defining dependency "vhost" 00:04:04.815 Message: lib/ipsec: Defining dependency "ipsec" 00:04:04.815 Message: lib/pdcp: Defining dependency "pdcp" 00:04:04.815 Fetching value of define "__AVX512F__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:04:04.815 Fetching value of define "__AVX512BW__" : 1 (cached) 00:04:04.815 Message: lib/fib: Defining dependency "fib" 00:04:04.815 Message: lib/port: Defining dependency "port" 00:04:04.815 Message: lib/pdump: Defining dependency "pdump" 00:04:04.815 Message: lib/table: Defining dependency "table" 00:04:04.815 Message: lib/pipeline: Defining dependency "pipeline" 00:04:04.815 Message: lib/graph: Defining dependency "graph" 00:04:04.815 Message: lib/node: Defining dependency "node" 00:04:04.815 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:04:10.091 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:04:10.091 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:04:10.091 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:04:10.091 Compiler for C supports arguments -std=c11: YES 00:04:10.091 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:04:10.091 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:04:10.091 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:04:10.091 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:04:10.091 Run-time dependency libmlx5 found: YES 1.24.44.0 00:04:10.091 Run-time dependency libibverbs found: YES 1.14.44.0 00:04:10.091 Library mtcr_ul found: NO 00:04:10.091 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header 
"infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:04:10.091 Header "linux/ethtool.h" has symbol 
"SUPPORTED_40000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_25000baseCR_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_50000baseCR2_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_100000baseKR4_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/verbs.h" 
has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:04:10.091 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:04:12.627 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:04:12.627 Configuring mlx5_autoconf.h using configuration 00:04:12.627 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:04:12.627 Run-time dependency libcrypto found: YES 3.0.9 00:04:12.627 Library IPSec_MB found: YES 00:04:12.627 Fetching value of define "IMB_VERSION_STR" : "1.0.0" 00:04:12.627 Message: drivers/common/qat: Defining dependency "common_qat" 00:04:12.627 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:04:12.627 Compiler for C supports arguments -Wno-sign-compare: YES 00:04:12.627 Compiler for C supports arguments -Wno-unused-value: YES 00:04:12.627 Compiler for C supports arguments -Wno-format: YES 00:04:12.627 Compiler for C supports arguments -Wno-format-security: YES 00:04:12.627 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:04:12.627 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:04:12.627 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:04:12.627 Compiler for C supports arguments -Wno-unused-parameter: YES 00:04:12.627 Fetching value of define "__AVX512F__" : 1 (cached) 00:04:12.627 Fetching value of define "__AVX512BW__" : 1 (cached) 00:04:12.627 Compiler for C supports arguments -mavx512f: YES (cached) 00:04:12.627 Compiler for C supports arguments -mavx512bw: YES (cached) 00:04:12.627 Compiler for C supports arguments -march=skylake-avx512: YES 00:04:12.627 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:04:12.627 Library IPSec_MB found: YES 00:04:12.627 Fetching value of define "IMB_VERSION_STR" : "1.0.0" (cached) 00:04:12.628 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 
00:04:12.628 Compiler for C supports arguments -std=c11: YES (cached) 00:04:12.628 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:04:12.628 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:04:12.628 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:04:12.628 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:04:12.628 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:04:12.628 Run-time dependency libisal found: YES 2.29.0 00:04:12.628 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:04:12.628 Compiler for C supports arguments -std=c11: YES (cached) 00:04:12.628 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:04:12.628 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:04:12.628 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:04:12.628 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:04:12.628 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:04:12.628 Has header "sys/epoll.h" : YES 00:04:12.628 Program doxygen found: YES (/usr/bin/doxygen) 00:04:12.628 Configuring doxy-api-html.conf using configuration 00:04:12.628 Configuring doxy-api-man.conf using configuration 00:04:12.628 Program mandb found: YES (/usr/bin/mandb) 00:04:12.628 Program sphinx-build found: NO 00:04:12.628 Configuring rte_build_config.h using configuration 00:04:12.628 Message: 00:04:12.628 ================= 00:04:12.628 Applications Enabled 00:04:12.628 ================= 00:04:12.628 00:04:12.628 apps: 00:04:12.628 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:04:12.628 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:04:12.628 test-pmd, test-regex, test-sad, test-security-perf, 00:04:12.628 00:04:12.628 Message: 00:04:12.628 ================= 00:04:12.628 Libraries Enabled 00:04:12.628 ================= 00:04:12.628 00:04:12.628 libs: 00:04:12.628 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:04:12.628 net, meter, ethdev, pci, cmdline, metrics, hash, timer, 00:04:12.628 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 00:04:12.628 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 00:04:12.628 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 00:04:12.628 mldev, rib, reorder, sched, security, stack, vhost, ipsec, 00:04:12.628 pdcp, fib, port, pdump, table, pipeline, graph, node, 00:04:12.628 00:04:12.628 00:04:12.628 Message: 00:04:12.628 =============== 00:04:12.628 Drivers Enabled 00:04:12.628 =============== 00:04:12.628 00:04:12.628 common: 00:04:12.628 mlx5, qat, 00:04:12.628 bus: 00:04:12.628 auxiliary, pci, vdev, 00:04:12.628 mempool: 00:04:12.628 ring, 00:04:12.628 dma: 00:04:12.628 00:04:12.628 net: 00:04:12.628 i40e, 00:04:12.628 raw: 00:04:12.628 00:04:12.628 crypto: 00:04:12.628 ipsec_mb, mlx5, 00:04:12.628 compress: 00:04:12.628 isal, mlx5, 00:04:12.628 regex: 00:04:12.628 00:04:12.628 ml: 00:04:12.628 00:04:12.628 vdpa: 00:04:12.628 00:04:12.628 event: 00:04:12.628 00:04:12.628 baseband: 00:04:12.628 00:04:12.628 gpu: 00:04:12.628 00:04:12.628 00:04:13.195 Message: 00:04:13.195 ================= 00:04:13.195 Content Skipped 00:04:13.195 ================= 00:04:13.195 00:04:13.195 apps: 00:04:13.195 00:04:13.195 libs: 00:04:13.195 00:04:13.195 drivers: 00:04:13.195 common/cpt: not in enabled drivers 
build config 00:04:13.195 common/dpaax: not in enabled drivers build config 00:04:13.195 common/iavf: not in enabled drivers build config 00:04:13.195 common/idpf: not in enabled drivers build config 00:04:13.195 common/mvep: not in enabled drivers build config 00:04:13.195 common/octeontx: not in enabled drivers build config 00:04:13.195 bus/cdx: not in enabled drivers build config 00:04:13.195 bus/dpaa: not in enabled drivers build config 00:04:13.195 bus/fslmc: not in enabled drivers build config 00:04:13.195 bus/ifpga: not in enabled drivers build config 00:04:13.195 bus/platform: not in enabled drivers build config 00:04:13.195 bus/vmbus: not in enabled drivers build config 00:04:13.195 common/cnxk: not in enabled drivers build config 00:04:13.195 common/nfp: not in enabled drivers build config 00:04:13.195 common/sfc_efx: not in enabled drivers build config 00:04:13.195 mempool/bucket: not in enabled drivers build config 00:04:13.195 mempool/cnxk: not in enabled drivers build config 00:04:13.195 mempool/dpaa: not in enabled drivers build config 00:04:13.195 mempool/dpaa2: not in enabled drivers build config 00:04:13.195 mempool/octeontx: not in enabled drivers build config 00:04:13.195 mempool/stack: not in enabled drivers build config 00:04:13.195 dma/cnxk: not in enabled drivers build config 00:04:13.195 dma/dpaa: not in enabled drivers build config 00:04:13.195 dma/dpaa2: not in enabled drivers build config 00:04:13.195 dma/hisilicon: not in enabled drivers build config 00:04:13.195 dma/idxd: not in enabled drivers build config 00:04:13.195 dma/ioat: not in enabled drivers build config 00:04:13.195 dma/skeleton: not in enabled drivers build config 00:04:13.196 net/af_packet: not in enabled drivers build config 00:04:13.196 net/af_xdp: not in enabled drivers build config 00:04:13.196 net/ark: not in enabled drivers build config 00:04:13.196 net/atlantic: not in enabled drivers build config 00:04:13.196 net/avp: not in enabled drivers build config 00:04:13.196 net/axgbe: not in enabled drivers build config 00:04:13.196 net/bnx2x: not in enabled drivers build config 00:04:13.196 net/bnxt: not in enabled drivers build config 00:04:13.196 net/bonding: not in enabled drivers build config 00:04:13.196 net/cnxk: not in enabled drivers build config 00:04:13.196 net/cpfl: not in enabled drivers build config 00:04:13.196 net/cxgbe: not in enabled drivers build config 00:04:13.196 net/dpaa: not in enabled drivers build config 00:04:13.196 net/dpaa2: not in enabled drivers build config 00:04:13.196 net/e1000: not in enabled drivers build config 00:04:13.196 net/ena: not in enabled drivers build config 00:04:13.196 net/enetc: not in enabled drivers build config 00:04:13.196 net/enetfec: not in enabled drivers build config 00:04:13.196 net/enic: not in enabled drivers build config 00:04:13.196 net/failsafe: not in enabled drivers build config 00:04:13.196 net/fm10k: not in enabled drivers build config 00:04:13.196 net/gve: not in enabled drivers build config 00:04:13.196 net/hinic: not in enabled drivers build config 00:04:13.196 net/hns3: not in enabled drivers build config 00:04:13.196 net/iavf: not in enabled drivers build config 00:04:13.196 net/ice: not in enabled drivers build config 00:04:13.196 net/idpf: not in enabled drivers build config 00:04:13.196 net/igc: not in enabled drivers build config 00:04:13.196 net/ionic: not in enabled drivers build config 00:04:13.196 net/ipn3ke: not in enabled drivers build config 00:04:13.196 net/ixgbe: not in enabled drivers build config 00:04:13.196 
net/mana: not in enabled drivers build config 00:04:13.196 net/memif: not in enabled drivers build config 00:04:13.196 net/mlx4: not in enabled drivers build config 00:04:13.196 net/mlx5: not in enabled drivers build config 00:04:13.196 net/mvneta: not in enabled drivers build config 00:04:13.196 net/mvpp2: not in enabled drivers build config 00:04:13.196 net/netvsc: not in enabled drivers build config 00:04:13.196 net/nfb: not in enabled drivers build config 00:04:13.196 net/nfp: not in enabled drivers build config 00:04:13.196 net/ngbe: not in enabled drivers build config 00:04:13.196 net/null: not in enabled drivers build config 00:04:13.196 net/octeontx: not in enabled drivers build config 00:04:13.196 net/octeon_ep: not in enabled drivers build config 00:04:13.196 net/pcap: not in enabled drivers build config 00:04:13.196 net/pfe: not in enabled drivers build config 00:04:13.196 net/qede: not in enabled drivers build config 00:04:13.196 net/ring: not in enabled drivers build config 00:04:13.196 net/sfc: not in enabled drivers build config 00:04:13.196 net/softnic: not in enabled drivers build config 00:04:13.196 net/tap: not in enabled drivers build config 00:04:13.196 net/thunderx: not in enabled drivers build config 00:04:13.196 net/txgbe: not in enabled drivers build config 00:04:13.196 net/vdev_netvsc: not in enabled drivers build config 00:04:13.196 net/vhost: not in enabled drivers build config 00:04:13.196 net/virtio: not in enabled drivers build config 00:04:13.196 net/vmxnet3: not in enabled drivers build config 00:04:13.196 raw/cnxk_bphy: not in enabled drivers build config 00:04:13.196 raw/cnxk_gpio: not in enabled drivers build config 00:04:13.196 raw/dpaa2_cmdif: not in enabled drivers build config 00:04:13.196 raw/ifpga: not in enabled drivers build config 00:04:13.196 raw/ntb: not in enabled drivers build config 00:04:13.196 raw/skeleton: not in enabled drivers build config 00:04:13.196 crypto/armv8: not in enabled drivers build config 00:04:13.196 crypto/bcmfs: not in enabled drivers build config 00:04:13.196 crypto/caam_jr: not in enabled drivers build config 00:04:13.196 crypto/ccp: not in enabled drivers build config 00:04:13.196 crypto/cnxk: not in enabled drivers build config 00:04:13.196 crypto/dpaa_sec: not in enabled drivers build config 00:04:13.196 crypto/dpaa2_sec: not in enabled drivers build config 00:04:13.196 crypto/mvsam: not in enabled drivers build config 00:04:13.196 crypto/nitrox: not in enabled drivers build config 00:04:13.196 crypto/null: not in enabled drivers build config 00:04:13.196 crypto/octeontx: not in enabled drivers build config 00:04:13.196 crypto/openssl: not in enabled drivers build config 00:04:13.196 crypto/scheduler: not in enabled drivers build config 00:04:13.196 crypto/uadk: not in enabled drivers build config 00:04:13.196 crypto/virtio: not in enabled drivers build config 00:04:13.196 compress/octeontx: not in enabled drivers build config 00:04:13.196 compress/zlib: not in enabled drivers build config 00:04:13.196 regex/mlx5: not in enabled drivers build config 00:04:13.196 regex/cn9k: not in enabled drivers build config 00:04:13.196 ml/cnxk: not in enabled drivers build config 00:04:13.196 vdpa/ifc: not in enabled drivers build config 00:04:13.196 vdpa/mlx5: not in enabled drivers build config 00:04:13.196 vdpa/nfp: not in enabled drivers build config 00:04:13.196 vdpa/sfc: not in enabled drivers build config 00:04:13.196 event/cnxk: not in enabled drivers build config 00:04:13.196 event/dlb2: not in enabled drivers build 
config 00:04:13.196 event/dpaa: not in enabled drivers build config 00:04:13.196 event/dpaa2: not in enabled drivers build config 00:04:13.196 event/dsw: not in enabled drivers build config 00:04:13.196 event/opdl: not in enabled drivers build config 00:04:13.196 event/skeleton: not in enabled drivers build config 00:04:13.196 event/sw: not in enabled drivers build config 00:04:13.196 event/octeontx: not in enabled drivers build config 00:04:13.196 baseband/acc: not in enabled drivers build config 00:04:13.196 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:04:13.196 baseband/fpga_lte_fec: not in enabled drivers build config 00:04:13.196 baseband/la12xx: not in enabled drivers build config 00:04:13.196 baseband/null: not in enabled drivers build config 00:04:13.196 baseband/turbo_sw: not in enabled drivers build config 00:04:13.196 gpu/cuda: not in enabled drivers build config 00:04:13.196 00:04:13.196 00:04:13.196 Build targets in project: 247 00:04:13.196 00:04:13.196 DPDK 23.11.0 00:04:13.196 00:04:13.196 User defined options 00:04:13.196 libdir : lib 00:04:13.196 prefix : /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:04:13.196 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:04:13.196 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:04:13.196 enable_docs : false 00:04:13.196 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,crypto,crypto/ipsec_mb,crypto/qat,compress/qat,common/qat,bus/auxiliary,common/mlx5,common/mlx5/linux,crypto/mlx5,compress,compress/isal,compress/qat,common/qat,compress/mlx5, 00:04:13.196 enable_kmods : false 00:04:13.196 machine : native 00:04:13.196 tests : false 00:04:13.196 00:04:13.196 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:04:13.196 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
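Note: the WARNING above refers to Meson's legacy bare invocation form. For reference, a minimal sketch of the non-deprecated `meson setup` call that corresponds to the "User defined options" block logged above (paths are this run's workspace paths; the autotest wrapper script itself is not reproduced here, and the enable_drivers list is the logged one with the trailing comma and duplicate entries dropped for readability):

    # Sketch only: "meson setup <builddir>" is the unambiguous form the WARNING asks for,
    # instead of the deprecated bare "meson <builddir> [options]".
    # The option values below mirror the "User defined options" summary logged above.
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build \
        --libdir=lib \
        -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' \
        -Dc_link_args='-L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' \
        -Denable_docs=false \
        -Denable_kmods=false \
        -Dtests=false \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,bus/auxiliary,mempool/ring,net/i40e,common/qat,crypto/ipsec_mb,crypto/qat,crypto/mlx5,common/mlx5,compress/qat,compress/isal,compress/mlx5

Once configured, the compile step is simply the ninja invocation logged on the following lines (ninja -C build-tmp -j112 in this run; the job count would normally be matched to the local core count).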
00:04:13.196 06:20:26 build_native_dpdk -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp -j112 00:04:13.196 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp' 00:04:13.463 [1/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:04:13.463 [2/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:04:13.463 [3/809] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:04:13.463 [4/809] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:04:13.463 [5/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:04:13.463 [6/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:04:13.463 [7/809] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:04:13.463 [8/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:04:13.463 [9/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:04:13.724 [10/809] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:04:13.724 [11/809] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:04:13.724 [12/809] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:04:13.725 [13/809] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:04:13.725 [14/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:04:13.725 [15/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:04:13.725 [16/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:04:13.725 [17/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:04:13.725 [18/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:04:13.725 [19/809] Linking static target lib/librte_kvargs.a 00:04:13.725 [20/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:04:13.725 [21/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:04:13.725 [22/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:04:13.725 [23/809] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:04:13.725 [24/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:04:13.725 [25/809] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:04:13.725 [26/809] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:04:13.725 [27/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:04:13.725 [28/809] Linking static target lib/librte_pci.a 00:04:13.725 [29/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:04:13.725 [30/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:04:13.725 [31/809] Compiling C object lib/librte_log.a.p/log_log.c.o 00:04:13.725 [32/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:04:13.725 [33/809] Linking static target lib/librte_log.a 00:04:13.725 [34/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:04:13.725 [35/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:04:13.983 [36/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:04:14.249 [37/809] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:04:14.249 [38/809] Generating lib/pci.sym_chk with a custom command (wrapped 
by meson to capture output) 00:04:14.249 [39/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:04:14.249 [40/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:04:14.249 [41/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:04:14.249 [42/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:04:14.249 [43/809] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:04:14.249 [44/809] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:04:14.249 [45/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:04:14.249 [46/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:04:14.249 [47/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:04:14.249 [48/809] Linking static target lib/librte_ring.a 00:04:14.249 [49/809] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:04:14.249 [50/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:04:14.249 [51/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:04:14.249 [52/809] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:04:14.249 [53/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:04:14.249 [54/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:04:14.249 [55/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:04:14.249 [56/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:04:14.249 [57/809] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:04:14.249 [58/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:04:14.249 [59/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:04:14.249 [60/809] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:04:14.249 [61/809] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:04:14.250 [62/809] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:04:14.250 [63/809] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:04:14.250 [64/809] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:04:14.250 [65/809] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:04:14.250 [66/809] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:04:14.250 [67/809] Linking static target lib/librte_meter.a 00:04:14.250 [68/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:04:14.250 [69/809] Linking static target lib/librte_cmdline.a 00:04:14.250 [70/809] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:04:14.510 [71/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:04:14.510 [72/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:04:14.510 [73/809] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:04:14.510 [74/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:04:14.510 [75/809] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:04:14.510 [76/809] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:04:14.510 [77/809] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:04:14.510 [78/809] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:04:14.510 [79/809] Compiling C 
object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:04:14.510 [80/809] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:04:14.510 [81/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:04:14.510 [82/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:04:14.510 [83/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:04:14.510 [84/809] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:04:14.510 [85/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:04:14.510 [86/809] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:04:14.510 [87/809] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:04:14.510 [88/809] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:04:14.510 [89/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:04:14.510 [90/809] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:04:14.510 [91/809] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:04:14.510 [92/809] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:04:14.510 [93/809] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:04:14.510 [94/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:04:14.510 [95/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:04:14.510 [96/809] Linking static target lib/net/libnet_crc_avx512_lib.a 00:04:14.510 [97/809] Linking static target lib/librte_metrics.a 00:04:14.510 [98/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:04:14.510 [99/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:04:14.510 [100/809] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:04:14.510 [101/809] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:04:14.510 [102/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:04:14.510 [103/809] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:04:14.510 [104/809] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:04:14.510 [105/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:04:14.510 [106/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:04:14.510 [107/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:04:14.510 [108/809] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:04:14.510 [109/809] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:04:14.510 [110/809] Linking static target lib/librte_net.a 00:04:14.510 [111/809] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:04:14.510 [112/809] Linking static target lib/librte_bitratestats.a 00:04:14.510 [113/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:04:14.776 [114/809] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:04:14.776 [115/809] Linking static target lib/librte_cfgfile.a 00:04:14.776 [116/809] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:04:14.776 [117/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:04:14.776 [118/809] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:04:14.776 [119/809] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:04:14.776 [120/809] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:04:14.776 [121/809] 
Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:04:14.776 [122/809] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:04:14.776 [123/809] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:04:14.776 [124/809] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:04:14.776 [125/809] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:04:14.776 [126/809] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:04:14.776 [127/809] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:04:14.776 [128/809] Linking target lib/librte_log.so.24.0 00:04:14.776 [129/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:04:14.776 [130/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:04:14.776 [131/809] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:04:14.776 [132/809] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:04:14.776 [133/809] Linking static target lib/librte_bbdev.a 00:04:14.776 [134/809] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:04:14.776 [135/809] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:04:14.776 [136/809] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:04:14.776 [137/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:04:15.038 [138/809] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:04:15.038 [139/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:04:15.038 [140/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:04:15.038 [141/809] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:04:15.038 [142/809] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:04:15.038 [143/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:04:15.038 [144/809] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:04:15.038 [145/809] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:04:15.038 [146/809] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:04:15.038 [147/809] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.038 [148/809] Linking static target lib/librte_timer.a 00:04:15.038 [149/809] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:04:15.038 [150/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:04:15.038 [151/809] Linking target lib/librte_kvargs.so.24.0 00:04:15.038 [152/809] Linking static target lib/librte_mempool.a 00:04:15.038 [153/809] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:04:15.038 [154/809] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:04:15.038 [155/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:04:15.038 [156/809] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:04:15.038 [157/809] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.038 [158/809] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:04:15.299 [159/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:04:15.299 [160/809] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 
00:04:15.299 [161/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:04:15.299 [162/809] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:04:15.299 [163/809] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:04:15.299 [164/809] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:04:15.299 [165/809] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:04:15.299 [166/809] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:04:15.299 [167/809] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:04:15.299 [168/809] Linking static target lib/librte_jobstats.a 00:04:15.299 [169/809] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:04:15.299 [170/809] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:04:15.299 [171/809] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.299 [172/809] Linking static target lib/librte_compressdev.a 00:04:15.299 [173/809] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:04:15.299 [174/809] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.299 [175/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:04:15.299 [176/809] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:04:15.299 [177/809] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:04:15.299 [178/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:04:15.299 [179/809] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:04:15.299 [180/809] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:04:15.299 [181/809] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:04:15.299 [182/809] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:04:15.299 [183/809] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:04:15.299 [184/809] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:04:15.299 [185/809] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:04:15.299 [186/809] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:04:15.299 [187/809] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:04:15.299 [188/809] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:04:15.299 [189/809] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:04:15.299 [190/809] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:04:15.299 [191/809] Linking static target lib/librte_distributor.a 00:04:15.299 [192/809] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:04:15.299 [193/809] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:04:15.299 [194/809] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:04:15.299 [195/809] Linking static target lib/librte_dispatcher.a 00:04:15.299 [196/809] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:04:15.559 [197/809] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:04:15.559 [198/809] Linking static target lib/librte_telemetry.a 00:04:15.559 [199/809] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:04:15.559 [200/809] Linking static target lib/member/libsketch_avx512_tmp.a 00:04:15.559 [201/809] Compiling C object 
lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:04:15.560 [202/809] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:04:15.560 [203/809] Linking static target lib/librte_latencystats.a 00:04:15.560 [204/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:04:15.560 [205/809] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:04:15.560 [206/809] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:04:15.560 [207/809] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:04:15.560 [208/809] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:04:15.560 [209/809] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:04:15.560 [210/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:04:15.560 [211/809] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:04:15.560 [212/809] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:04:15.560 [213/809] Linking static target lib/librte_rcu.a 00:04:15.560 [214/809] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:04:15.560 [215/809] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:04:15.560 [216/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:04:15.560 [217/809] Linking static target lib/librte_gpudev.a 00:04:15.560 [218/809] Linking static target lib/librte_eal.a 00:04:15.560 [219/809] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:04:15.560 [220/809] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:04:15.560 [221/809] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:04:15.560 [222/809] Linking static target lib/librte_gro.a 00:04:15.560 [223/809] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:04:15.560 [224/809] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:04:15.560 [225/809] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:04:15.560 [226/809] Linking static target lib/librte_stack.a 00:04:15.560 [227/809] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:04:15.560 [228/809] Linking static target lib/librte_dmadev.a 00:04:15.560 [229/809] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:04:15.560 [230/809] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:04:15.560 [231/809] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.560 [232/809] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:04:15.824 [233/809] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:04:15.824 [234/809] Linking static target lib/librte_gso.a 00:04:15.824 [235/809] Linking static target lib/librte_regexdev.a 00:04:15.824 [236/809] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:04:15.824 [237/809] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:04:15.824 [238/809] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:04:15.824 [239/809] Linking static target lib/librte_pcapng.a 00:04:15.824 [240/809] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:04:15.824 [241/809] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:04:15.824 [242/809] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:04:15.824 [243/809] Linking static target lib/librte_mldev.a 00:04:15.824 [244/809] Linking 
static target lib/librte_rawdev.a 00:04:15.824 [245/809] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.824 [246/809] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:04:15.824 [247/809] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:04:15.824 [248/809] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:04:15.824 [249/809] Linking static target lib/librte_ip_frag.a 00:04:15.824 [250/809] Linking static target lib/librte_mbuf.a 00:04:15.824 [251/809] Linking static target lib/librte_power.a 00:04:15.824 [252/809] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:04:15.824 [253/809] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:04:15.824 [254/809] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:04:15.824 [255/809] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:04:15.824 [256/809] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:04:15.824 [257/809] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:04:16.085 [258/809] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:04:16.085 [259/809] Linking static target lib/librte_reorder.a 00:04:16.085 [260/809] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:04:16.085 [261/809] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.085 [262/809] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:04:16.085 [263/809] Linking static target lib/librte_bpf.a 00:04:16.085 [264/809] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:04:16.085 [265/809] Linking static target lib/librte_security.a 00:04:16.085 [266/809] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:04:16.085 [267/809] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:04:16.085 [268/809] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:04:16.085 [269/809] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:04:16.085 [270/809] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.085 [271/809] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:04:16.085 [272/809] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:04:16.085 [273/809] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.085 [274/809] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.085 [275/809] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.085 [276/809] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:04:16.085 [277/809] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:04:16.085 [278/809] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.085 [279/809] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:04:16.085 [280/809] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.351 [281/809] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:04:16.351 [282/809] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:04:16.351 [283/809] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:04:16.351 [284/809] Generating lib/mempool.sym_chk with a custom 
command (wrapped by meson to capture output) 00:04:16.351 [285/809] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:04:16.351 [286/809] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.351 [287/809] Compiling C object lib/librte_node.a.p/node_null.c.o 00:04:16.351 [288/809] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:04:16.351 [289/809] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:04:16.351 [290/809] Linking static target lib/librte_lpm.a 00:04:16.351 [291/809] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:04:16.351 [292/809] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:04:16.351 [293/809] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:04:16.351 [294/809] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:04:16.351 [295/809] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.351 [296/809] Linking static target lib/librte_rib.a 00:04:16.351 [297/809] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:04:16.351 [298/809] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:04:16.351 [299/809] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.351 [300/809] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:04:16.351 [301/809] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.351 [302/809] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:04:16.351 [303/809] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:04:16.351 [304/809] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:04:16.351 [305/809] Linking target lib/librte_telemetry.so.24.0 00:04:16.351 [306/809] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:04:16.610 [307/809] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:04:16.610 [308/809] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:04:16.610 [309/809] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:04:16.610 [310/809] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:04:16.610 [311/809] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:04:16.610 [312/809] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:04:16.610 [313/809] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.610 [314/809] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.610 [315/809] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.610 [316/809] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:04:16.610 [317/809] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:04:16.610 [318/809] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:04:16.610 [319/809] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:04:16.610 [320/809] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:04:16.610 [321/809] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.610 [322/809] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 
00:04:16.610 [323/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:04:16.610 [324/809] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:04:16.610 [325/809] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:04:16.610 [326/809] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:04:16.610 [327/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:04:16.610 [328/809] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:04:16.610 [329/809] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:04:16.610 [330/809] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:04:16.610 [331/809] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:04:16.873 [332/809] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:04:16.873 [333/809] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:04:16.873 [334/809] Linking static target lib/librte_efd.a 00:04:16.873 [335/809] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:04:16.873 [336/809] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:04:16.873 [337/809] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:04:16.873 [338/809] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:04:16.873 [339/809] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:04:16.873 [340/809] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:04:16.873 [341/809] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:04:16.873 [342/809] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.873 [343/809] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:04:16.873 [344/809] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:04:16.874 [345/809] Compiling C object lib/librte_node.a.p/node_log.c.o 00:04:16.874 [346/809] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:04:16.874 [347/809] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:04:16.874 [348/809] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:04:16.874 [349/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:04:16.874 [350/809] Linking static target lib/librte_pdump.a 00:04:16.874 [351/809] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:04:17.139 [352/809] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:04:17.139 [353/809] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:04:17.139 [354/809] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:04:17.139 [355/809] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:04:17.139 [356/809] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:04:17.139 [357/809] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.139 [358/809] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:04:17.139 [359/809] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:04:17.139 [360/809] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:04:17.139 [361/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:04:17.139 [362/809] Compiling C object 
lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:04:17.139 [363/809] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:04:17.139 [364/809] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:04:17.139 [365/809] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.139 [366/809] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:04:17.139 [367/809] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:04:17.139 [368/809] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:04:17.139 [369/809] Linking static target lib/librte_fib.a 00:04:17.139 [370/809] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:04:17.139 [371/809] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:04:17.139 [372/809] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.139 [373/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:04:17.139 [374/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:04:17.401 [375/809] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:04:17.401 [376/809] Linking static target lib/librte_cryptodev.a 00:04:17.401 [377/809] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:04:17.401 [378/809] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:04:17.401 [379/809] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:04:17.401 [380/809] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:04:17.401 [381/809] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:04:17.401 [382/809] Linking static target drivers/libtmp_rte_bus_vdev.a 00:04:17.401 [383/809] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:04:17.401 [384/809] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.401 [385/809] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.401 [386/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:04:17.401 [387/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:04:17.401 [388/809] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:04:17.401 [389/809] Linking static target drivers/librte_bus_auxiliary.a 00:04:17.401 [390/809] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.401 [391/809] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:04:17.401 [392/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:04:17.401 [393/809] Compiling C object drivers/librte_bus_auxiliary.so.24.0.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:04:17.401 [394/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:04:17.401 [395/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:04:17.401 [396/809] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:04:17.401 [397/809] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:04:17.401 [398/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:04:17.401 [399/809] Generating lib/pdump.sym_chk with a custom command 
(wrapped by meson to capture output) 00:04:17.401 [400/809] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:04:17.401 [401/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:04:17.401 [402/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:04:17.401 [403/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:04:17.401 [404/809] Linking static target lib/librte_graph.a 00:04:17.401 [405/809] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.401 [406/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:04:17.401 [407/809] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:04:17.401 [408/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:04:17.664 [409/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:04:17.664 [410/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:04:17.664 [411/809] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:04:17.664 [412/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:04:17.664 [413/809] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:04:17.664 [414/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:04:17.664 [415/809] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:04:17.664 [416/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:04:17.664 [417/809] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:04:17.664 [418/809] Linking static target lib/librte_table.a 00:04:17.664 [419/809] Linking static target drivers/libtmp_rte_bus_pci.a 00:04:17.664 [420/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:04:17.664 [421/809] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:04:17.664 [422/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:04:17.664 [423/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:04:17.664 [424/809] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:04:17.664 [425/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:04:17.664 [426/809] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:04:17.664 [427/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:04:17.664 [428/809] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:04:17.664 [429/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:04:17.664 [430/809] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:04:17.664 [431/809] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:04:17.924 [432/809] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:04:17.924 [433/809] Linking static target drivers/librte_bus_vdev.a 00:04:17.924 [434/809] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:04:17.924 [435/809] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:04:17.924 [436/809] Linking static 
target lib/librte_sched.a 00:04:17.924 [437/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:04:17.924 [438/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:04:17.924 [439/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:04:17.924 [440/809] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.924 [441/809] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:04:17.924 [442/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:04:17.924 [443/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:04:17.924 [444/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:04:17.924 [445/809] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:04:17.924 [446/809] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:04:17.924 [447/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:04:18.186 [448/809] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:04:18.186 [449/809] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:04:18.186 [450/809] Linking static target drivers/librte_bus_pci.a 00:04:18.186 [451/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:04:18.186 [452/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:04:18.186 [453/809] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:04:18.186 [454/809] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:04:18.186 [455/809] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:04:18.186 [456/809] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:04:18.186 [457/809] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:04:18.186 [458/809] Linking static target lib/librte_ipsec.a 00:04:18.186 [459/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:04:18.186 [460/809] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:18.186 [461/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:04:18.186 [462/809] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:04:18.186 [463/809] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:04:18.186 [464/809] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:04:18.477 [465/809] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:04:18.477 [466/809] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:04:18.477 [467/809] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:04:18.477 [468/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:04:18.477 [469/809] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:04:18.477 [470/809] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:04:18.477 [471/809] Linking static target lib/librte_pdcp.a 00:04:18.477 [472/809] Linking static target lib/librte_member.a 00:04:18.477 [473/809] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:04:18.477 [474/809] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:04:18.477 
[475/809] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:04:18.477 [476/809] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:04:18.477 [477/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:04:18.477 [478/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:04:18.477 [479/809] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:04:18.477 [480/809] Linking static target lib/librte_node.a 00:04:18.477 [481/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:04:18.477 [482/809] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:18.477 [483/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:04:18.477 [484/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:04:18.477 [485/809] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:04:18.477 [486/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:04:18.477 [487/809] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:04:18.477 [488/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:04:18.477 [489/809] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:04:18.752 [490/809] Linking static target lib/librte_hash.a 00:04:18.752 [491/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:04:18.752 [492/809] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:04:18.752 [493/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:04:18.752 [494/809] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:04:18.752 [495/809] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:04:18.752 [496/809] Linking static target lib/acl/libavx2_tmp.a 00:04:18.752 [497/809] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:04:18.752 [498/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:04:18.752 [499/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:04:18.752 [500/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:04:18.752 [501/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:04:18.752 [502/809] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:04:18.752 [503/809] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:04:18.752 [504/809] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:04:18.752 [505/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:04:18.752 [506/809] Linking static target drivers/libtmp_rte_mempool_ring.a 00:04:18.752 [507/809] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:04:18.752 [508/809] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:04:18.752 [509/809] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:04:18.752 [510/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:04:18.752 [511/809] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:04:18.752 [512/809] Compiling C object 
app/dpdk-dumpcap.p/dumpcap_main.c.o 00:04:18.752 [513/809] Linking static target lib/librte_port.a 00:04:19.013 [514/809] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:04:19.013 [515/809] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:04:19.013 [516/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:04:19.013 [517/809] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.013 [518/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:04:19.013 [519/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:04:19.013 [520/809] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.013 [521/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:04:19.013 [522/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:04:19.013 [523/809] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:04:19.013 [524/809] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.013 [525/809] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:04:19.013 [526/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:04:19.013 [527/809] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.013 [528/809] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.013 [529/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:04:19.013 [530/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:04:19.013 [531/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:04:19.013 [532/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:04:19.013 [533/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:04:19.013 [534/809] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:04:19.013 [535/809] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:04:19.273 [536/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:04:19.273 [537/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:04:19.273 [538/809] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:04:19.273 [539/809] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:04:19.273 [540/809] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:04:19.273 [541/809] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:04:19.273 [542/809] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:04:19.273 [543/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:04:19.273 [544/809] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:04:19.273 [545/809] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:04:19.273 [546/809] Linking static target drivers/librte_mempool_ring.a 00:04:19.273 [547/809] Linking static target 
lib/librte_eventdev.a 00:04:19.273 [548/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:04:19.273 [549/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:04:19.273 [550/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:04:19.273 [551/809] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:04:19.273 [552/809] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:04:19.273 [553/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:04:19.273 [554/809] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:04:19.273 [555/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:04:19.273 [556/809] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:04:19.273 [557/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:04:19.273 [558/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:04:19.273 [559/809] Linking static target lib/librte_acl.a 00:04:19.273 [560/809] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:04:19.273 [561/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:04:19.273 [562/809] Compiling C object drivers/librte_compress_mlx5.so.24.0.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:04:19.273 [563/809] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:04:19.273 [564/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:04:19.273 [565/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:04:19.273 [566/809] Linking static target drivers/librte_compress_mlx5.a 00:04:19.273 [567/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:04:19.273 [568/809] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:04:19.273 [569/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:04:19.532 [570/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:04:19.532 [571/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:04:19.532 [572/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:04:19.532 [573/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:04:19.532 [574/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:04:19.532 [575/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:04:19.532 [576/809] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:04:19.532 [577/809] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:04:19.532 [578/809] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.532 [579/809] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:04:19.532 [580/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:04:19.532 [581/809] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:04:19.532 [582/809] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:04:19.533 [583/809] Compiling C object 
drivers/librte_crypto_mlx5.so.24.0.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:04:19.533 [584/809] Linking static target drivers/librte_crypto_mlx5.a 00:04:19.533 [585/809] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:04:19.533 [586/809] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:04:19.533 [587/809] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:04:19.533 [588/809] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:04:19.533 [589/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:04:19.533 [590/809] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.533 [591/809] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:04:19.533 [592/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:04:19.533 [593/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:04:19.533 [594/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:04:19.533 [595/809] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:04:19.533 [596/809] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:04:19.533 [597/809] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.533 [598/809] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:04:19.792 [599/809] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:04:19.792 [600/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:04:19.792 [601/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:04:19.792 [602/809] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:04:19.792 [603/809] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:04:19.792 [604/809] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:04:19.792 [605/809] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:04:19.792 [606/809] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:04:19.792 [607/809] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:04:19.792 [608/809] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:04:19.792 [609/809] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:04:19.792 [610/809] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:04:19.792 [611/809] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:04:19.792 [612/809] Linking static target drivers/libtmp_rte_compress_isal.a 00:04:19.792 [613/809] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:04:19.792 [614/809] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:04:19.792 [615/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:04:19.792 [616/809] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:04:19.792 [617/809] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:04:20.050 [618/809] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:04:20.050 [619/809] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:04:20.050 [620/809] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:04:20.050 [621/809] 
Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:04:20.050 [622/809] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:04:20.050 [623/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:04:20.050 [624/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:04:20.050 [625/809] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:04:20.050 [626/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:04:20.050 [627/809] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:04:20.050 [628/809] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:04:20.050 [629/809] Linking static target drivers/net/i40e/base/libi40e_base.a 00:04:20.050 [630/809] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:04:20.050 [631/809] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:04:20.050 [632/809] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:04:20.050 [633/809] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:04:20.050 [634/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:04:20.050 [635/809] Compiling C object drivers/librte_compress_isal.so.24.0.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:04:20.050 [636/809] Linking static target drivers/librte_compress_isal.a 00:04:20.050 [637/809] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.0.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:04:20.050 [638/809] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:04:20.050 [639/809] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:04:20.308 [640/809] Linking static target drivers/librte_crypto_ipsec_mb.a 00:04:20.308 [641/809] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:04:20.308 [642/809] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:04:20.308 [643/809] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:04:20.308 [644/809] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:04:20.308 [645/809] Linking static target drivers/libtmp_rte_common_mlx5.a 00:04:20.308 [646/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:04:20.308 [647/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:04:20.567 [648/809] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:04:20.567 [649/809] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:04:20.567 [650/809] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:04:20.567 [651/809] Compiling C object drivers/librte_common_mlx5.so.24.0.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:04:20.567 [652/809] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:04:20.567 [653/809] Linking static target lib/librte_ethdev.a 00:04:20.567 [654/809] Linking static target drivers/librte_common_mlx5.a 00:04:20.567 [655/809] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:04:20.825 [656/809] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:04:20.825 [657/809] Compiling C object 
app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:04:20.825 [658/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:04:20.825 [659/809] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:04:20.825 [660/809] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:04:21.083 [661/809] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:04:21.341 [662/809] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:04:21.600 [663/809] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:04:21.600 [664/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:04:21.859 [665/809] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:04:22.798 [666/809] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:04:22.798 [667/809] Linking static target drivers/libtmp_rte_net_i40e.a 00:04:22.798 [668/809] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:22.798 [669/809] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:04:23.056 [670/809] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:04:23.056 [671/809] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:04:23.056 [672/809] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:04:23.056 [673/809] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:04:23.056 [674/809] Linking static target drivers/librte_net_i40e.a 00:04:23.995 [675/809] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:04:23.995 [676/809] Linking static target drivers/libtmp_rte_common_qat.a 00:04:23.995 [677/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:04:24.255 [678/809] Generating drivers/rte_common_qat.pmd.c with a custom command 00:04:24.255 [679/809] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:04:24.255 [680/809] Compiling C object drivers/librte_common_qat.so.24.0.p/meson-generated_.._rte_common_qat.pmd.c.o 00:04:24.255 [681/809] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:04:24.255 [682/809] Linking static target drivers/librte_common_qat.a 00:04:25.634 [683/809] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:04:26.202 [684/809] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:04:28.739 [685/809] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:04:28.739 [686/809] Linking target lib/librte_eal.so.24.0 00:04:28.739 [687/809] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:04:28.739 [688/809] Linking target lib/librte_jobstats.so.24.0 00:04:28.739 [689/809] Linking target drivers/librte_bus_auxiliary.so.24.0 00:04:28.739 [690/809] Linking target lib/librte_ring.so.24.0 00:04:28.739 [691/809] Linking target lib/librte_pci.so.24.0 00:04:28.739 [692/809] Linking target lib/librte_meter.so.24.0 00:04:28.739 [693/809] Linking target lib/librte_timer.so.24.0 00:04:28.739 [694/809] Linking target lib/librte_cfgfile.so.24.0 00:04:28.739 [695/809] Linking target lib/librte_acl.so.24.0 00:04:28.739 [696/809] Linking target lib/librte_dmadev.so.24.0 00:04:28.739 [697/809] Linking target lib/librte_rawdev.so.24.0 00:04:28.739 [698/809] 
Linking target lib/librte_stack.so.24.0 00:04:28.739 [699/809] Linking target drivers/librte_bus_vdev.so.24.0 00:04:28.739 [700/809] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:04:28.739 [701/809] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:04:28.739 [702/809] Generating symbol file drivers/librte_bus_auxiliary.so.24.0.p/librte_bus_auxiliary.so.24.0.symbols 00:04:28.739 [703/809] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:04:28.739 [704/809] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:04:28.739 [705/809] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:04:28.739 [706/809] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:04:28.739 [707/809] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:04:28.739 [708/809] Linking target drivers/librte_bus_pci.so.24.0 00:04:28.999 [709/809] Linking target lib/librte_rcu.so.24.0 00:04:28.999 [710/809] Linking target lib/librte_mempool.so.24.0 00:04:28.999 [711/809] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:04:28.999 [712/809] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:04:28.999 [713/809] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:04:28.999 [714/809] Linking target drivers/librte_mempool_ring.so.24.0 00:04:28.999 [715/809] Linking target lib/librte_rib.so.24.0 00:04:28.999 [716/809] Linking target lib/librte_mbuf.so.24.0 00:04:29.258 [717/809] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:04:29.258 [718/809] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:04:29.258 [719/809] Linking target lib/librte_cryptodev.so.24.0 00:04:29.258 [720/809] Linking target lib/librte_gpudev.so.24.0 00:04:29.258 [721/809] Linking target lib/librte_bbdev.so.24.0 00:04:29.258 [722/809] Linking target lib/librte_reorder.so.24.0 00:04:29.258 [723/809] Linking target lib/librte_compressdev.so.24.0 00:04:29.258 [724/809] Linking target lib/librte_distributor.so.24.0 00:04:29.258 [725/809] Linking target lib/librte_net.so.24.0 00:04:29.258 [726/809] Linking target lib/librte_mldev.so.24.0 00:04:29.258 [727/809] Linking target lib/librte_fib.so.24.0 00:04:29.258 [728/809] Linking target lib/librte_sched.so.24.0 00:04:29.258 [729/809] Linking target lib/librte_regexdev.so.24.0 00:04:29.517 [730/809] Generating symbol file lib/librte_compressdev.so.24.0.p/librte_compressdev.so.24.0.symbols 00:04:29.517 [731/809] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:04:29.517 [732/809] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:04:29.517 [733/809] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:04:29.517 [734/809] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:04:29.517 [735/809] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:29.517 [736/809] Linking target drivers/librte_compress_isal.so.24.0 00:04:29.517 [737/809] Linking target lib/librte_security.so.24.0 00:04:29.517 [738/809] Linking target lib/librte_hash.so.24.0 00:04:29.517 [739/809] Linking target lib/librte_cmdline.so.24.0 00:04:29.517 [740/809] Linking target lib/librte_ethdev.so.24.0 00:04:29.806 
[741/809] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:04:29.806 [742/809] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:04:29.806 [743/809] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:04:29.806 [744/809] Linking target lib/librte_lpm.so.24.0 00:04:29.806 [745/809] Linking target lib/librte_efd.so.24.0 00:04:29.806 [746/809] Linking target lib/librte_pdcp.so.24.0 00:04:29.806 [747/809] Linking target lib/librte_member.so.24.0 00:04:29.806 [748/809] Linking target drivers/librte_crypto_ipsec_mb.so.24.0 00:04:29.806 [749/809] Linking target drivers/librte_common_mlx5.so.24.0 00:04:29.806 [750/809] Linking target lib/librte_metrics.so.24.0 00:04:29.806 [751/809] Linking target lib/librte_pcapng.so.24.0 00:04:29.806 [752/809] Linking target lib/librte_gso.so.24.0 00:04:29.806 [753/809] Linking target lib/librte_gro.so.24.0 00:04:29.806 [754/809] Linking target lib/librte_ip_frag.so.24.0 00:04:29.806 [755/809] Linking target lib/librte_bpf.so.24.0 00:04:29.806 [756/809] Linking target lib/librte_power.so.24.0 00:04:29.806 [757/809] Linking target lib/librte_ipsec.so.24.0 00:04:29.806 [758/809] Linking target lib/librte_eventdev.so.24.0 00:04:29.806 [759/809] Linking target drivers/librte_common_qat.so.24.0 00:04:29.806 [760/809] Linking target drivers/librte_net_i40e.so.24.0 00:04:29.806 [761/809] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:04:29.806 [762/809] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:04:29.806 [763/809] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:04:30.064 [764/809] Generating symbol file drivers/librte_common_mlx5.so.24.0.p/librte_common_mlx5.so.24.0.symbols 00:04:30.064 [765/809] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:04:30.064 [766/809] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:04:30.064 [767/809] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:04:30.064 [768/809] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:04:30.064 [769/809] Linking target drivers/librte_compress_mlx5.so.24.0 00:04:30.064 [770/809] Linking target drivers/librte_crypto_mlx5.so.24.0 00:04:30.064 [771/809] Linking target lib/librte_bitratestats.so.24.0 00:04:30.064 [772/809] Linking target lib/librte_pdump.so.24.0 00:04:30.064 [773/809] Linking target lib/librte_graph.so.24.0 00:04:30.064 [774/809] Linking target lib/librte_dispatcher.so.24.0 00:04:30.064 [775/809] Linking target lib/librte_port.so.24.0 00:04:30.064 [776/809] Linking target lib/librte_latencystats.so.24.0 00:04:30.064 [777/809] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:04:30.323 [778/809] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:04:30.323 [779/809] Linking target lib/librte_node.so.24.0 00:04:30.323 [780/809] Linking target lib/librte_table.so.24.0 00:04:30.323 [781/809] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:04:32.230 [782/809] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:04:32.230 [783/809] Linking static target lib/librte_pipeline.a 00:04:33.611 [784/809] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:04:33.611 [785/809] Linking static target lib/librte_vhost.a 00:04:34.178 [786/809] 
Linking target app/dpdk-test-cmdline 00:04:34.178 [787/809] Linking target app/dpdk-test-sad 00:04:34.178 [788/809] Linking target app/dpdk-dumpcap 00:04:34.178 [789/809] Linking target app/dpdk-proc-info 00:04:34.178 [790/809] Linking target app/dpdk-pdump 00:04:34.178 [791/809] Linking target app/dpdk-test-dma-perf 00:04:34.178 [792/809] Linking target app/dpdk-test-acl 00:04:34.178 [793/809] Linking target app/dpdk-test-regex 00:04:34.178 [794/809] Linking target app/dpdk-graph 00:04:34.178 [795/809] Linking target app/dpdk-test-fib 00:04:34.178 [796/809] Linking target app/dpdk-test-gpudev 00:04:34.178 [797/809] Linking target app/dpdk-test-security-perf 00:04:34.178 [798/809] Linking target app/dpdk-test-bbdev 00:04:34.178 [799/809] Linking target app/dpdk-test-crypto-perf 00:04:34.178 [800/809] Linking target app/dpdk-test-pipeline 00:04:34.178 [801/809] Linking target app/dpdk-test-mldev 00:04:34.178 [802/809] Linking target app/dpdk-test-compress-perf 00:04:34.178 [803/809] Linking target app/dpdk-test-flow-perf 00:04:34.178 [804/809] Linking target app/dpdk-test-eventdev 00:04:34.436 [805/809] Linking target app/dpdk-testpmd 00:04:35.815 [806/809] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:04:35.815 [807/809] Linking target lib/librte_vhost.so.24.0 00:04:37.721 [808/809] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:04:37.721 [809/809] Linking target lib/librte_pipeline.so.24.0 00:04:37.721 06:20:51 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s 00:04:37.721 06:20:51 build_native_dpdk -- common/autobuild_common.sh@191 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:04:37.721 06:20:51 build_native_dpdk -- common/autobuild_common.sh@204 -- $ ninja -C /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp -j112 install 00:04:37.980 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp' 00:04:37.980 [0/1] Installing files. 
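For reference, the install step traced just above reduces to the two commands below. This is a minimal sketch reconstructed from the log only; the surrounding logic of common/autobuild_common.sh is assumed rather than reproduced, and the -j112 value and build-tmp path are those shown in the trace.

    # Hedged sketch of the traced install step (commands as they appear in the log above).
    uname -s    # the helper script checks the platform first (Linux here, not FreeBSD)
    ninja -C /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp -j112 install

The detailed "Installing ..." listing that follows is the output of that ninja install invocation copying the DPDK example sources into the build's share/dpdk/examples tree.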
00:04:38.245 Installing subdir /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 
00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common 00:04:38.245 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:04:38.245 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 
Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/main.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.246 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.246 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/dma/dmafwd.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 
00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:04:38.247 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:04:38.248 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/link.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/README to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:04:38.249 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/cli.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:04:38.250 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:04:38.250 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:04:38.251 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:04:38.251 Installing lib/librte_log.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_kvargs.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_telemetry.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_eal.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ring.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_rcu.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing 
lib/librte_mempool.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_mbuf.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_net.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_meter.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ethdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_pci.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_cmdline.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_metrics.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_hash.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_timer.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_acl.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_bbdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_bpf.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_compressdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 
Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_distributor.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_dmadev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_efd.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_eventdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_gpudev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_gro.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_gso.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_jobstats.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_latencystats.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_lpm.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_member.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_pcapng.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_power.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_rawdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_rawdev.so.24.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_regexdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_mldev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_rib.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_reorder.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_sched.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_security.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_stack.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_vhost.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ipsec.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_pdcp.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_fib.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_port.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.251 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_pdump.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_table.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_pipeline.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_graph.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_node.a to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_bus_auxiliary.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_bus_auxiliary.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_common_mlx5.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_common_mlx5.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_common_qat.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_common_qat.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_crypto_ipsec_mb.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_crypto_ipsec_mb.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_crypto_mlx5.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_crypto_mlx5.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_compress_isal.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_compress_isal.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing drivers/librte_compress_mlx5.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:38.825 Installing drivers/librte_compress_mlx5.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:04:38.825 Installing app/dpdk-dumpcap to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-graph to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-pdump to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-proc-info to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-acl to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing 
app/dpdk-test-cmdline to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-fib to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-mldev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-testpmd to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-regex to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-sad to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 
Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.825 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.826 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_thash.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.827 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 
00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.828 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.829 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/pkgconfig 00:04:38.829 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/pkgconfig 00:04:38.829 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_log.so.24 00:04:38.829 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_log.so 00:04:38.829 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:04:38.829 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:04:38.829 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:04:38.829 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:04:38.829 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:04:38.829 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eal.so 00:04:38.829 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:04:38.829 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ring.so 00:04:38.829 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:04:38.829 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rcu.so 00:04:38.829 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:04:38.829 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mempool.so 00:04:38.829 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:04:38.829 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:04:38.829 Installing symlink pointing to librte_net.so.24.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_net.so.24 00:04:38.829 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_net.so 00:04:38.829 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:04:38.829 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_meter.so 00:04:38.829 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:04:38.829 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:04:38.829 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:04:38.829 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pci.so 00:04:38.829 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:04:38.829 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:04:38.829 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:04:38.829 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_metrics.so 00:04:38.829 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:04:38.829 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_hash.so 00:04:38.829 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:04:38.829 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_timer.so 00:04:38.829 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:04:38.829 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_acl.so 00:04:38.829 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:04:38.829 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:04:38.829 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:04:38.829 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:04:38.829 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:04:38.830 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bpf.so 00:04:38.830 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:04:38.830 Installing symlink 
pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:04:38.830 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:04:38.830 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:04:38.830 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:04:38.830 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:04:38.830 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:04:38.830 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_distributor.so 00:04:38.830 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:04:38.830 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:04:38.830 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:04:38.830 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_efd.so 00:04:38.830 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:04:38.830 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:04:38.830 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:04:38.830 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:04:38.830 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:04:38.830 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:04:38.830 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:04:38.830 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gro.so 00:04:38.830 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:04:38.830 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gso.so 00:04:38.830 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:04:38.830 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:04:38.830 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:04:38.830 Installing symlink pointing to librte_jobstats.so.24 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:04:38.830 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:04:38.830 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:04:38.830 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:04:38.830 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_lpm.so 00:04:38.830 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_member.so.24 00:04:38.830 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_member.so 00:04:38.830 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:04:38.830 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:04:38.830 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_power.so.24 00:04:38.830 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_power.so 00:04:38.830 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:04:38.830 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:04:38.830 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:04:38.830 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:04:38.830 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:04:38.830 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mldev.so 00:04:38.830 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:04:38.830 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rib.so 00:04:38.830 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:04:38.830 './librte_bus_auxiliary.so' -> 'dpdk/pmds-24.0/librte_bus_auxiliary.so' 00:04:38.830 './librte_bus_auxiliary.so.24' -> 'dpdk/pmds-24.0/librte_bus_auxiliary.so.24' 00:04:38.830 './librte_bus_auxiliary.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_auxiliary.so.24.0' 00:04:38.830 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:04:38.830 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:04:38.830 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:04:38.830 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:04:38.830 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:04:38.830 './librte_bus_vdev.so.24.0' -> 
'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:04:38.830 './librte_common_mlx5.so' -> 'dpdk/pmds-24.0/librte_common_mlx5.so' 00:04:38.830 './librte_common_mlx5.so.24' -> 'dpdk/pmds-24.0/librte_common_mlx5.so.24' 00:04:38.830 './librte_common_mlx5.so.24.0' -> 'dpdk/pmds-24.0/librte_common_mlx5.so.24.0' 00:04:38.830 './librte_common_qat.so' -> 'dpdk/pmds-24.0/librte_common_qat.so' 00:04:38.830 './librte_common_qat.so.24' -> 'dpdk/pmds-24.0/librte_common_qat.so.24' 00:04:38.830 './librte_common_qat.so.24.0' -> 'dpdk/pmds-24.0/librte_common_qat.so.24.0' 00:04:38.830 './librte_compress_isal.so' -> 'dpdk/pmds-24.0/librte_compress_isal.so' 00:04:38.830 './librte_compress_isal.so.24' -> 'dpdk/pmds-24.0/librte_compress_isal.so.24' 00:04:38.830 './librte_compress_isal.so.24.0' -> 'dpdk/pmds-24.0/librte_compress_isal.so.24.0' 00:04:38.830 './librte_compress_mlx5.so' -> 'dpdk/pmds-24.0/librte_compress_mlx5.so' 00:04:38.830 './librte_compress_mlx5.so.24' -> 'dpdk/pmds-24.0/librte_compress_mlx5.so.24' 00:04:38.830 './librte_compress_mlx5.so.24.0' -> 'dpdk/pmds-24.0/librte_compress_mlx5.so.24.0' 00:04:38.830 './librte_crypto_ipsec_mb.so' -> 'dpdk/pmds-24.0/librte_crypto_ipsec_mb.so' 00:04:38.830 './librte_crypto_ipsec_mb.so.24' -> 'dpdk/pmds-24.0/librte_crypto_ipsec_mb.so.24' 00:04:38.830 './librte_crypto_ipsec_mb.so.24.0' -> 'dpdk/pmds-24.0/librte_crypto_ipsec_mb.so.24.0' 00:04:38.830 './librte_crypto_mlx5.so' -> 'dpdk/pmds-24.0/librte_crypto_mlx5.so' 00:04:38.830 './librte_crypto_mlx5.so.24' -> 'dpdk/pmds-24.0/librte_crypto_mlx5.so.24' 00:04:38.830 './librte_crypto_mlx5.so.24.0' -> 'dpdk/pmds-24.0/librte_crypto_mlx5.so.24.0' 00:04:38.830 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:04:38.830 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:04:38.830 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:04:38.830 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:04:38.830 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:04:38.830 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:04:38.830 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_reorder.so 00:04:38.830 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:04:38.830 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_sched.so 00:04:38.830 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_security.so.24 00:04:38.830 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_security.so 00:04:38.830 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:04:38.830 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_stack.so 00:04:38.830 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:04:38.830 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_vhost.so 00:04:38.830 Installing symlink pointing to librte_ipsec.so.24.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:04:38.830 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:04:38.830 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:04:38.830 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:04:38.830 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:04:38.830 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_fib.so 00:04:38.830 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_port.so.24 00:04:38.830 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_port.so 00:04:38.830 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:04:38.830 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pdump.so 00:04:38.830 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_table.so.24 00:04:38.830 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_table.so 00:04:38.831 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:04:38.831 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:04:38.831 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:04:38.831 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_graph.so 00:04:38.831 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_node.so.24 00:04:38.831 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_node.so 00:04:38.831 Installing symlink pointing to librte_bus_auxiliary.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_auxiliary.so.24 00:04:38.831 Installing symlink pointing to librte_bus_auxiliary.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_auxiliary.so 00:04:38.831 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:04:38.831 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:04:38.831 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:04:38.831 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:04:38.831 Installing symlink pointing to 
librte_common_mlx5.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_mlx5.so.24 00:04:38.831 Installing symlink pointing to librte_common_mlx5.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_mlx5.so 00:04:38.831 Installing symlink pointing to librte_common_qat.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_qat.so.24 00:04:38.831 Installing symlink pointing to librte_common_qat.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_qat.so 00:04:38.831 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:04:38.831 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:04:38.831 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:04:38.831 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:04:38.831 Installing symlink pointing to librte_crypto_ipsec_mb.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_ipsec_mb.so.24 00:04:38.831 Installing symlink pointing to librte_crypto_ipsec_mb.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_ipsec_mb.so 00:04:38.831 Installing symlink pointing to librte_crypto_mlx5.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_mlx5.so.24 00:04:38.831 Installing symlink pointing to librte_crypto_mlx5.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_mlx5.so 00:04:38.831 Installing symlink pointing to librte_compress_isal.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_isal.so.24 00:04:38.831 Installing symlink pointing to librte_compress_isal.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_isal.so 00:04:38.831 Installing symlink pointing to librte_compress_mlx5.so.24.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_mlx5.so.24 00:04:38.831 Installing symlink pointing to librte_compress_mlx5.so.24 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_mlx5.so 00:04:38.831 Running custom install script '/bin/sh /var/jenkins/workspace/crypto-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:04:38.831 06:20:52 build_native_dpdk -- common/autobuild_common.sh@210 -- $ cat 00:04:38.831 06:20:52 build_native_dpdk -- common/autobuild_common.sh@215 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:38.831 00:04:38.831 real 2m49.953s 00:04:38.831 user 19m26.578s 00:04:38.831 sys 3m48.987s 00:04:38.831 06:20:52 build_native_dpdk -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:04:38.831 06:20:52 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:04:38.831 ************************************ 00:04:38.831 END TEST build_native_dpdk 00:04:38.831 ************************************ 00:04:38.831 06:20:52 -- spdk/autobuild.sh@31 -- $ case 
"$SPDK_TEST_AUTOBUILD" in 00:04:38.831 06:20:52 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:04:38.831 06:20:52 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:04:38.831 06:20:52 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:04:38.831 06:20:52 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:04:38.831 06:20:52 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:04:38.831 06:20:52 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:04:38.831 06:20:52 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build --with-shared 00:04:39.091 Using /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:04:39.350 DPDK libraries: /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:04:39.350 DPDK includes: //var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:04:39.350 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:04:39.609 Using 'verbs' RDMA provider 00:04:55.876 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:05:10.762 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:05:10.762 Creating mk/config.mk...done. 00:05:10.762 Creating mk/cc.flags.mk...done. 00:05:10.762 Type 'make' to build. 00:05:10.762 06:21:23 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:05:10.762 06:21:23 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:05:10.762 06:21:23 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:05:10.762 06:21:23 -- common/autotest_common.sh@10 -- $ set +x 00:05:10.762 ************************************ 00:05:10.762 START TEST make 00:05:10.762 ************************************ 00:05:10.762 06:21:23 make -- common/autotest_common.sh@1125 -- $ make -j112 00:05:10.762 make[1]: Nothing to be done for 'all'. 
00:05:25.679 CC lib/log/log_flags.o 00:05:25.679 CC lib/log/log.o 00:05:25.679 CC lib/ut/ut.o 00:05:25.679 CC lib/log/log_deprecated.o 00:05:25.679 CC lib/ut_mock/mock.o 00:05:25.679 LIB libspdk_log.a 00:05:25.679 LIB libspdk_ut_mock.a 00:05:25.679 SO libspdk_ut_mock.so.6.0 00:05:25.679 SO libspdk_log.so.7.0 00:05:25.679 SYMLINK libspdk_ut_mock.so 00:05:25.679 SYMLINK libspdk_log.so 00:05:25.679 LIB libspdk_ut.a 00:05:25.679 SO libspdk_ut.so.2.0 00:05:25.679 SYMLINK libspdk_ut.so 00:05:25.679 CC lib/util/base64.o 00:05:25.679 CC lib/util/cpuset.o 00:05:25.679 CC lib/util/bit_array.o 00:05:25.679 CC lib/util/crc16.o 00:05:25.679 CC lib/util/crc32.o 00:05:25.679 CC lib/util/crc32c.o 00:05:25.679 CC lib/util/crc64.o 00:05:25.679 CC lib/util/crc32_ieee.o 00:05:25.679 CC lib/util/dif.o 00:05:25.679 CC lib/util/fd.o 00:05:25.679 CC lib/util/fd_group.o 00:05:25.679 CC lib/util/file.o 00:05:25.679 CC lib/util/hexlify.o 00:05:25.679 CC lib/util/iov.o 00:05:25.679 CC lib/util/math.o 00:05:25.679 CC lib/util/net.o 00:05:25.679 CC lib/util/pipe.o 00:05:25.679 CXX lib/trace_parser/trace.o 00:05:25.679 CC lib/dma/dma.o 00:05:25.679 CC lib/util/strerror_tls.o 00:05:25.679 CC lib/util/string.o 00:05:25.679 CC lib/util/uuid.o 00:05:25.679 CC lib/util/xor.o 00:05:25.679 CC lib/util/zipf.o 00:05:25.679 CC lib/ioat/ioat.o 00:05:25.679 CC lib/vfio_user/host/vfio_user_pci.o 00:05:25.679 CC lib/vfio_user/host/vfio_user.o 00:05:25.679 LIB libspdk_dma.a 00:05:25.679 SO libspdk_dma.so.4.0 00:05:25.679 LIB libspdk_ioat.a 00:05:25.679 SYMLINK libspdk_dma.so 00:05:25.679 SO libspdk_ioat.so.7.0 00:05:25.937 SYMLINK libspdk_ioat.so 00:05:25.937 LIB libspdk_vfio_user.a 00:05:25.937 SO libspdk_vfio_user.so.5.0 00:05:25.937 LIB libspdk_util.a 00:05:25.937 SYMLINK libspdk_vfio_user.so 00:05:25.937 SO libspdk_util.so.10.0 00:05:26.195 SYMLINK libspdk_util.so 00:05:26.195 LIB libspdk_trace_parser.a 00:05:26.453 SO libspdk_trace_parser.so.5.0 00:05:26.453 SYMLINK libspdk_trace_parser.so 00:05:26.453 CC lib/conf/conf.o 00:05:26.453 CC lib/rdma_provider/common.o 00:05:26.453 CC lib/rdma_provider/rdma_provider_verbs.o 00:05:26.453 CC lib/json/json_util.o 00:05:26.453 CC lib/json/json_parse.o 00:05:26.453 CC lib/json/json_write.o 00:05:26.453 CC lib/rdma_utils/rdma_utils.o 00:05:26.453 CC lib/vmd/vmd.o 00:05:26.453 CC lib/vmd/led.o 00:05:26.453 CC lib/env_dpdk/env.o 00:05:26.453 CC lib/env_dpdk/memory.o 00:05:26.453 CC lib/env_dpdk/pci.o 00:05:26.453 CC lib/idxd/idxd.o 00:05:26.453 CC lib/env_dpdk/init.o 00:05:26.453 CC lib/env_dpdk/threads.o 00:05:26.453 CC lib/env_dpdk/pci_ioat.o 00:05:26.711 CC lib/idxd/idxd_user.o 00:05:26.711 CC lib/reduce/reduce.o 00:05:26.711 CC lib/env_dpdk/pci_virtio.o 00:05:26.711 CC lib/idxd/idxd_kernel.o 00:05:26.711 CC lib/env_dpdk/pci_vmd.o 00:05:26.711 CC lib/env_dpdk/pci_idxd.o 00:05:26.711 CC lib/env_dpdk/pci_event.o 00:05:26.711 CC lib/env_dpdk/sigbus_handler.o 00:05:26.711 CC lib/env_dpdk/pci_dpdk.o 00:05:26.711 CC lib/env_dpdk/pci_dpdk_2207.o 00:05:26.711 CC lib/env_dpdk/pci_dpdk_2211.o 00:05:26.711 LIB libspdk_rdma_provider.a 00:05:26.711 LIB libspdk_conf.a 00:05:26.969 SO libspdk_rdma_provider.so.6.0 00:05:26.969 SO libspdk_conf.so.6.0 00:05:26.969 LIB libspdk_rdma_utils.a 00:05:26.969 SO libspdk_rdma_utils.so.1.0 00:05:26.969 SYMLINK libspdk_conf.so 00:05:26.969 SYMLINK libspdk_rdma_provider.so 00:05:26.969 SYMLINK libspdk_rdma_utils.so 00:05:26.969 LIB libspdk_idxd.a 00:05:26.969 SO libspdk_idxd.so.12.0 00:05:27.227 SYMLINK libspdk_idxd.so 00:05:27.227 LIB libspdk_vmd.a 00:05:27.227 LIB 
libspdk_json.a 00:05:27.227 SO libspdk_vmd.so.6.0 00:05:27.227 LIB libspdk_reduce.a 00:05:27.227 SO libspdk_json.so.6.0 00:05:27.227 SO libspdk_reduce.so.6.1 00:05:27.227 SYMLINK libspdk_vmd.so 00:05:27.227 SYMLINK libspdk_json.so 00:05:27.485 SYMLINK libspdk_reduce.so 00:05:27.743 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:05:27.743 CC lib/jsonrpc/jsonrpc_server.o 00:05:27.743 CC lib/jsonrpc/jsonrpc_client.o 00:05:27.743 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:05:28.000 LIB libspdk_env_dpdk.a 00:05:28.000 LIB libspdk_jsonrpc.a 00:05:28.000 SO libspdk_jsonrpc.so.6.0 00:05:28.000 SO libspdk_env_dpdk.so.15.0 00:05:28.000 SYMLINK libspdk_jsonrpc.so 00:05:28.259 SYMLINK libspdk_env_dpdk.so 00:05:28.518 CC lib/rpc/rpc.o 00:05:28.518 LIB libspdk_rpc.a 00:05:28.777 SO libspdk_rpc.so.6.0 00:05:28.777 SYMLINK libspdk_rpc.so 00:05:29.035 CC lib/notify/notify.o 00:05:29.035 CC lib/notify/notify_rpc.o 00:05:29.035 CC lib/trace/trace_flags.o 00:05:29.035 CC lib/trace/trace.o 00:05:29.035 CC lib/trace/trace_rpc.o 00:05:29.035 CC lib/keyring/keyring.o 00:05:29.035 CC lib/keyring/keyring_rpc.o 00:05:29.293 LIB libspdk_notify.a 00:05:29.293 SO libspdk_notify.so.6.0 00:05:29.293 LIB libspdk_keyring.a 00:05:29.293 SO libspdk_keyring.so.1.0 00:05:29.293 LIB libspdk_trace.a 00:05:29.293 SYMLINK libspdk_notify.so 00:05:29.551 SO libspdk_trace.so.10.0 00:05:29.552 SYMLINK libspdk_keyring.so 00:05:29.552 SYMLINK libspdk_trace.so 00:05:29.863 CC lib/sock/sock.o 00:05:29.863 CC lib/sock/sock_rpc.o 00:05:29.863 CC lib/thread/thread.o 00:05:29.863 CC lib/thread/iobuf.o 00:05:30.429 LIB libspdk_sock.a 00:05:30.429 SO libspdk_sock.so.10.0 00:05:30.429 SYMLINK libspdk_sock.so 00:05:30.687 CC lib/nvme/nvme_ctrlr.o 00:05:30.687 CC lib/nvme/nvme_fabric.o 00:05:30.687 CC lib/nvme/nvme_ctrlr_cmd.o 00:05:30.687 CC lib/nvme/nvme_ns_cmd.o 00:05:30.687 CC lib/nvme/nvme_ns.o 00:05:30.687 CC lib/nvme/nvme_pcie_common.o 00:05:30.687 CC lib/nvme/nvme_pcie.o 00:05:30.687 CC lib/nvme/nvme_qpair.o 00:05:30.687 CC lib/nvme/nvme.o 00:05:30.687 CC lib/nvme/nvme_quirks.o 00:05:30.687 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:05:30.687 CC lib/nvme/nvme_transport.o 00:05:30.687 CC lib/nvme/nvme_discovery.o 00:05:30.687 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:05:30.687 CC lib/nvme/nvme_tcp.o 00:05:30.687 CC lib/nvme/nvme_opal.o 00:05:30.687 CC lib/nvme/nvme_io_msg.o 00:05:30.687 CC lib/nvme/nvme_poll_group.o 00:05:30.687 CC lib/nvme/nvme_zns.o 00:05:30.687 CC lib/nvme/nvme_stubs.o 00:05:30.687 CC lib/nvme/nvme_auth.o 00:05:30.687 CC lib/nvme/nvme_cuse.o 00:05:30.687 CC lib/nvme/nvme_rdma.o 00:05:31.254 LIB libspdk_thread.a 00:05:31.254 SO libspdk_thread.so.10.1 00:05:31.512 SYMLINK libspdk_thread.so 00:05:31.770 CC lib/blob/blobstore.o 00:05:31.770 CC lib/blob/request.o 00:05:31.770 CC lib/blob/zeroes.o 00:05:31.770 CC lib/init/json_config.o 00:05:31.770 CC lib/init/subsystem.o 00:05:31.770 CC lib/blob/blob_bs_dev.o 00:05:31.770 CC lib/init/subsystem_rpc.o 00:05:31.770 CC lib/init/rpc.o 00:05:31.770 CC lib/virtio/virtio.o 00:05:31.770 CC lib/virtio/virtio_vhost_user.o 00:05:31.770 CC lib/virtio/virtio_vfio_user.o 00:05:31.770 CC lib/virtio/virtio_pci.o 00:05:31.770 CC lib/accel/accel.o 00:05:31.770 CC lib/accel/accel_rpc.o 00:05:31.770 CC lib/accel/accel_sw.o 00:05:32.028 LIB libspdk_init.a 00:05:32.028 SO libspdk_init.so.5.0 00:05:32.028 LIB libspdk_virtio.a 00:05:32.028 SO libspdk_virtio.so.7.0 00:05:32.287 SYMLINK libspdk_init.so 00:05:32.287 SYMLINK libspdk_virtio.so 00:05:32.545 CC lib/event/app.o 00:05:32.545 CC lib/event/reactor.o 00:05:32.545 CC 
lib/event/log_rpc.o 00:05:32.545 CC lib/event/app_rpc.o 00:05:32.545 CC lib/event/scheduler_static.o 00:05:32.804 LIB libspdk_accel.a 00:05:32.804 SO libspdk_accel.so.16.0 00:05:32.804 LIB libspdk_nvme.a 00:05:32.804 SYMLINK libspdk_accel.so 00:05:33.062 LIB libspdk_event.a 00:05:33.062 SO libspdk_nvme.so.13.1 00:05:33.062 SO libspdk_event.so.14.0 00:05:33.062 SYMLINK libspdk_event.so 00:05:33.320 CC lib/bdev/bdev.o 00:05:33.320 CC lib/bdev/bdev_rpc.o 00:05:33.320 CC lib/bdev/bdev_zone.o 00:05:33.320 CC lib/bdev/part.o 00:05:33.320 CC lib/bdev/scsi_nvme.o 00:05:33.320 SYMLINK libspdk_nvme.so 00:05:34.694 LIB libspdk_blob.a 00:05:34.694 SO libspdk_blob.so.11.0 00:05:34.694 SYMLINK libspdk_blob.so 00:05:34.952 CC lib/blobfs/tree.o 00:05:34.952 CC lib/blobfs/blobfs.o 00:05:35.210 CC lib/lvol/lvol.o 00:05:35.467 LIB libspdk_bdev.a 00:05:35.725 SO libspdk_bdev.so.16.0 00:05:35.725 SYMLINK libspdk_bdev.so 00:05:35.725 LIB libspdk_blobfs.a 00:05:35.983 SO libspdk_blobfs.so.10.0 00:05:35.983 SYMLINK libspdk_blobfs.so 00:05:35.983 LIB libspdk_lvol.a 00:05:35.983 SO libspdk_lvol.so.10.0 00:05:35.983 SYMLINK libspdk_lvol.so 00:05:35.983 CC lib/nbd/nbd.o 00:05:36.242 CC lib/nbd/nbd_rpc.o 00:05:36.242 CC lib/scsi/dev.o 00:05:36.242 CC lib/scsi/lun.o 00:05:36.242 CC lib/scsi/port.o 00:05:36.242 CC lib/scsi/scsi.o 00:05:36.242 CC lib/scsi/scsi_pr.o 00:05:36.242 CC lib/scsi/scsi_bdev.o 00:05:36.242 CC lib/scsi/scsi_rpc.o 00:05:36.242 CC lib/scsi/task.o 00:05:36.242 CC lib/nvmf/ctrlr.o 00:05:36.242 CC lib/nvmf/ctrlr_discovery.o 00:05:36.242 CC lib/nvmf/ctrlr_bdev.o 00:05:36.242 CC lib/nvmf/subsystem.o 00:05:36.242 CC lib/nvmf/nvmf.o 00:05:36.242 CC lib/nvmf/nvmf_rpc.o 00:05:36.242 CC lib/nvmf/transport.o 00:05:36.242 CC lib/nvmf/tcp.o 00:05:36.242 CC lib/nvmf/rdma.o 00:05:36.242 CC lib/nvmf/stubs.o 00:05:36.242 CC lib/nvmf/mdns_server.o 00:05:36.242 CC lib/nvmf/auth.o 00:05:36.242 CC lib/ftl/ftl_core.o 00:05:36.242 CC lib/ftl/ftl_init.o 00:05:36.242 CC lib/ftl/ftl_layout.o 00:05:36.242 CC lib/ublk/ublk.o 00:05:36.242 CC lib/ftl/ftl_debug.o 00:05:36.242 CC lib/ublk/ublk_rpc.o 00:05:36.242 CC lib/ftl/ftl_io.o 00:05:36.242 CC lib/ftl/ftl_sb.o 00:05:36.242 CC lib/ftl/ftl_l2p.o 00:05:36.242 CC lib/ftl/ftl_l2p_flat.o 00:05:36.242 CC lib/ftl/ftl_nv_cache.o 00:05:36.242 CC lib/ftl/ftl_band.o 00:05:36.242 CC lib/ftl/ftl_band_ops.o 00:05:36.242 CC lib/ftl/ftl_writer.o 00:05:36.242 CC lib/ftl/ftl_rq.o 00:05:36.242 CC lib/ftl/ftl_reloc.o 00:05:36.242 CC lib/ftl/ftl_l2p_cache.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt.o 00:05:36.242 CC lib/ftl/ftl_p2l.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_startup.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_md.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_misc.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_band.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:05:36.242 CC lib/ftl/utils/ftl_conf.o 00:05:36.242 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:05:36.242 CC lib/ftl/utils/ftl_md.o 00:05:36.242 CC lib/ftl/utils/ftl_mempool.o 00:05:36.242 CC lib/ftl/utils/ftl_bitmap.o 00:05:36.242 CC lib/ftl/utils/ftl_property.o 00:05:36.242 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:05:36.242 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:05:36.242 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:05:36.242 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:05:36.242 
CC lib/ftl/upgrade/ftl_band_upgrade.o 00:05:36.242 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:05:36.242 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:05:36.242 CC lib/ftl/upgrade/ftl_sb_v3.o 00:05:36.242 CC lib/ftl/upgrade/ftl_sb_v5.o 00:05:36.242 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:05:36.242 CC lib/ftl/nvc/ftl_nvc_dev.o 00:05:36.242 CC lib/ftl/base/ftl_base_dev.o 00:05:36.242 CC lib/ftl/base/ftl_base_bdev.o 00:05:36.242 CC lib/ftl/ftl_trace.o 00:05:36.813 LIB libspdk_scsi.a 00:05:36.813 LIB libspdk_nbd.a 00:05:36.813 SO libspdk_scsi.so.9.0 00:05:36.813 SO libspdk_nbd.so.7.0 00:05:37.106 LIB libspdk_ublk.a 00:05:37.106 SYMLINK libspdk_nbd.so 00:05:37.106 SYMLINK libspdk_scsi.so 00:05:37.106 SO libspdk_ublk.so.3.0 00:05:37.106 SYMLINK libspdk_ublk.so 00:05:37.364 CC lib/vhost/vhost.o 00:05:37.364 CC lib/vhost/vhost_rpc.o 00:05:37.364 CC lib/vhost/vhost_scsi.o 00:05:37.364 CC lib/vhost/vhost_blk.o 00:05:37.364 CC lib/vhost/rte_vhost_user.o 00:05:37.364 CC lib/iscsi/conn.o 00:05:37.364 CC lib/iscsi/init_grp.o 00:05:37.364 CC lib/iscsi/iscsi.o 00:05:37.364 CC lib/iscsi/param.o 00:05:37.364 CC lib/iscsi/md5.o 00:05:37.364 CC lib/iscsi/portal_grp.o 00:05:37.364 CC lib/iscsi/tgt_node.o 00:05:37.364 CC lib/iscsi/iscsi_subsystem.o 00:05:37.364 CC lib/iscsi/iscsi_rpc.o 00:05:37.364 CC lib/iscsi/task.o 00:05:37.364 LIB libspdk_ftl.a 00:05:37.622 SO libspdk_ftl.so.9.0 00:05:38.188 SYMLINK libspdk_ftl.so 00:05:38.188 LIB libspdk_nvmf.a 00:05:38.446 SO libspdk_nvmf.so.19.0 00:05:38.704 SYMLINK libspdk_nvmf.so 00:05:38.704 LIB libspdk_iscsi.a 00:05:38.704 SO libspdk_iscsi.so.8.0 00:05:38.961 LIB libspdk_vhost.a 00:05:38.961 SYMLINK libspdk_iscsi.so 00:05:38.961 SO libspdk_vhost.so.8.0 00:05:38.961 SYMLINK libspdk_vhost.so 00:05:39.526 CC module/env_dpdk/env_dpdk_rpc.o 00:05:39.784 CC module/accel/iaa/accel_iaa_rpc.o 00:05:39.784 CC module/accel/iaa/accel_iaa.o 00:05:39.784 CC module/keyring/file/keyring.o 00:05:39.784 CC module/keyring/file/keyring_rpc.o 00:05:39.784 CC module/accel/error/accel_error_rpc.o 00:05:39.784 CC module/accel/error/accel_error.o 00:05:39.784 CC module/blob/bdev/blob_bdev.o 00:05:39.784 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:05:39.784 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:05:39.784 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:05:39.784 CC module/scheduler/dynamic/scheduler_dynamic.o 00:05:39.784 CC module/accel/dsa/accel_dsa.o 00:05:39.784 CC module/accel/dsa/accel_dsa_rpc.o 00:05:39.784 LIB libspdk_env_dpdk_rpc.a 00:05:39.784 CC module/keyring/linux/keyring.o 00:05:39.784 CC module/keyring/linux/keyring_rpc.o 00:05:39.784 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:05:39.784 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:05:39.784 CC module/accel/ioat/accel_ioat_rpc.o 00:05:39.784 CC module/accel/ioat/accel_ioat.o 00:05:39.784 CC module/scheduler/gscheduler/gscheduler.o 00:05:39.784 CC module/sock/posix/posix.o 00:05:39.784 SO libspdk_env_dpdk_rpc.so.6.0 00:05:39.784 SYMLINK libspdk_env_dpdk_rpc.so 00:05:40.041 LIB libspdk_keyring_file.a 00:05:40.041 LIB libspdk_scheduler_dpdk_governor.a 00:05:40.041 LIB libspdk_accel_error.a 00:05:40.041 LIB libspdk_scheduler_gscheduler.a 00:05:40.041 SO libspdk_scheduler_dpdk_governor.so.4.0 00:05:40.041 LIB libspdk_accel_iaa.a 00:05:40.041 SO libspdk_keyring_file.so.1.0 00:05:40.041 LIB libspdk_scheduler_dynamic.a 00:05:40.041 SO libspdk_accel_error.so.2.0 00:05:40.041 LIB libspdk_blob_bdev.a 00:05:40.041 SO libspdk_scheduler_gscheduler.so.4.0 00:05:40.041 LIB 
libspdk_accel_ioat.a 00:05:40.041 SO libspdk_accel_iaa.so.3.0 00:05:40.041 SO libspdk_scheduler_dynamic.so.4.0 00:05:40.041 SYMLINK libspdk_scheduler_dpdk_governor.so 00:05:40.041 SYMLINK libspdk_keyring_file.so 00:05:40.041 SO libspdk_blob_bdev.so.11.0 00:05:40.041 LIB libspdk_accel_dsa.a 00:05:40.041 SO libspdk_accel_ioat.so.6.0 00:05:40.041 SYMLINK libspdk_accel_error.so 00:05:40.041 LIB libspdk_keyring_linux.a 00:05:40.041 SYMLINK libspdk_scheduler_gscheduler.so 00:05:40.041 SO libspdk_accel_dsa.so.5.0 00:05:40.041 SYMLINK libspdk_accel_iaa.so 00:05:40.041 SYMLINK libspdk_scheduler_dynamic.so 00:05:40.041 SYMLINK libspdk_blob_bdev.so 00:05:40.041 SO libspdk_keyring_linux.so.1.0 00:05:40.298 SYMLINK libspdk_accel_ioat.so 00:05:40.298 SYMLINK libspdk_accel_dsa.so 00:05:40.298 SYMLINK libspdk_keyring_linux.so 00:05:40.555 LIB libspdk_sock_posix.a 00:05:40.555 SO libspdk_sock_posix.so.6.0 00:05:40.555 SYMLINK libspdk_sock_posix.so 00:05:40.813 CC module/blobfs/bdev/blobfs_bdev.o 00:05:40.813 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:05:40.813 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:05:40.813 CC module/bdev/iscsi/bdev_iscsi.o 00:05:40.813 CC module/bdev/malloc/bdev_malloc_rpc.o 00:05:40.813 CC module/bdev/malloc/bdev_malloc.o 00:05:40.813 CC module/bdev/gpt/gpt.o 00:05:40.813 CC module/bdev/gpt/vbdev_gpt.o 00:05:40.813 CC module/bdev/split/vbdev_split_rpc.o 00:05:40.813 CC module/bdev/split/vbdev_split.o 00:05:40.813 CC module/bdev/raid/bdev_raid.o 00:05:40.813 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:05:40.813 CC module/bdev/crypto/vbdev_crypto.o 00:05:40.813 CC module/bdev/ftl/bdev_ftl.o 00:05:40.813 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:05:40.813 CC module/bdev/compress/vbdev_compress.o 00:05:40.813 CC module/bdev/lvol/vbdev_lvol.o 00:05:40.813 CC module/bdev/ftl/bdev_ftl_rpc.o 00:05:40.813 CC module/bdev/zone_block/vbdev_zone_block.o 00:05:40.813 CC module/bdev/raid/bdev_raid_rpc.o 00:05:40.813 CC module/bdev/compress/vbdev_compress_rpc.o 00:05:40.813 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:05:40.813 CC module/bdev/error/vbdev_error.o 00:05:40.813 CC module/bdev/aio/bdev_aio_rpc.o 00:05:40.813 CC module/bdev/aio/bdev_aio.o 00:05:40.813 CC module/bdev/raid/bdev_raid_sb.o 00:05:40.813 CC module/bdev/nvme/bdev_nvme.o 00:05:40.813 CC module/bdev/raid/raid0.o 00:05:40.813 CC module/bdev/error/vbdev_error_rpc.o 00:05:40.813 CC module/bdev/passthru/vbdev_passthru.o 00:05:40.813 CC module/bdev/nvme/nvme_rpc.o 00:05:40.813 CC module/bdev/raid/raid1.o 00:05:40.813 CC module/bdev/nvme/bdev_nvme_rpc.o 00:05:40.813 CC module/bdev/raid/concat.o 00:05:40.813 CC module/bdev/null/bdev_null.o 00:05:40.813 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:05:40.813 CC module/bdev/nvme/vbdev_opal.o 00:05:40.813 CC module/bdev/nvme/bdev_mdns_client.o 00:05:40.813 CC module/bdev/delay/vbdev_delay.o 00:05:40.813 CC module/bdev/null/bdev_null_rpc.o 00:05:40.813 CC module/bdev/delay/vbdev_delay_rpc.o 00:05:40.813 CC module/bdev/nvme/vbdev_opal_rpc.o 00:05:40.813 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:05:40.813 CC module/bdev/virtio/bdev_virtio_scsi.o 00:05:40.813 CC module/bdev/virtio/bdev_virtio_rpc.o 00:05:40.813 CC module/bdev/virtio/bdev_virtio_blk.o 00:05:40.813 LIB libspdk_accel_dpdk_compressdev.a 00:05:40.813 SO libspdk_accel_dpdk_compressdev.so.3.0 00:05:41.070 LIB libspdk_blobfs_bdev.a 00:05:41.070 SYMLINK libspdk_accel_dpdk_compressdev.so 00:05:41.070 SO libspdk_blobfs_bdev.so.6.0 00:05:41.070 LIB libspdk_bdev_null.a 00:05:41.070 LIB libspdk_accel_dpdk_cryptodev.a 00:05:41.070 
LIB libspdk_bdev_split.a 00:05:41.070 LIB libspdk_bdev_gpt.a 00:05:41.070 SYMLINK libspdk_blobfs_bdev.so 00:05:41.070 SO libspdk_bdev_null.so.6.0 00:05:41.070 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:05:41.070 SO libspdk_bdev_gpt.so.6.0 00:05:41.070 SO libspdk_bdev_split.so.6.0 00:05:41.070 LIB libspdk_bdev_error.a 00:05:41.070 LIB libspdk_bdev_ftl.a 00:05:41.070 LIB libspdk_bdev_malloc.a 00:05:41.070 LIB libspdk_bdev_aio.a 00:05:41.070 LIB libspdk_bdev_passthru.a 00:05:41.070 LIB libspdk_bdev_iscsi.a 00:05:41.070 LIB libspdk_bdev_crypto.a 00:05:41.070 SO libspdk_bdev_error.so.6.0 00:05:41.070 SYMLINK libspdk_bdev_null.so 00:05:41.070 SO libspdk_bdev_ftl.so.6.0 00:05:41.328 LIB libspdk_bdev_zone_block.a 00:05:41.328 SYMLINK libspdk_bdev_gpt.so 00:05:41.328 SYMLINK libspdk_bdev_split.so 00:05:41.328 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:05:41.328 SO libspdk_bdev_passthru.so.6.0 00:05:41.328 SO libspdk_bdev_malloc.so.6.0 00:05:41.328 SO libspdk_bdev_aio.so.6.0 00:05:41.328 SO libspdk_bdev_iscsi.so.6.0 00:05:41.328 SO libspdk_bdev_crypto.so.6.0 00:05:41.328 LIB libspdk_bdev_compress.a 00:05:41.328 LIB libspdk_bdev_delay.a 00:05:41.328 SYMLINK libspdk_bdev_error.so 00:05:41.328 SYMLINK libspdk_bdev_ftl.so 00:05:41.328 SO libspdk_bdev_zone_block.so.6.0 00:05:41.328 SO libspdk_bdev_compress.so.6.0 00:05:41.328 SO libspdk_bdev_delay.so.6.0 00:05:41.328 SYMLINK libspdk_bdev_passthru.so 00:05:41.328 SYMLINK libspdk_bdev_malloc.so 00:05:41.328 SYMLINK libspdk_bdev_aio.so 00:05:41.328 SYMLINK libspdk_bdev_iscsi.so 00:05:41.328 SYMLINK libspdk_bdev_crypto.so 00:05:41.328 LIB libspdk_bdev_lvol.a 00:05:41.328 SYMLINK libspdk_bdev_zone_block.so 00:05:41.328 SYMLINK libspdk_bdev_compress.so 00:05:41.328 SYMLINK libspdk_bdev_delay.so 00:05:41.328 LIB libspdk_bdev_virtio.a 00:05:41.328 SO libspdk_bdev_lvol.so.6.0 00:05:41.328 SO libspdk_bdev_virtio.so.6.0 00:05:41.586 SYMLINK libspdk_bdev_lvol.so 00:05:41.586 SYMLINK libspdk_bdev_virtio.so 00:05:41.844 LIB libspdk_bdev_raid.a 00:05:41.844 SO libspdk_bdev_raid.so.6.0 00:05:42.102 SYMLINK libspdk_bdev_raid.so 00:05:43.035 LIB libspdk_bdev_nvme.a 00:05:43.035 SO libspdk_bdev_nvme.so.7.0 00:05:43.035 SYMLINK libspdk_bdev_nvme.so 00:05:43.967 CC module/event/subsystems/vmd/vmd.o 00:05:43.967 CC module/event/subsystems/vmd/vmd_rpc.o 00:05:43.967 CC module/event/subsystems/scheduler/scheduler.o 00:05:43.967 CC module/event/subsystems/iobuf/iobuf.o 00:05:43.967 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:05:43.967 CC module/event/subsystems/sock/sock.o 00:05:43.967 CC module/event/subsystems/keyring/keyring.o 00:05:43.967 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:05:43.967 LIB libspdk_event_vhost_blk.a 00:05:43.967 LIB libspdk_event_vmd.a 00:05:43.967 LIB libspdk_event_scheduler.a 00:05:43.967 LIB libspdk_event_keyring.a 00:05:43.967 LIB libspdk_event_sock.a 00:05:43.967 LIB libspdk_event_iobuf.a 00:05:43.967 SO libspdk_event_vhost_blk.so.3.0 00:05:43.967 SO libspdk_event_scheduler.so.4.0 00:05:43.967 SO libspdk_event_vmd.so.6.0 00:05:43.967 SO libspdk_event_keyring.so.1.0 00:05:43.967 SO libspdk_event_sock.so.5.0 00:05:43.967 SO libspdk_event_iobuf.so.3.0 00:05:44.225 SYMLINK libspdk_event_vhost_blk.so 00:05:44.225 SYMLINK libspdk_event_scheduler.so 00:05:44.225 SYMLINK libspdk_event_keyring.so 00:05:44.225 SYMLINK libspdk_event_vmd.so 00:05:44.225 SYMLINK libspdk_event_sock.so 00:05:44.225 SYMLINK libspdk_event_iobuf.so 00:05:44.482 CC module/event/subsystems/accel/accel.o 00:05:44.740 LIB libspdk_event_accel.a 00:05:44.740 SO 
libspdk_event_accel.so.6.0 00:05:44.740 SYMLINK libspdk_event_accel.so 00:05:45.307 CC module/event/subsystems/bdev/bdev.o 00:05:45.307 LIB libspdk_event_bdev.a 00:05:45.564 SO libspdk_event_bdev.so.6.0 00:05:45.564 SYMLINK libspdk_event_bdev.so 00:05:45.823 CC module/event/subsystems/nbd/nbd.o 00:05:45.823 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:05:45.823 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:05:45.823 CC module/event/subsystems/scsi/scsi.o 00:05:45.823 CC module/event/subsystems/ublk/ublk.o 00:05:46.080 LIB libspdk_event_nbd.a 00:05:46.081 LIB libspdk_event_ublk.a 00:05:46.081 LIB libspdk_event_scsi.a 00:05:46.081 SO libspdk_event_nbd.so.6.0 00:05:46.081 SO libspdk_event_ublk.so.3.0 00:05:46.081 SO libspdk_event_scsi.so.6.0 00:05:46.081 LIB libspdk_event_nvmf.a 00:05:46.081 SYMLINK libspdk_event_nbd.so 00:05:46.081 SO libspdk_event_nvmf.so.6.0 00:05:46.081 SYMLINK libspdk_event_ublk.so 00:05:46.081 SYMLINK libspdk_event_scsi.so 00:05:46.339 SYMLINK libspdk_event_nvmf.so 00:05:46.597 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:05:46.597 CC module/event/subsystems/iscsi/iscsi.o 00:05:46.855 LIB libspdk_event_vhost_scsi.a 00:05:46.855 LIB libspdk_event_iscsi.a 00:05:46.855 SO libspdk_event_vhost_scsi.so.3.0 00:05:46.855 SO libspdk_event_iscsi.so.6.0 00:05:46.855 SYMLINK libspdk_event_vhost_scsi.so 00:05:46.855 SYMLINK libspdk_event_iscsi.so 00:05:47.112 SO libspdk.so.6.0 00:05:47.112 SYMLINK libspdk.so 00:05:47.370 CXX app/trace/trace.o 00:05:47.370 CC app/spdk_nvme_identify/identify.o 00:05:47.370 CC app/trace_record/trace_record.o 00:05:47.370 TEST_HEADER include/spdk/accel.h 00:05:47.370 TEST_HEADER include/spdk/assert.h 00:05:47.370 TEST_HEADER include/spdk/accel_module.h 00:05:47.370 TEST_HEADER include/spdk/barrier.h 00:05:47.370 TEST_HEADER include/spdk/base64.h 00:05:47.370 CC app/spdk_lspci/spdk_lspci.o 00:05:47.370 CC app/spdk_top/spdk_top.o 00:05:47.370 TEST_HEADER include/spdk/bdev.h 00:05:47.370 TEST_HEADER include/spdk/bdev_module.h 00:05:47.370 TEST_HEADER include/spdk/bdev_zone.h 00:05:47.370 CC app/spdk_nvme_perf/perf.o 00:05:47.370 TEST_HEADER include/spdk/bit_array.h 00:05:47.370 TEST_HEADER include/spdk/bit_pool.h 00:05:47.370 TEST_HEADER include/spdk/blob_bdev.h 00:05:47.370 TEST_HEADER include/spdk/blobfs_bdev.h 00:05:47.370 CC app/spdk_nvme_discover/discovery_aer.o 00:05:47.370 TEST_HEADER include/spdk/blobfs.h 00:05:47.370 TEST_HEADER include/spdk/blob.h 00:05:47.370 TEST_HEADER include/spdk/conf.h 00:05:47.370 TEST_HEADER include/spdk/config.h 00:05:47.370 TEST_HEADER include/spdk/crc32.h 00:05:47.370 TEST_HEADER include/spdk/cpuset.h 00:05:47.370 TEST_HEADER include/spdk/crc16.h 00:05:47.370 TEST_HEADER include/spdk/crc64.h 00:05:47.370 CC test/rpc_client/rpc_client_test.o 00:05:47.370 TEST_HEADER include/spdk/dif.h 00:05:47.370 TEST_HEADER include/spdk/endian.h 00:05:47.370 TEST_HEADER include/spdk/dma.h 00:05:47.370 TEST_HEADER include/spdk/env_dpdk.h 00:05:47.370 TEST_HEADER include/spdk/env.h 00:05:47.370 TEST_HEADER include/spdk/event.h 00:05:47.370 TEST_HEADER include/spdk/fd_group.h 00:05:47.370 TEST_HEADER include/spdk/fd.h 00:05:47.370 TEST_HEADER include/spdk/file.h 00:05:47.370 TEST_HEADER include/spdk/ftl.h 00:05:47.370 TEST_HEADER include/spdk/gpt_spec.h 00:05:47.641 TEST_HEADER include/spdk/hexlify.h 00:05:47.641 TEST_HEADER include/spdk/histogram_data.h 00:05:47.641 TEST_HEADER include/spdk/idxd.h 00:05:47.641 TEST_HEADER include/spdk/idxd_spec.h 00:05:47.641 TEST_HEADER include/spdk/ioat.h 00:05:47.641 TEST_HEADER 
include/spdk/init.h 00:05:47.641 CC examples/interrupt_tgt/interrupt_tgt.o 00:05:47.641 CC app/iscsi_tgt/iscsi_tgt.o 00:05:47.641 TEST_HEADER include/spdk/ioat_spec.h 00:05:47.641 TEST_HEADER include/spdk/iscsi_spec.h 00:05:47.641 TEST_HEADER include/spdk/jsonrpc.h 00:05:47.641 TEST_HEADER include/spdk/json.h 00:05:47.641 TEST_HEADER include/spdk/keyring.h 00:05:47.641 TEST_HEADER include/spdk/keyring_module.h 00:05:47.641 TEST_HEADER include/spdk/likely.h 00:05:47.641 TEST_HEADER include/spdk/log.h 00:05:47.641 TEST_HEADER include/spdk/lvol.h 00:05:47.641 TEST_HEADER include/spdk/memory.h 00:05:47.641 TEST_HEADER include/spdk/nbd.h 00:05:47.641 TEST_HEADER include/spdk/mmio.h 00:05:47.641 TEST_HEADER include/spdk/net.h 00:05:47.641 TEST_HEADER include/spdk/notify.h 00:05:47.641 TEST_HEADER include/spdk/nvme_intel.h 00:05:47.641 TEST_HEADER include/spdk/nvme.h 00:05:47.641 CC app/spdk_dd/spdk_dd.o 00:05:47.641 CC app/spdk_tgt/spdk_tgt.o 00:05:47.641 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:05:47.641 TEST_HEADER include/spdk/nvme_ocssd.h 00:05:47.641 TEST_HEADER include/spdk/nvme_zns.h 00:05:47.641 TEST_HEADER include/spdk/nvme_spec.h 00:05:47.641 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:05:47.641 TEST_HEADER include/spdk/nvmf_cmd.h 00:05:47.641 TEST_HEADER include/spdk/nvmf.h 00:05:47.641 TEST_HEADER include/spdk/nvmf_spec.h 00:05:47.641 TEST_HEADER include/spdk/nvmf_transport.h 00:05:47.641 TEST_HEADER include/spdk/opal.h 00:05:47.641 TEST_HEADER include/spdk/opal_spec.h 00:05:47.641 TEST_HEADER include/spdk/pci_ids.h 00:05:47.641 TEST_HEADER include/spdk/pipe.h 00:05:47.641 TEST_HEADER include/spdk/queue.h 00:05:47.641 CC app/nvmf_tgt/nvmf_main.o 00:05:47.641 TEST_HEADER include/spdk/rpc.h 00:05:47.641 TEST_HEADER include/spdk/reduce.h 00:05:47.641 TEST_HEADER include/spdk/scheduler.h 00:05:47.641 TEST_HEADER include/spdk/scsi.h 00:05:47.641 TEST_HEADER include/spdk/sock.h 00:05:47.641 TEST_HEADER include/spdk/scsi_spec.h 00:05:47.641 TEST_HEADER include/spdk/stdinc.h 00:05:47.641 TEST_HEADER include/spdk/string.h 00:05:47.641 TEST_HEADER include/spdk/trace.h 00:05:47.641 TEST_HEADER include/spdk/thread.h 00:05:47.641 TEST_HEADER include/spdk/tree.h 00:05:47.641 TEST_HEADER include/spdk/trace_parser.h 00:05:47.641 TEST_HEADER include/spdk/ublk.h 00:05:47.641 TEST_HEADER include/spdk/util.h 00:05:47.641 TEST_HEADER include/spdk/uuid.h 00:05:47.641 TEST_HEADER include/spdk/vfio_user_pci.h 00:05:47.641 TEST_HEADER include/spdk/version.h 00:05:47.641 TEST_HEADER include/spdk/vhost.h 00:05:47.641 TEST_HEADER include/spdk/vfio_user_spec.h 00:05:47.641 TEST_HEADER include/spdk/xor.h 00:05:47.641 TEST_HEADER include/spdk/vmd.h 00:05:47.641 CXX test/cpp_headers/accel.o 00:05:47.641 TEST_HEADER include/spdk/zipf.h 00:05:47.641 CXX test/cpp_headers/accel_module.o 00:05:47.641 CXX test/cpp_headers/assert.o 00:05:47.641 CXX test/cpp_headers/barrier.o 00:05:47.641 CXX test/cpp_headers/base64.o 00:05:47.641 CXX test/cpp_headers/bdev.o 00:05:47.641 CXX test/cpp_headers/bdev_module.o 00:05:47.641 CXX test/cpp_headers/bdev_zone.o 00:05:47.641 CXX test/cpp_headers/bit_array.o 00:05:47.641 CXX test/cpp_headers/blob_bdev.o 00:05:47.641 CXX test/cpp_headers/bit_pool.o 00:05:47.641 CXX test/cpp_headers/blob.o 00:05:47.641 CXX test/cpp_headers/conf.o 00:05:47.641 CXX test/cpp_headers/blobfs.o 00:05:47.641 CXX test/cpp_headers/blobfs_bdev.o 00:05:47.641 CXX test/cpp_headers/config.o 00:05:47.641 CXX test/cpp_headers/cpuset.o 00:05:47.641 CXX test/cpp_headers/crc16.o 00:05:47.641 CXX 
test/cpp_headers/crc32.o 00:05:47.641 CXX test/cpp_headers/crc64.o 00:05:47.641 CXX test/cpp_headers/dif.o 00:05:47.641 CXX test/cpp_headers/dma.o 00:05:47.641 CXX test/cpp_headers/endian.o 00:05:47.641 CXX test/cpp_headers/env_dpdk.o 00:05:47.641 CXX test/cpp_headers/env.o 00:05:47.641 CXX test/cpp_headers/event.o 00:05:47.641 CXX test/cpp_headers/fd_group.o 00:05:47.641 CXX test/cpp_headers/fd.o 00:05:47.641 CXX test/cpp_headers/file.o 00:05:47.641 CXX test/cpp_headers/ftl.o 00:05:47.641 CXX test/cpp_headers/gpt_spec.o 00:05:47.641 CXX test/cpp_headers/histogram_data.o 00:05:47.641 CXX test/cpp_headers/hexlify.o 00:05:47.641 CXX test/cpp_headers/idxd.o 00:05:47.641 CXX test/cpp_headers/idxd_spec.o 00:05:47.641 CXX test/cpp_headers/init.o 00:05:47.641 CXX test/cpp_headers/ioat.o 00:05:47.641 CXX test/cpp_headers/iscsi_spec.o 00:05:47.641 CXX test/cpp_headers/ioat_spec.o 00:05:47.641 CXX test/cpp_headers/json.o 00:05:47.641 CXX test/cpp_headers/jsonrpc.o 00:05:47.641 CXX test/cpp_headers/keyring.o 00:05:47.641 CXX test/cpp_headers/keyring_module.o 00:05:47.641 CXX test/cpp_headers/log.o 00:05:47.641 CXX test/cpp_headers/likely.o 00:05:47.641 CC examples/util/zipf/zipf.o 00:05:47.641 CXX test/cpp_headers/lvol.o 00:05:47.641 CXX test/cpp_headers/memory.o 00:05:47.641 CXX test/cpp_headers/nbd.o 00:05:47.641 CXX test/cpp_headers/mmio.o 00:05:47.641 CXX test/cpp_headers/notify.o 00:05:47.641 CXX test/cpp_headers/net.o 00:05:47.641 CXX test/cpp_headers/nvme_intel.o 00:05:47.641 CXX test/cpp_headers/nvme_ocssd.o 00:05:47.641 CXX test/cpp_headers/nvme.o 00:05:47.641 CXX test/cpp_headers/nvme_ocssd_spec.o 00:05:47.641 CXX test/cpp_headers/nvme_spec.o 00:05:47.641 CXX test/cpp_headers/nvme_zns.o 00:05:47.641 CXX test/cpp_headers/nvmf.o 00:05:47.641 CXX test/cpp_headers/nvmf_cmd.o 00:05:47.641 CXX test/cpp_headers/nvmf_fc_spec.o 00:05:47.641 CXX test/cpp_headers/nvmf_spec.o 00:05:47.641 CXX test/cpp_headers/nvmf_transport.o 00:05:47.641 CXX test/cpp_headers/opal.o 00:05:47.641 CXX test/cpp_headers/opal_spec.o 00:05:47.641 CXX test/cpp_headers/pci_ids.o 00:05:47.641 CXX test/cpp_headers/pipe.o 00:05:47.641 CXX test/cpp_headers/rpc.o 00:05:47.641 CXX test/cpp_headers/queue.o 00:05:47.641 CXX test/cpp_headers/reduce.o 00:05:47.641 CXX test/cpp_headers/scheduler.o 00:05:47.642 CXX test/cpp_headers/scsi.o 00:05:47.642 CXX test/cpp_headers/scsi_spec.o 00:05:47.642 CC examples/ioat/verify/verify.o 00:05:47.642 CXX test/cpp_headers/sock.o 00:05:47.642 CXX test/cpp_headers/stdinc.o 00:05:47.642 CXX test/cpp_headers/string.o 00:05:47.642 CXX test/cpp_headers/thread.o 00:05:47.642 CXX test/cpp_headers/trace.o 00:05:47.642 CXX test/cpp_headers/trace_parser.o 00:05:47.642 CC examples/ioat/perf/perf.o 00:05:47.642 CXX test/cpp_headers/tree.o 00:05:47.642 CXX test/cpp_headers/ublk.o 00:05:47.642 CXX test/cpp_headers/util.o 00:05:47.642 CXX test/cpp_headers/version.o 00:05:47.642 CXX test/cpp_headers/uuid.o 00:05:47.920 CC test/thread/poller_perf/poller_perf.o 00:05:47.920 CXX test/cpp_headers/vfio_user_pci.o 00:05:47.920 CC test/app/jsoncat/jsoncat.o 00:05:47.920 CC test/app/histogram_perf/histogram_perf.o 00:05:47.920 CC test/env/vtophys/vtophys.o 00:05:47.920 CC test/env/memory/memory_ut.o 00:05:47.920 CC app/fio/nvme/fio_plugin.o 00:05:47.920 CC test/env/pci/pci_ut.o 00:05:47.920 CC test/app/stub/stub.o 00:05:47.920 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:05:47.920 CXX test/cpp_headers/vfio_user_spec.o 00:05:47.920 CC test/app/bdev_svc/bdev_svc.o 00:05:47.920 CC app/fio/bdev/fio_plugin.o 
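The app/fio/nvme and app/fio/bdev objects compiled above are built because configure was run with --with-fio=/usr/src/fio; they produce fio ioengine plugins rather than standalone binaries. As a rough, version-dependent sketch (the plugin path, JSON config file, bdev name, and option spellings below are assumptions and may differ between SPDK releases), the bdev plugin is typically loaded into an external fio binary via LD_PRELOAD:

  # Run fio against an SPDK bdev defined in a JSON config; names/paths are placeholders
  $ LD_PRELOAD=/path/to/spdk/build/fio/spdk_bdev fio --name=probe --ioengine=spdk_bdev \
        --spdk_json_conf=/path/to/bdev.json --thread=1 --filename=Malloc0 \
        --rw=randread --bs=4k --time_based=1 --runtime=10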
00:05:47.920 LINK spdk_lspci 00:05:47.920 CC test/dma/test_dma/test_dma.o 00:05:48.193 LINK rpc_client_test 00:05:48.453 LINK spdk_nvme_discover 00:05:48.453 LINK spdk_tgt 00:05:48.453 LINK nvmf_tgt 00:05:48.453 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:05:48.453 LINK interrupt_tgt 00:05:48.453 CC test/env/mem_callbacks/mem_callbacks.o 00:05:48.453 CXX test/cpp_headers/vhost.o 00:05:48.453 LINK iscsi_tgt 00:05:48.453 LINK spdk_trace_record 00:05:48.453 CXX test/cpp_headers/vmd.o 00:05:48.453 LINK jsoncat 00:05:48.453 CXX test/cpp_headers/xor.o 00:05:48.453 CXX test/cpp_headers/zipf.o 00:05:48.453 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:05:48.453 LINK vtophys 00:05:48.453 LINK zipf 00:05:48.453 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:05:48.453 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:05:48.453 LINK poller_perf 00:05:48.712 LINK stub 00:05:48.712 LINK histogram_perf 00:05:48.712 LINK ioat_perf 00:05:48.712 LINK bdev_svc 00:05:48.712 LINK env_dpdk_post_init 00:05:48.712 LINK verify 00:05:48.712 LINK spdk_dd 00:05:48.712 LINK spdk_trace 00:05:48.712 LINK pci_ut 00:05:48.971 LINK nvme_fuzz 00:05:48.971 LINK test_dma 00:05:48.971 LINK spdk_nvme 00:05:48.971 LINK vhost_fuzz 00:05:48.971 LINK spdk_bdev 00:05:48.971 LINK spdk_nvme_identify 00:05:49.230 CC examples/sock/hello_world/hello_sock.o 00:05:49.230 CC examples/vmd/lsvmd/lsvmd.o 00:05:49.230 CC examples/idxd/perf/perf.o 00:05:49.230 LINK mem_callbacks 00:05:49.230 CC test/event/event_perf/event_perf.o 00:05:49.230 CC examples/vmd/led/led.o 00:05:49.230 CC app/vhost/vhost.o 00:05:49.230 LINK spdk_nvme_perf 00:05:49.230 CC test/event/reactor_perf/reactor_perf.o 00:05:49.230 CC test/event/reactor/reactor.o 00:05:49.230 CC test/event/app_repeat/app_repeat.o 00:05:49.230 CC examples/thread/thread/thread_ex.o 00:05:49.230 CC test/event/scheduler/scheduler.o 00:05:49.230 LINK spdk_top 00:05:49.230 LINK lsvmd 00:05:49.230 LINK event_perf 00:05:49.230 LINK led 00:05:49.230 LINK reactor_perf 00:05:49.230 LINK reactor 00:05:49.488 LINK hello_sock 00:05:49.488 LINK vhost 00:05:49.488 LINK app_repeat 00:05:49.488 LINK scheduler 00:05:49.488 LINK thread 00:05:49.488 LINK idxd_perf 00:05:49.488 LINK memory_ut 00:05:49.488 CC test/nvme/reset/reset.o 00:05:49.488 CC test/nvme/sgl/sgl.o 00:05:49.488 CC test/nvme/e2edp/nvme_dp.o 00:05:49.488 CC test/nvme/aer/aer.o 00:05:49.488 CC test/nvme/fused_ordering/fused_ordering.o 00:05:49.488 CC test/nvme/err_injection/err_injection.o 00:05:49.488 CC test/nvme/reserve/reserve.o 00:05:49.488 CC test/nvme/connect_stress/connect_stress.o 00:05:49.488 CC test/nvme/overhead/overhead.o 00:05:49.488 CC test/nvme/doorbell_aers/doorbell_aers.o 00:05:49.488 CC test/nvme/compliance/nvme_compliance.o 00:05:49.488 CC test/nvme/cuse/cuse.o 00:05:49.488 CC test/nvme/fdp/fdp.o 00:05:49.488 CC test/nvme/boot_partition/boot_partition.o 00:05:49.488 CC test/nvme/startup/startup.o 00:05:49.488 CC test/nvme/simple_copy/simple_copy.o 00:05:49.746 CC test/accel/dif/dif.o 00:05:49.746 CC test/blobfs/mkfs/mkfs.o 00:05:49.746 CC test/lvol/esnap/esnap.o 00:05:49.746 LINK startup 00:05:49.746 LINK reset 00:05:49.746 LINK boot_partition 00:05:49.746 LINK err_injection 00:05:49.746 LINK reserve 00:05:49.746 LINK fused_ordering 00:05:49.746 LINK connect_stress 00:05:49.746 LINK doorbell_aers 00:05:50.003 LINK simple_copy 00:05:50.003 LINK sgl 00:05:50.003 LINK nvme_dp 00:05:50.003 LINK aer 00:05:50.003 LINK overhead 00:05:50.003 CC examples/nvme/cmb_copy/cmb_copy.o 00:05:50.003 CC examples/nvme/reconnect/reconnect.o 00:05:50.003 CC 
examples/nvme/hello_world/hello_world.o 00:05:50.003 CC examples/nvme/nvme_manage/nvme_manage.o 00:05:50.003 LINK nvme_compliance 00:05:50.003 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:05:50.003 CC examples/nvme/abort/abort.o 00:05:50.003 LINK fdp 00:05:50.003 LINK mkfs 00:05:50.003 CC examples/nvme/arbitration/arbitration.o 00:05:50.003 CC examples/nvme/hotplug/hotplug.o 00:05:50.003 CC examples/accel/perf/accel_perf.o 00:05:50.003 CC examples/blob/hello_world/hello_blob.o 00:05:50.003 CC examples/blob/cli/blobcli.o 00:05:50.003 LINK dif 00:05:50.261 LINK cmb_copy 00:05:50.261 LINK pmr_persistence 00:05:50.261 LINK hello_world 00:05:50.261 LINK iscsi_fuzz 00:05:50.261 LINK hotplug 00:05:50.261 LINK reconnect 00:05:50.261 LINK arbitration 00:05:50.261 LINK abort 00:05:50.520 LINK hello_blob 00:05:50.520 LINK nvme_manage 00:05:50.520 LINK blobcli 00:05:50.520 LINK accel_perf 00:05:50.778 CC test/bdev/bdevio/bdevio.o 00:05:50.778 LINK cuse 00:05:51.379 CC examples/bdev/hello_world/hello_bdev.o 00:05:51.379 CC examples/bdev/bdevperf/bdevperf.o 00:05:51.379 LINK bdevio 00:05:51.379 LINK hello_bdev 00:05:51.948 LINK bdevperf 00:05:52.516 CC examples/nvmf/nvmf/nvmf.o 00:05:53.083 LINK nvmf 00:05:53.342 LINK esnap 00:05:53.601 00:05:53.601 real 0m43.687s 00:05:53.601 user 13m0.467s 00:05:53.601 sys 3m40.466s 00:05:53.601 06:22:07 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:05:53.601 06:22:07 make -- common/autotest_common.sh@10 -- $ set +x 00:05:53.601 ************************************ 00:05:53.601 END TEST make 00:05:53.601 ************************************ 00:05:53.861 06:22:07 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:05:53.861 06:22:07 -- pm/common@29 -- $ signal_monitor_resources TERM 00:05:53.861 06:22:07 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:05:53.861 06:22:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:05:53.861 06:22:07 -- pm/common@44 -- $ pid=865182 00:05:53.861 06:22:07 -- pm/common@50 -- $ kill -TERM 865182 00:05:53.861 06:22:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:05:53.861 06:22:07 -- pm/common@44 -- $ pid=865184 00:05:53.861 06:22:07 -- pm/common@50 -- $ kill -TERM 865184 00:05:53.861 06:22:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:05:53.861 06:22:07 -- pm/common@44 -- $ pid=865187 00:05:53.861 06:22:07 -- pm/common@50 -- $ kill -TERM 865187 00:05:53.861 06:22:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:05:53.861 06:22:07 -- pm/common@44 -- $ pid=865209 00:05:53.861 06:22:07 -- pm/common@50 -- $ sudo -E kill -TERM 865209 00:05:53.861 06:22:07 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:53.861 06:22:07 -- nvmf/common.sh@7 -- # uname -s 00:05:53.861 06:22:07 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:53.861 06:22:07 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:53.861 06:22:07 -- nvmf/common.sh@10 -- # 
NVMF_SECOND_PORT=4421 00:05:53.861 06:22:07 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:53.861 06:22:07 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:53.861 06:22:07 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:53.861 06:22:07 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:53.861 06:22:07 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:53.861 06:22:07 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:53.861 06:22:07 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:53.861 06:22:07 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:05:53.861 06:22:07 -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:05:53.861 06:22:07 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:53.861 06:22:07 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:53.861 06:22:07 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:53.861 06:22:07 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:53.861 06:22:07 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:53.861 06:22:07 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:53.861 06:22:07 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:53.861 06:22:07 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:53.861 06:22:07 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:53.861 06:22:07 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:53.861 06:22:07 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:53.861 06:22:07 -- paths/export.sh@5 -- # export PATH 00:05:53.861 06:22:07 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:53.861 06:22:07 -- nvmf/common.sh@47 -- # : 0 00:05:53.861 06:22:07 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:53.861 06:22:07 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:53.861 06:22:07 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:53.861 06:22:07 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:53.861 06:22:07 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:53.861 06:22:07 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:53.861 06:22:07 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:53.861 06:22:07 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:53.861 06:22:07 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:05:53.861 06:22:07 -- spdk/autotest.sh@32 -- # uname -s 
00:05:53.861 06:22:07 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:05:53.861 06:22:07 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:05:53.861 06:22:07 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:05:53.861 06:22:07 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:05:53.861 06:22:07 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:05:53.861 06:22:07 -- spdk/autotest.sh@44 -- # modprobe nbd 00:05:53.861 06:22:07 -- spdk/autotest.sh@46 -- # type -P udevadm 00:05:53.861 06:22:07 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:05:53.861 06:22:07 -- spdk/autotest.sh@48 -- # udevadm_pid=979973 00:05:53.861 06:22:07 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:05:53.861 06:22:07 -- pm/common@17 -- # local monitor 00:05:53.861 06:22:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:05:53.861 06:22:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:53.861 06:22:07 -- pm/common@25 -- # sleep 1 00:05:53.861 06:22:07 -- pm/common@21 -- # date +%s 00:05:53.861 06:22:07 -- pm/common@21 -- # date +%s 00:05:53.861 06:22:07 -- pm/common@21 -- # date +%s 00:05:53.861 06:22:07 -- pm/common@21 -- # date +%s 00:05:53.861 06:22:07 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721881327 00:05:53.861 06:22:07 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721881327 00:05:53.861 06:22:07 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721881327 00:05:53.861 06:22:07 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721881327 00:05:53.861 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721881327_collect-vmstat.pm.log 00:05:53.861 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721881327_collect-cpu-load.pm.log 00:05:53.861 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721881327_collect-cpu-temp.pm.log 00:05:53.861 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721881327_collect-bmc-pm.bmc.pm.log 00:05:54.799 06:22:08 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:05:54.799 06:22:08 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:05:54.799 06:22:08 -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:54.799 06:22:08 -- common/autotest_common.sh@10 -- # set +x 00:05:54.799 06:22:08 -- spdk/autotest.sh@59 -- # 
create_test_list 00:05:54.799 06:22:08 -- common/autotest_common.sh@748 -- # xtrace_disable 00:05:54.799 06:22:08 -- common/autotest_common.sh@10 -- # set +x 00:05:55.058 06:22:08 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:05:55.058 06:22:08 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:55.058 06:22:08 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:55.058 06:22:08 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:05:55.058 06:22:08 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:55.058 06:22:08 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:05:55.058 06:22:08 -- common/autotest_common.sh@1455 -- # uname 00:05:55.058 06:22:08 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:05:55.058 06:22:08 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:05:55.058 06:22:08 -- common/autotest_common.sh@1475 -- # uname 00:05:55.058 06:22:08 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:05:55.058 06:22:08 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:05:55.058 06:22:08 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:05:55.058 06:22:08 -- spdk/autotest.sh@72 -- # hash lcov 00:05:55.058 06:22:08 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:05:55.058 06:22:08 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:05:55.058 --rc lcov_branch_coverage=1 00:05:55.058 --rc lcov_function_coverage=1 00:05:55.058 --rc genhtml_branch_coverage=1 00:05:55.058 --rc genhtml_function_coverage=1 00:05:55.058 --rc genhtml_legend=1 00:05:55.058 --rc geninfo_all_blocks=1 00:05:55.058 ' 00:05:55.058 06:22:08 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:05:55.058 --rc lcov_branch_coverage=1 00:05:55.058 --rc lcov_function_coverage=1 00:05:55.058 --rc genhtml_branch_coverage=1 00:05:55.058 --rc genhtml_function_coverage=1 00:05:55.058 --rc genhtml_legend=1 00:05:55.058 --rc geninfo_all_blocks=1 00:05:55.058 ' 00:05:55.058 06:22:08 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:05:55.058 --rc lcov_branch_coverage=1 00:05:55.058 --rc lcov_function_coverage=1 00:05:55.058 --rc genhtml_branch_coverage=1 00:05:55.058 --rc genhtml_function_coverage=1 00:05:55.058 --rc genhtml_legend=1 00:05:55.058 --rc geninfo_all_blocks=1 00:05:55.058 --no-external' 00:05:55.058 06:22:08 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:05:55.058 --rc lcov_branch_coverage=1 00:05:55.058 --rc lcov_function_coverage=1 00:05:55.058 --rc genhtml_branch_coverage=1 00:05:55.058 --rc genhtml_function_coverage=1 00:05:55.058 --rc genhtml_legend=1 00:05:55.058 --rc geninfo_all_blocks=1 00:05:55.058 --no-external' 00:05:55.058 06:22:08 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:05:55.058 lcov: LCOV version 1.14 00:05:55.058 06:22:08 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:06:13.142 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 
00:06:13.142 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:06:16.431 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:06:16.431 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:06:16.431 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:06:16.431 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:06:16.691 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:06:16.691 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:06:16.691 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:06:16.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:06:16.692 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:06:16.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:06:16.692 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:06:16.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:06:16.692 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:06:16.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:06:16.692 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:06:16.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:06:16.692 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:06:16.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:06:16.692 geninfo: WARNING: GCOV did not produce any data 
for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:06:16.952 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:06:16.952 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:06:17.212 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:06:17.212 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:06:17.212 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:06:17.212 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:06:17.212 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:06:17.212 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:06:17.212 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data 
for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:06:17.213 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:06:17.213 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:06:17.473 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:06:17.473 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:06:20.009 06:22:33 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:06:20.009 06:22:33 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:20.009 06:22:33 -- common/autotest_common.sh@10 -- # set +x 00:06:20.009 06:22:33 -- spdk/autotest.sh@91 -- # rm -f 00:06:20.009 06:22:33 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:24.241 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:06:24.241 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:06:24.241 06:22:37 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:06:24.241 06:22:37 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:24.241 06:22:37 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:24.241 06:22:37 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:24.241 06:22:37 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:24.241 06:22:37 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:24.241 06:22:37 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:24.241 06:22:37 -- 
common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:24.241 06:22:37 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:24.241 06:22:37 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:06:24.242 06:22:37 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:06:24.242 06:22:37 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:06:24.242 06:22:37 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:06:24.242 06:22:37 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:06:24.242 06:22:37 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:06:24.242 No valid GPT data, bailing 00:06:24.242 06:22:37 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:24.242 06:22:37 -- scripts/common.sh@391 -- # pt= 00:06:24.242 06:22:37 -- scripts/common.sh@392 -- # return 1 00:06:24.242 06:22:37 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:06:24.242 1+0 records in 00:06:24.242 1+0 records out 00:06:24.242 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00505692 s, 207 MB/s 00:06:24.242 06:22:37 -- spdk/autotest.sh@118 -- # sync 00:06:24.242 06:22:37 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:06:24.242 06:22:37 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:06:24.242 06:22:37 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:06:32.357 06:22:44 -- spdk/autotest.sh@124 -- # uname -s 00:06:32.357 06:22:44 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:06:32.357 06:22:44 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:06:32.357 06:22:44 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:32.357 06:22:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:32.357 06:22:44 -- common/autotest_common.sh@10 -- # set +x 00:06:32.357 ************************************ 00:06:32.357 START TEST setup.sh 00:06:32.357 ************************************ 00:06:32.357 06:22:44 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:06:32.357 * Looking for test storage... 00:06:32.357 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:32.357 06:22:45 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:06:32.357 06:22:45 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:06:32.358 06:22:45 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:06:32.358 06:22:45 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:32.358 06:22:45 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:32.358 06:22:45 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:32.358 ************************************ 00:06:32.358 START TEST acl 00:06:32.358 ************************************ 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:06:32.358 * Looking for test storage... 
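Note: the pre-cleanup pass traced above walks /dev/nvme*n*, skips zoned or in-use namespaces, checks each remaining device for a partition table (spdk-gpt.py, then blkid -s PTTYPE), and only when nothing is found zeroes the first 1 MiB with dd. A hedged sketch of that guard, using blkid alone in place of the SPDK helpers:

  dev=/dev/nvme0n1                                  # example device from the trace
  if [ -z "$(blkid -s PTTYPE -o value "$dev")" ]; then
      # No partition-table signature detected: wipe the first 1 MiB so the
      # device starts clean for the setup tests that follow.
      dd if=/dev/zero of="$dev" bs=1M count=1
  fi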
00:06:32.358 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:32.358 06:22:45 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:32.358 06:22:45 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:32.358 06:22:45 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:06:32.358 06:22:45 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:06:32.358 06:22:45 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:06:32.358 06:22:45 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:06:32.358 06:22:45 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:06:32.358 06:22:45 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:32.358 06:22:45 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:36.546 06:22:49 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:06:36.546 06:22:49 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:06:36.546 06:22:49 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:36.546 06:22:49 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:06:36.546 06:22:49 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:06:36.546 06:22:49 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:40.734 Hugepages 00:06:40.734 node hugesize free / total 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 00:06:40.734 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
0000:00:04.1 == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.734 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:40.735 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:06:40.994 06:22:54 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:06:40.994 06:22:54 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:40.994 06:22:54 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:40.994 06:22:54 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:06:40.994 ************************************ 00:06:40.994 START TEST denied 00:06:40.994 ************************************ 00:06:40.994 06:22:54 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # denied 00:06:40.994 06:22:54 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:06:40.994 06:22:54 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:06:40.994 06:22:54 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:06:40.994 06:22:54 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:06:40.994 06:22:54 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:45.187 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:06:45.187 06:22:58 setup.sh.acl.denied -- 
setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:45.187 06:22:58 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:50.462 00:06:50.462 real 0m9.341s 00:06:50.462 user 0m2.773s 00:06:50.462 sys 0m5.816s 00:06:50.462 06:23:03 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.462 06:23:03 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:06:50.462 ************************************ 00:06:50.462 END TEST denied 00:06:50.462 ************************************ 00:06:50.462 06:23:03 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:06:50.462 06:23:03 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.462 06:23:03 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.462 06:23:03 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:06:50.462 ************************************ 00:06:50.462 START TEST allowed 00:06:50.462 ************************************ 00:06:50.462 06:23:03 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:06:50.462 06:23:03 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:06:50.462 06:23:03 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:06:50.462 06:23:03 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:06:50.462 06:23:03 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:06:50.462 06:23:03 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:57.080 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:57.080 06:23:10 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:06:57.080 06:23:10 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:06:57.080 06:23:10 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:06:57.080 06:23:10 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:57.080 06:23:10 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:02.352 00:07:02.352 real 0m11.108s 00:07:02.352 user 0m3.107s 00:07:02.352 sys 0m6.243s 00:07:02.352 06:23:14 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:02.352 06:23:14 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:07:02.352 ************************************ 00:07:02.352 END TEST allowed 00:07:02.352 ************************************ 00:07:02.352 00:07:02.352 real 0m29.807s 00:07:02.352 user 0m9.229s 00:07:02.352 sys 0m18.396s 00:07:02.352 06:23:14 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:02.352 06:23:14 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:07:02.352 ************************************ 00:07:02.352 END TEST acl 00:07:02.352 ************************************ 00:07:02.352 06:23:15 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:07:02.352 06:23:15 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:02.352 06:23:15 setup.sh -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.352 06:23:15 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:07:02.352 ************************************ 00:07:02.352 START TEST hugepages 00:07:02.352 ************************************ 00:07:02.352 06:23:15 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:07:02.352 * Looking for test storage... 00:07:02.352 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.352 06:23:15 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 39398584 kB' 'MemAvailable: 43396072 kB' 'Buffers: 6064 kB' 'Cached: 12557280 kB' 'SwapCached: 0 kB' 'Active: 9378992 kB' 'Inactive: 3689560 kB' 'Active(anon): 8980568 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509060 kB' 'Mapped: 183724 kB' 'Shmem: 8475360 kB' 'KReclaimable: 557212 kB' 'Slab: 1213264 kB' 'SReclaimable: 557212 kB' 'SUnreclaim: 656052 kB' 'KernelStack: 22048 kB' 'PageTables: 8756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439060 kB' 'Committed_AS: 10456920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218732 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 
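Note: the field-by-field matching traced around this point is hugepages.sh's get_meminfo: it reads /proc/meminfo into an array and walks it entry by entry until it reaches Hugepagesize, then returns 2048 (kB). A hedged shorthand that extracts the same value, assuming the standard /proc/meminfo layout shown in the dump above:

  # Same result as the get_meminfo loop traced here, in a single awk pass.
  hugepagesize_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)
  echo "default huge page size: ${hugepagesize_kb} kB"   # 2048 kB on this node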
00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 
setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.353 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 
00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 
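The xtrace above is setup/common.sh's get_meminfo helper walking the captured /proc/meminfo one key at a time: each line is split with IFS=': ' into key and value, every key that is not the requested field (here Hugepagesize) falls through to "continue", and the first match echoes its value, as seen just below where 2048 is returned. A minimal sketch of that scan pattern, assuming a simplified stand-alone helper (get_meminfo_sketch is illustrative, not the literal SPDK source, and it reads /proc/meminfo directly rather than the mapfile'd snapshot the real script uses):

get_meminfo_sketch() {
    # Return the value of one /proc/meminfo field, e.g. "Hugepagesize" -> 2048.
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip every non-matching key, as in the trace
        echo "$val"                        # value only; the trailing "kB" lands in $_
        return 0
    done < /proc/meminfo
    return 1
}

default_hugepages=$(get_meminfo_sketch Hugepagesize)   # 2048 on this runner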
00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:02.354 06:23:15 
setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:07:02.354 06:23:15 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:07:02.354 06:23:15 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:02.354 06:23:15 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.354 06:23:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:02.354 ************************************ 00:07:02.354 START TEST default_setup 00:07:02.354 ************************************ 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # default_setup 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:07:02.354 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:07:02.355 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:07:02.355 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:07:02.355 06:23:15 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:07:02.355 06:23:15 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:07:02.355 06:23:15 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:06.542 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 
00:07:06.542 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:07:06.542 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:08.451 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.451 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41547980 kB' 'MemAvailable: 45544884 kB' 'Buffers: 6064 kB' 'Cached: 12557432 kB' 'SwapCached: 0 kB' 'Active: 9398668 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000244 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528596 kB' 'Mapped: 183844 kB' 'Shmem: 8475512 kB' 'KReclaimable: 556628 kB' 'Slab: 1211236 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654608 kB' 'KernelStack: 22208 kB' 'PageTables: 8864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10474260 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 218844 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
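Before the meminfo snapshot above was taken, the default_setup preamble (hugepages.sh@136 through @73 in the trace) cleared any pre-existing hugepages and sized the test allocation: get_nodes found two NUMA nodes, clear_hp wrote 0 into every hugepages-*/nr_hugepages file under both nodes and exported CLEAR_HUGE=yes, and get_test_nr_hugepages 2097152 0 settled on nr_hugepages=1024 for node 0, consistent with the requested size divided by the 2048 kB default page size. A rough sketch of that preamble under the same sysfs layout (the function name and unit interpretation below are assumptions for illustration, not the literal hugepages.sh source):

clear_hp_sketch() {
    # Drop any hugepages left over from a previous run (needs root).
    local node hp
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*/nr_hugepages; do
            echo 0 > "$hp"
        done
    done
    export CLEAR_HUGE=yes
}

size_kb=2097152 default_kb=2048
nr_hugepages=$(( size_kb / default_kb ))   # 1024, matching the trace (interpreting size as kB)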
00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.452 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 
00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41552476 kB' 'MemAvailable: 45549380 kB' 'Buffers: 6064 kB' 'Cached: 12557436 kB' 'SwapCached: 0 kB' 'Active: 9398164 kB' 'Inactive: 3689560 kB' 'Active(anon): 8999740 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527680 kB' 'Mapped: 183832 kB' 'Shmem: 8475516 kB' 'KReclaimable: 556628 kB' 'Slab: 1211204 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654576 kB' 'KernelStack: 22192 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10474276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218876 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.453 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
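The snapshots captured for these get_meminfo calls (the long printf lines above) already contain the figures the verification walks toward: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0 and Hugepagesize: 2048 kB, i.e. 1024 x 2048 kB = 2097152 kB, which matches the Hugetlb line and the 2 GiB sized by default_setup; anon was already resolved to 0 above, and the same key-by-key scan is now repeating for HugePages_Surp and then HugePages_Rsvd. Outside the harness, the same figures can be pulled in one pass, for example (a standalone check, not part of the SPDK scripts):

total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
free=$(awk  '/^HugePages_Free:/  {print $2}' /proc/meminfo)
hpsize=$(awk '/^Hugepagesize:/   {print $2}' /proc/meminfo)   # in kB
echo "hugepages: $free/$total free, $(( total * hpsize )) kB reserved"   # 1024*2048 = 2097152 kB here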
00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.454 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41553480 kB' 'MemAvailable: 45550384 kB' 'Buffers: 6064 kB' 'Cached: 12557452 kB' 'SwapCached: 0 kB' 'Active: 9398416 kB' 'Inactive: 3689560 kB' 'Active(anon): 8999992 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527768 kB' 'Mapped: 183832 kB' 'Shmem: 8475532 kB' 'KReclaimable: 556628 kB' 'Slab: 1210476 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 653848 kB' 'KernelStack: 22368 kB' 'PageTables: 9488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10474300 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218892 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.455 06:23:21 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21
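The scan traced above is setup/common.sh walking the captured /proc/meminfo snapshot one field at a time until it reaches the key it was asked for (HugePages_Rsvd here), skipping every other field with continue. A minimal standalone sketch of the same pattern, reading /proc/meminfo directly; the helper name is illustrative and not part of the SPDK scripts:

get_meminfo_field() {
    # Return the value of one /proc/meminfo field, mirroring the
    # IFS=': ' / read -r var val _ / continue loop seen in the trace.
    local want=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$want" ]] || continue   # not the field we want, keep scanning
        echo "${val:-0}"
        return 0
    done < /proc/meminfo
    echo 0                                  # field not present; default to 0
}

get_meminfo_field HugePages_Rsvd            # prints 0 for the snapshot above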
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:07:08.457 nr_hugepages=1024 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:08.457 resv_hugepages=0 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:08.457 surplus_hugepages=0 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:08.457 anon_hugepages=0 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41553420 kB' 'MemAvailable: 45550324 kB' 'Buffers: 6064 kB' 'Cached: 12557472 kB' 'SwapCached: 0 kB' 'Active: 9398108 kB' 'Inactive: 3689560 kB' 'Active(anon): 
8999684 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527464 kB' 'Mapped: 183832 kB' 'Shmem: 8475552 kB' 'KReclaimable: 556628 kB' 'Slab: 1210252 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 653624 kB' 'KernelStack: 22416 kB' 'PageTables: 9572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10474320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218940 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val 
_ 00:07:08.457 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 
-- # return 0 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24274000 kB' 'MemUsed: 8365140 kB' 'SwapCached: 0 kB' 'Active: 4442632 kB' 'Inactive: 231284 kB' 'Active(anon): 4309584 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4304500 kB' 'Mapped: 88656 kB' 'AnonPages: 372576 kB' 'Shmem: 3940168 kB' 'KernelStack: 12536 kB' 'PageTables: 6152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224332 kB' 'Slab: 529592 kB' 'SReclaimable: 224332 kB' 'SUnreclaim: 305260 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.459 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.460 06:23:21
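This second pass is the same lookup restricted to NUMA node 0: because a node argument was supplied, common.sh swaps mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo and strips the leading "Node 0 " prefix before matching, which is how the per-node HugePages_Surp value of 0 is read out. A rough per-node equivalent under the standard sysfs layout; the helper name is illustrative:

node_meminfo_field() {
    # Read one field from a specific NUMA node's meminfo, mirroring the
    # node=0 branch traced above.
    local node=$1 want=$2 var val _
    local f=/sys/devices/system/node/node${node}/meminfo
    [[ -e $f ]] || f=/proc/meminfo                     # fall back to the global view
    # Per-node lines look like "Node 0 HugePages_Surp: 0"; drop the "Node N " prefix.
    sed 's/^Node [0-9]* //' "$f" | while IFS=': ' read -r var val _; do
        [[ $var == "$want" ]] && { echo "${val:-0}"; break; }
    done
}

node_meminfo_field 0 HugePages_Surp                    # prints 0 in the run above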
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:08.460 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:08.461 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:08.461 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:07:08.461 node0=1024 expecting 1024 00:07:08.461 06:23:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:07:08.461 00:07:08.461 real 0m6.609s 00:07:08.461 user 0m1.740s 00:07:08.461 sys 0m3.032s 00:07:08.461 06:23:21 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:08.461 06:23:21 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:07:08.461 ************************************ 00:07:08.461 END TEST default_setup 00:07:08.461 ************************************ 00:07:08.461 06:23:21 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:07:08.461 06:23:21 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:08.461 06:23:21 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:08.461 06:23:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:08.461 ************************************ 00:07:08.461 START TEST per_node_1G_alloc 00:07:08.461 ************************************ 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # per_node_1G_alloc 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( 
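The "node0=1024 expecting 1024" line above compares the per-node hugepage count the test derived from meminfo against its expectation. The same per-node count can also be read directly from sysfs; the following is a minimal sketch (not taken from setup/hugepages.sh) and assumes the default 2048 kB hugepage size seen in this run:

    #!/usr/bin/env bash
    # Print each NUMA node's current count of default-size (2048 kB) hugepages.
    # Illustrative only; the autotest derives the same number from meminfo instead.
    for node in /sys/devices/system/node/node*; do
        count=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
        echo "${node##*/}=$count"
    done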
00:07:08.461 06:23:21 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:07:08.461 06:23:21 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:07:08.461 06:23:21 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:07:08.461 06:23:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:07:08.461 ************************************
00:07:08.461 START TEST per_node_1G_alloc
00:07:08.461 ************************************
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # per_node_1G_alloc
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:07:08.461 06:23:21 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:07:12.649 [ ... scripts/setup.sh reports 0000:00:04.7 through 0000:00:04.0 and 0000:80:04.7 through 0000:80:04.0 (8086 2021) plus 0000:d8:00.0 (8086 0a54) as "Already using the vfio-pci driver" ... ]
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024
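The parameters above (a 1048576 kB request on nodes 0 and 1, nr_hugepages=512, then NRHUGE=512 HUGENODE=0,1 handed to scripts/setup.sh) amount to requesting the same default-size page count on each listed node. A stand-alone sketch of that arithmetic, with illustrative variable names rather than the actual get_test_nr_hugepages implementation:

    #!/usr/bin/env bash
    # Sketch of the per-node hugepage request seen in the trace: 1 GiB expressed as
    # default-size pages, requested on each of two NUMA nodes (names are illustrative).
    size_kb=1048576        # requested size in kB (1 GiB)
    hugepage_kb=2048       # default hugepage size on this system (Hugepagesize in meminfo)
    nodes=(0 1)

    pages=$(( size_kb / hugepage_kb ))   # 512 pages
    nodes_test=()
    for n in "${nodes[@]}"; do
        nodes_test[n]=$pages             # request 512 pages on every listed node
    done

    echo "NRHUGE=$pages HUGENODE=$(IFS=,; echo "${nodes[*]}")"
    # -> NRHUGE=512 HUGENODE=0,1 ; setup.sh then ends up with 1024 pages total (512 per node)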
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41542756 kB' 'MemAvailable: 45539652 kB' 'Buffers: 6064 kB' 'Cached: 12557612 kB' 'SwapCached: 0 kB' 'Active: 9397312 kB' 'Inactive: 3689560 kB' 'Active(anon): 8998888 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525980 kB' 'Mapped: 182760 kB' 'Shmem: 8475692 kB' 'KReclaimable: 556620 kB' 'Slab: 1209888 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 653268 kB' 'KernelStack: 22096 kB' 'PageTables: 8564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10466912 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
00:07:12.650 06:23:26 setup.sh.hugepages.per_node_1G_alloc [ ... setup/common.sh@31-32: the per-field "IFS=': '; read -r var val _" loop continues past every snapshot field until it reaches AnonHugePages ... ]
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0
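The AnonHugePages lookup just completed is the same get_meminfo walk that repeats below for HugePages_Surp and HugePages_Rsvd: read /proc/meminfo (or a node's meminfo file when a node id is given), strip any "Node N" prefix, and return the value of the requested field. A condensed stand-alone rendering of that lookup, simplified rather than copied verbatim from setup/common.sh:

    #!/usr/bin/env bash
    # Minimal meminfo lookup in the spirit of the trace above (simplified sketch).
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # Per-node queries read the node's own meminfo file when it exists.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip every field that is not the one requested
            echo "$val"
            return 0
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
    }

For example, get_meminfo HugePages_Total prints 1024 on this box, and get_meminfo HugePages_Total 0 would read node 0's own meminfo file.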
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41542836 kB' 'MemAvailable: 45539732 kB' 'Buffers: 6064 kB' 'Cached: 12557616 kB' 'SwapCached: 0 kB' 'Active: 9397584 kB' 'Inactive: 3689560 kB' 'Active(anon): 8999160 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526268 kB' 'Mapped: 182760 kB' 'Shmem: 8475696 kB' 'KReclaimable: 556620 kB' 'Slab: 1209892 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 653272 kB' 'KernelStack: 22048 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10467280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218828 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
00:07:12.916 06:23:26 setup.sh.hugepages.per_node_1G_alloc [ ... setup/common.sh@31-32: the per-field loop continues past every snapshot field (including HugePages_Total, HugePages_Free and HugePages_Rsvd) until it reaches HugePages_Surp ... ]
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0
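With surp now 0 and anon already 0, the trace below performs the third lookup, HugePages_Rsvd (also 0 in the snapshots). Using the get_meminfo sketch shown earlier, the same three queries can be reproduced outside the test; this is illustrative usage only, not the verify_nr_hugepages logic itself:

    anon=$(get_meminfo AnonHugePages)    # kB of anonymous transparent hugepages in use
    surp=$(get_meminfo HugePages_Surp)   # surplus hugetlb pages
    resv=$(get_meminfo HugePages_Rsvd)   # reserved hugetlb pages
    echo "anon=$anon surp=$surp resv=$resv"   # on this run: anon=0 surp=0 resv=0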
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41543680 kB' 'MemAvailable: 45540576 kB' 'Buffers: 6064 kB' 'Cached: 12557632 kB' 'SwapCached: 0 kB' 'Active: 9397060 kB' 'Inactive: 3689560 kB' 'Active(anon): 8998636 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526180 kB' 'Mapped: 182680 kB' 'Shmem: 8475712 kB' 'KReclaimable: 556620 kB' 'Slab: 1209892 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 653272 kB' 'KernelStack: 22048 kB' 'PageTables: 8056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10467060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
00:07:12.918 06:23:26 setup.sh.hugepages.per_node_1G_alloc [ ... setup/common.sh@31-32: the per-field loop continues past MemTotal through SUnreclaim, none of which match HugePages_Rsvd ... ]
00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.919 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:07:12.920 nr_hugepages=1024 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:12.920 resv_hugepages=0 00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:12.920 surplus_hugepages=0 00:07:12.920 06:23:26 
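For readers following the trace, the get_meminfo helper exercised above reduces to the following minimal bash sketch. It is reconstructed from the xtrace output rather than copied from test/setup/common.sh, so exact line numbers and minor details may differ: it picks either the system-wide or the per-node meminfo file, strips the "Node <N> " prefix that per-node files carry, and scans "key: value" pairs until the requested key is found.

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
	local get=$1 node=$2
	local var val
	local mem_f mem
	mem_f=/proc/meminfo
	# When a node index is passed, read that node's view instead.
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem < "$mem_f"
	# Per-node meminfo prefixes every line with "Node <N> "; drop it.
	mem=("${mem[@]#Node +([0-9]) }")
	# Walk "key: value" pairs until the requested key matches.
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] || continue
		echo "$val"
		return 0
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

# Matching the trace above:
#   get_meminfo HugePages_Rsvd      -> 0     (system-wide)
#   get_meminfo HugePages_Total     -> 1024  (system-wide)
#   get_meminfo HugePages_Surp 0    -> 0     (NUMA node 0)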
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:12.920 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41543680 kB' 'MemAvailable: 45540576 kB' 'Buffers: 6064 kB' 'Cached: 12557632 kB' 'SwapCached: 0 kB' 'Active: 9397376 kB' 'Inactive: 3689560 kB' 'Active(anon): 8998952 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526496 kB' 'Mapped: 182680 kB' 'Shmem: 8475712 kB' 'KReclaimable: 556620 kB' 'Slab: 1209892 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 653272 kB' 'KernelStack: 22224 kB' 'PageTables: 8844 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10467084 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218956 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
[... setup/common.sh@31-32: every meminfo key above is read with IFS=': ' and compared against HugePages_Total; each non-matching key is skipped with "continue" until the matching line is reached ...]
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
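The per-node bookkeeping that the trace walks through next (setup/hugepages.sh@115-117) can be summarized with a short hedged sketch: get_nodes records how many pages this run expects on each NUMA node (512 + 512 here), and the verification loop then adds the global reserved count and each node's surplus pages before the per-node totals are compared. The names nodes_sys, nodes_test, resv and no_nodes are taken from the trace; the helper name check_nodes and the assumption that nodes_test was filled elsewhere with the pages actually allocated per node are illustrative, not the literal setup/hugepages.sh source.

shopt -s extglob

get_nodes() {
	local node
	for node in /sys/devices/system/node/node+([0-9]); do
		# This run expects 512 x 2048 kB pages on each of the two nodes.
		nodes_sys[${node##*node}]=512
	done
	no_nodes=${#nodes_sys[@]}
	(( no_nodes > 0 ))   # fail if no NUMA node directories were found
}

check_nodes() {
	local node surp
	for node in "${!nodes_test[@]}"; do
		# Add the global reserved count, then this node's surplus pages,
		# mirroring the @116/@117 steps traced below.
		(( nodes_test[node] += resv ))
		surp=$(get_meminfo HugePages_Surp "$node")
		(( nodes_test[node] += surp ))
	done
}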
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:12.922 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25306272 kB' 'MemUsed: 7332868 kB' 'SwapCached: 0 kB' 'Active: 4444140 kB' 'Inactive: 231284 kB' 'Active(anon): 4311092 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4304580 kB' 'Mapped: 87876 kB' 'AnonPages: 374048 kB' 'Shmem: 3940248 kB' 'KernelStack: 12312 kB' 'PageTables: 5632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224332 kB' 'Slab: 529400 kB' 'SReclaimable: 224332 kB' 'SUnreclaim: 305068 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@31-32: every node0 meminfo key above is read with IFS=': ' and compared against HugePages_Surp; each non-matching key is skipped with "continue" until the matching line is reached ...]
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16239152 kB' 'MemUsed: 11416928 kB' 'SwapCached: 0 kB' 'Active: 4953652 kB' 'Inactive: 3458276 kB' 'Active(anon): 4688276 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8259120 kB' 'Mapped: 94804 kB' 'AnonPages: 152352 kB' 'Shmem: 4535468 kB' 'KernelStack: 9752 kB' 'PageTables: 2908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 332288 kB' 'Slab: 680492 kB' 'SReclaimable: 332288 kB' 'SUnreclaim: 348204 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc
-- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.924 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 
06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
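Editor's note: every xtrace line in this stretch is one pass of the same field-scanning loop. The helper walks the node's meminfo output line by line, issuing "continue" until the requested key (here HugePages_Surp) matches, and the match plus the "echo 0" / "return 0" follow just below. A minimal stand-alone sketch of that kind of lookup, using a hypothetical function name rather than the verbatim setup/common.sh source:

#!/usr/bin/env bash
# Hypothetical re-creation of the lookup traced here (not the verbatim helper):
# read one field from /proc/meminfo, or from a node's meminfo file when a
# NUMA node number is given.
get_meminfo_field() {
    local key=$1 node=${2:-}
    local mem_f=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local line var val _
    while IFS= read -r line; do
        line=${line#"Node $node "}              # per-node files prefix each line with "Node N "
        IFS=': ' read -r var val _ <<<"$line"   # e.g. var=HugePages_Surp  val=0
        if [[ $var == "$key" ]]; then
            echo "$val"
            return 0
        fi
    done <"$mem_f"
    return 1
}

get_meminfo_field HugePages_Surp 1   # would print 0 for the node 1 snapshot printed above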
00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:07:12.925 node0=512 expecting 512 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:07:12.925 node1=512 expecting 512 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:07:12.925 00:07:12.925 real 0m4.412s 00:07:12.925 user 0m1.584s 00:07:12.925 sys 0m2.905s 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:12.925 06:23:26 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:07:12.925 ************************************ 00:07:12.925 END TEST per_node_1G_alloc 00:07:12.925 ************************************ 00:07:12.925 06:23:26 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:07:12.925 06:23:26 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:12.925 06:23:26 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:12.925 06:23:26 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:12.925 ************************************ 00:07:12.925 
START TEST even_2G_alloc 00:07:12.925 ************************************ 00:07:12.925 06:23:26 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:07:12.925 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:12.926 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:07:13.184 06:23:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:17.378 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 
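Editor's note: the get_test_nr_hugepages / get_test_nr_hugepages_per_node steps traced just above (before the setup.sh PCI device lines, which continue below) reduce to a small piece of arithmetic: a 2097152 kB request at the default 2048 kB hugepage size becomes 1024 pages, and HUGE_EVEN_ALLOC=yes spreads them evenly over the two NUMA nodes. A hedged condensation of that math, not the verbatim hugepages.sh code:

# Hypothetical condensation of the allocation math traced above.
size_kb=2097152                              # requested size in kB (2 GiB)
hugepage_kb=2048                             # default Hugepagesize from /proc/meminfo
nr_hugepages=$(( size_kb / hugepage_kb ))    # 1024
no_nodes=2
per_node=$(( nr_hugepages / no_nodes ))      # 512 pages on each node
echo "NRHUGE=$nr_hugepages HUGE_EVEN_ALLOC=yes (node0=$per_node node1=$per_node)"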
00:07:17.378 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:17.378 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41547588 kB' 'MemAvailable: 45544484 kB' 'Buffers: 6064 kB' 'Cached: 12557792 kB' 'SwapCached: 0 kB' 'Active: 9397624 kB' 'Inactive: 3689560 kB' 'Active(anon): 8999200 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526688 kB' 'Mapped: 182708 kB' 'Shmem: 8475872 kB' 'KReclaimable: 556620 kB' 'Slab: 1209536 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 652916 kB' 
'KernelStack: 22080 kB' 'PageTables: 8468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10467288 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218844 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
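Editor's note: the anon check a little above, [[ always [madvise] never != *\[\n\e\v\e\r\]* ]], is inspecting the transparent-hugepage setting: because THP is not pinned to "never", the script snapshots AnonHugePages from /proc/meminfo as its anon baseline before counting reserved pages, and the field-by-field scan of that snapshot continues below. A minimal sketch of the same idea under those assumptions (hypothetical variable names, not the verbatim hugepages.sh logic):

# Hypothetical sketch of the THP baseline check traced above.
thp_setting=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp_setting != *"[never]"* ]]; then
    # THP may be in use, so record current anonymous hugepage usage as a baseline.
    anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
else
    anon_kb=0
fi
echo "anon baseline: ${anon_kb:-0} kB"   # the snapshot above reports AnonHugePages: 0 kB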
00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.378 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@17 -- # local get=HugePages_Surp 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.379 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41544660 kB' 'MemAvailable: 45541556 kB' 'Buffers: 6064 kB' 'Cached: 12557796 kB' 'SwapCached: 0 kB' 'Active: 9399820 kB' 'Inactive: 3689560 kB' 'Active(anon): 9001396 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528932 kB' 'Mapped: 183192 kB' 'Shmem: 8475876 kB' 'KReclaimable: 556620 kB' 'Slab: 1209580 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 652960 kB' 'KernelStack: 22080 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10470216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218828 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
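Editor's note: this second scan repeats the same lookup against the system-wide /proc/meminfo to pick up HugePages_Surp for the even-allocation verify; once it finishes (below), the script folds surplus and reserved pages into each node's count and checks that both nodes report 512 of the 1024 pages. One hedged way to express that per-node expectation directly against sysfs, which is not the verbatim hugepages.sh accounting:

# Hypothetical per-node check: each node should hold half of the 1024 pages.
expected_per_node=512
for node in 0 1; do
    base=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB
    total=$(cat "$base/nr_hugepages")
    surp=$(cat "$base/surplus_hugepages")
    echo "node${node}=$(( total + surp )) expecting ${expected_per_node}"
done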
00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.380 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 
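For readability, the long xtrace above boils down to a small field scan: get_meminfo walks /proc/meminfo (or a per-node meminfo file when a node index is supplied), skips every key that is not the one requested, and echoes the matching value, which is how the run above arrives at surp=0. The following is a minimal sketch of that logic, reconstructed from the trace rather than taken from setup/common.sh, with the "Node N" prefix handling simplified to a sed strip:

# get_meminfo_sketch FIELD [NODE] -- hypothetical helper mirroring the trace
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # prefer the per-node meminfo file when a node index is given (as in the trace)
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
        && mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    # IFS=': ' and "read -r var val _" match the trace; the trailing "kB" token
    # falls into the throwaway third field
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}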
00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.381 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41540880 kB' 'MemAvailable: 45537776 kB' 'Buffers: 6064 kB' 'Cached: 12557796 kB' 'SwapCached: 0 kB' 'Active: 9402780 kB' 'Inactive: 3689560 kB' 'Active(anon): 9004356 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531944 kB' 'Mapped: 183500 kB' 'Shmem: 8475876 kB' 'KReclaimable: 556620 kB' 'Slab: 1209580 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 652960 kB' 'KernelStack: 22064 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10473020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218832 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 
06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.382 
06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.382 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:07:17.383 nr_hugepages=1024 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:17.383 resv_hugepages=0 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:17.383 surplus_hugepages=0 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:17.383 anon_hugepages=0 00:07:17.383 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
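At this point the trace has established surp=0 and resv=0, echoed nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0, and hugepages.sh only proceeds when the kernel-reported total matches the request: 1024 == 1024 + 0 + 0, so the even 2G allocation is accepted. A worked illustration of that consistency check (variable names follow the trace; this is a sketch, not the literal hugepages.sh code, and get_meminfo_sketch is the hypothetical helper from above):

nr_hugepages=1024; surp=0; resv=0
total=$(get_meminfo_sketch HugePages_Total)   # 1024 on this host
# the allocation is considered consistent only when requested + surplus + reserved
# equals what the kernel reports in /proc/meminfo
if (( total == nr_hugepages + surp + resv )); then
    echo "even 2G allocation consistent: $total pages"
fi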
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41540628 kB' 'MemAvailable: 45537524 kB' 'Buffers: 6064 kB' 'Cached: 12557796 kB' 'SwapCached: 0 kB' 'Active: 9397284 kB' 'Inactive: 3689560 kB' 'Active(anon): 8998860 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526380 kB' 'Mapped: 183060 kB' 'Shmem: 8475876 kB' 'KReclaimable: 556620 kB' 'Slab: 1209580 kB' 'SReclaimable: 556620 kB' 'SUnreclaim: 652960 kB' 'KernelStack: 22080 kB' 'PageTables: 8504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10466920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218860 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.384 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.385 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:17.385 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25296528 kB' 'MemUsed: 7342612 kB' 'SwapCached: 0 kB' 'Active: 4443820 kB' 'Inactive: 231284 kB' 'Active(anon): 4310772 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4304680 kB' 'Mapped: 87884 kB' 'AnonPages: 373588 kB' 'Shmem: 3940348 kB' 'KernelStack: 12248 kB' 'PageTables: 5308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224332 kB' 'Slab: 529100 kB' 'SReclaimable: 224332 kB' 'SUnreclaim: 304768 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.386 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16244776 kB' 'MemUsed: 11411304 kB' 'SwapCached: 0 kB' 'Active: 4953444 kB' 'Inactive: 3458276 kB' 'Active(anon): 4688068 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8259260 kB' 'Mapped: 94804 kB' 'AnonPages: 152756 kB' 'Shmem: 4535608 kB' 'KernelStack: 9816 kB' 'PageTables: 3120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 332288 kB' 'Slab: 680480 kB' 'SReclaimable: 332288 kB' 'SUnreclaim: 348192 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.387 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:07:17.388 node0=512 expecting 512 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:07:17.388 node1=512 expecting 512 00:07:17.388 06:23:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:07:17.388 00:07:17.388 real 0m4.443s 00:07:17.388 user 0m1.637s 00:07:17.388 sys 0m2.883s 00:07:17.389 06:23:30 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:17.389 06:23:30 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:07:17.389 ************************************ 00:07:17.389 END TEST even_2G_alloc 00:07:17.389 ************************************ 00:07:17.647 06:23:30 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:07:17.647 06:23:30 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:17.647 06:23:30 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:17.647 06:23:30 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:17.647 ************************************ 00:07:17.647 START TEST odd_alloc 00:07:17.647 ************************************ 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # 
get_test_nr_hugepages 2098176 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:07:17.647 06:23:30 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:21.867 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:21.867 
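The odd_alloc run above requests 1025 hugepages in total (HUGEMEM=2049 with HUGE_EVEN_ALLOC=yes), and the hugepages.sh trace spreads them over the two NUMA nodes as nodes_test[1]=512 and nodes_test[0]=513. The snippet below is a minimal standalone sketch of that as-even-as-possible split, assuming a Linux host that exposes /sys/devices/system/node/node* — it illustrates the behaviour visible in the trace and is not the actual setup/hugepages.sh source.

#!/usr/bin/env bash
# Sketch only: divide nr_hugepages across the online NUMA nodes as evenly as
# possible, with any remainder landing on the lowest-numbered node, which
# reproduces the 513/512 split the odd_alloc trace shows for 1025 pages.
nr_hugepages=${1:-1025}
nodes=(/sys/devices/system/node/node[0-9]*)   # two nodes on this test host
no_nodes=${#nodes[@]}
declare -a nodes_test
remaining=$nr_hugepages
for (( i = no_nodes - 1; i >= 0; i-- )); do
    nodes_test[i]=$(( remaining / (i + 1) ))  # this node's share of what is left
    (( remaining -= nodes_test[i] ))
done
for i in "${!nodes_test[@]}"; do
    echo "node${i}=${nodes_test[i]}"          # e.g. node0=513, node1=512
done

With the default of 1025 pages this prints node0=513 and node1=512; with 1024 it prints 512 for both nodes, matching the node0=512/node1=512 result echoed at the end of the even_2G_alloc test above. The verification pass that follows (get_meminfo) reads those values back per node by pointing mem_f at /sys/devices/system/node/nodeN/meminfo, stripping the leading 'Node N ' prefix, and scanning key/value pairs with IFS=': ' until it reaches the HugePages_* fields, which is what the long read/continue loops in this trace are doing.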
0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:21.867 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41563052 kB' 'MemAvailable: 45559980 kB' 'Buffers: 6064 kB' 'Cached: 12557960 kB' 'SwapCached: 0 kB' 'Active: 9399272 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000848 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528088 kB' 'Mapped: 183144 kB' 'Shmem: 8476040 kB' 'KReclaimable: 556652 kB' 'Slab: 1209976 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653324 kB' 'KernelStack: 22128 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10469152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.867 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.868 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41556560 kB' 
'MemAvailable: 45553488 kB' 'Buffers: 6064 kB' 'Cached: 12557964 kB' 'SwapCached: 0 kB' 'Active: 9403416 kB' 'Inactive: 3689560 kB' 'Active(anon): 9004992 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532824 kB' 'Mapped: 183144 kB' 'Shmem: 8476044 kB' 'KReclaimable: 556652 kB' 'Slab: 1210028 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653376 kB' 'KernelStack: 22096 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10473932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218784 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.869 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.870 
06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:21.870 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41556104 kB' 'MemAvailable: 45553032 kB' 'Buffers: 6064 kB' 'Cached: 12558148 kB' 'SwapCached: 0 kB' 'Active: 9398748 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000324 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527412 kB' 'Mapped: 183012 kB' 'Shmem: 8476228 kB' 'KReclaimable: 556652 kB' 'Slab: 1210028 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653376 kB' 'KernelStack: 22112 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10468000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.871 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 
06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:07:21.872 nr_hugepages=1025 00:07:21.872 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:21.872 resv_hugepages=0 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:21.873 surplus_hugepages=0 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:21.873 anon_hugepages=0 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo 
]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41555852 kB' 'MemAvailable: 45552780 kB' 'Buffers: 6064 kB' 'Cached: 12558168 kB' 'SwapCached: 0 kB' 'Active: 9398548 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000124 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527228 kB' 'Mapped: 182640 kB' 'Shmem: 8476248 kB' 'KReclaimable: 556652 kB' 'Slab: 1210028 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653376 kB' 'KernelStack: 22096 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10468024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.873 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 
06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 
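The long run of "[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... continue" entries above is xtrace output from the test's get_meminfo helper walking a meminfo file one "field: value" pair at a time until it reaches the requested field (here HugePages_Total, which resolves to 1025). A minimal standalone reconstruction of that helper, pieced together from the traced commands, is sketched below; it is not the verbatim setup/common.sh source, and the loop shape and the example call at the end are assumptions.

#!/usr/bin/env bash
# Minimal reconstruction (from the trace above) of the get_meminfo helper.
# Usage: get_meminfo <field> [<numa-node>]
# Prints the value column of the first matching field and returns 0.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}
    local var val _
    local mem_f=/proc/meminfo
    local -a mem

    # Per-node queries read the sysfs meminfo instead, exactly as the
    # [[ -e /sys/devices/system/node/node$node/meminfo ]] check in the trace does.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node <N> "; strip it so the
    # "field: value kB" layout matches /proc/meminfo.
    mem=("${mem[@]#Node +([0-9]) }")

    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # the long run of "continue" entries above
        echo "$val"
        return 0
    done
    return 1
}

# Example (assumed): in the trace above, HugePages_Total on node 0 resolves to 512.
get_meminfo HugePages_Total 0

Reading per-node values through the same code path works because the per-node files differ from /proc/meminfo only by the "Node <N> " prefix, which the parameter expansion strips before parsing.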
00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.874 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25312712 kB' 'MemUsed: 7326428 kB' 'SwapCached: 0 kB' 'Active: 4442884 kB' 'Inactive: 231284 kB' 'Active(anon): 4309836 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4305112 kB' 'Mapped: 87896 kB' 'AnonPages: 372308 kB' 'Shmem: 3940780 kB' 'KernelStack: 12232 kB' 'PageTables: 5308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224364 kB' 'Slab: 529472 kB' 'SReclaimable: 224364 kB' 'SUnreclaim: 305108 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.875 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16242892 kB' 'MemUsed: 11413188 kB' 'SwapCached: 0 kB' 'Active: 4956048 kB' 'Inactive: 3458276 kB' 'Active(anon): 4690672 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8259160 kB' 'Mapped: 94744 kB' 'AnonPages: 155272 kB' 'Shmem: 4535508 kB' 'KernelStack: 9880 kB' 'PageTables: 3324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 332288 kB' 'Slab: 680556 kB' 'SReclaimable: 332288 kB' 'SUnreclaim: 348268 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.876 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:21.877 06:23:35 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:07:21.877 node0=512 expecting 513 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:07:21.877 node1=513 expecting 512 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:07:21.877 00:07:21.877 real 0m4.244s 00:07:21.877 user 0m1.478s 00:07:21.877 sys 0m2.752s 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:21.877 06:23:35 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:07:21.877 ************************************ 00:07:21.877 END TEST odd_alloc 00:07:21.877 ************************************ 00:07:21.877 06:23:35 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:07:21.877 06:23:35 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:21.877 06:23:35 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:21.877 06:23:35 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:21.877 ************************************ 00:07:21.877 START TEST custom_alloc 00:07:21.877 ************************************ 00:07:21.877 06:23:35 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:07:21.877 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:07:21.877 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:07:21.877 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:07:21.877 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 
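The "node0=512 expecting 513" and "node1=513 expecting 512" lines above are the point of TEST odd_alloc: an odd total (1025 pages) can only be split unevenly across the two NUMA nodes, and the kernel may place the extra page on either node, so hugepages.sh compares the sorted multiset of actual per-node counts against the sorted multiset of expected counts rather than matching node-by-node (the sorted_t/sorted_s arrays and the final "[[ 512 513 == \5\1\2\ \5\1\3 ]]" check). The following is a simplified sketch of that comparison, with the per-node counts hard-coded where the real test reads them via get_meminfo.

#!/usr/bin/env bash
# Order-insensitive comparison of expected vs. actual per-node huge page counts,
# mirroring the sorted_t/sorted_s trick traced above (values hard-coded here).
nodes_test=(513 512)   # expected per node (plus surplus/reserved, both 0 in this run)
nodes_sys=(512 513)    # HugePages_Total that get_meminfo reported per node

# Using the counts as *indices* of plain indexed arrays makes bash return the
# keys in ascending numeric order, which is what the final string match relies on.
sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1
    sorted_s[nodes_sys[node]]=1
    echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
done

# Both expansions collapse to "512 513", so the uneven split passes.
[[ ${!sorted_s[*]} == "${!sorted_t[*]}" ]] && echo "odd_alloc: per-node split OK"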
00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:07:21.878 06:23:35 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:07:21.878 06:23:35 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:26.071 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:26.071 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:07:26.071 
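TEST custom_alloc, which begins above, drives the same setup.sh but with an explicit per-node request: roughly 1 GiB of 2 MiB pages for node 0 and 2 GiB for node 1 (the sizes 1048576 and 2097152 passed to get_test_nr_hugepages appear to be kilobytes), giving nodes_hp[0]=512 and nodes_hp[1]=1024, 1536 pages in total, joined with a comma into the HUGENODE string seen in the trace. The sketch below reconstructs that assembly; size_to_pages is a simplified stand-in for get_test_nr_hugepages, not the real helper.

#!/usr/bin/env bash
# Sketch of how TEST custom_alloc assembles its HUGENODE request (see trace above).
hugepagesize_kb=2048   # 2 MiB pages, matching the Hugepagesize reported in the trace

# Simplified stand-in (assumption) for get_test_nr_hugepages: size in kB -> page count.
size_to_pages() { echo $(( $1 / hugepagesize_kb )); }

nodes_hp[0]=$(size_to_pages 1048576)   # 1 GiB worth -> 512 pages for node 0
nodes_hp[1]=$(size_to_pages 2097152)   # 2 GiB worth -> 1024 pages for node 1

HUGENODE=()
_nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    (( _nr_hugepages += nodes_hp[node] ))
done

# The test exports the comma-joined form before invoking scripts/setup.sh.
# Prints: HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024 (1536 pages total)
(IFS=,; echo "HUGENODE=${HUGENODE[*]} (${_nr_hugepages} pages total)")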
06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40486020 kB' 'MemAvailable: 44482948 kB' 'Buffers: 6064 kB' 'Cached: 12558272 kB' 'SwapCached: 0 kB' 'Active: 9399648 kB' 'Inactive: 3689560 kB' 'Active(anon): 9001224 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527964 kB' 'Mapped: 182712 kB' 'Shmem: 8476352 kB' 'KReclaimable: 556652 kB' 'Slab: 1210492 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653840 kB' 'KernelStack: 22256 kB' 'PageTables: 9068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10471648 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 219020 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.071 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.072 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
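With AnonHugePages resolved to 0, the trace moves on to HugePages_Surp using the same lookup: the helper snapshots /proc/meminfo, then scans it with IFS=': ' until the requested field name matches and echoes the value. A minimal stand-alone sketch that mirrors that behaviour (the function name below is hypothetical):

    # Field lookup mirroring the "IFS=': '; read -r var val _" scan in the
    # trace; meminfo_get is a hypothetical name used only for this sketch.
    meminfo_get() {
        local get=$1 var val _
        local -a mem
        local line
        mapfile -t mem < /proc/meminfo            # snapshot first, scan second
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue      # skip every non-matching field
            echo "${val:-0}"
            return 0
        done
        echo 0                                    # field absent: report 0
    }

    meminfo_get HugePages_Total    # prints 1536 on this runner
    meminfo_get AnonHugePages      # prints 0 here (value is in kB)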
00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40486844 kB' 'MemAvailable: 44483772 kB' 'Buffers: 6064 kB' 'Cached: 12558272 kB' 'SwapCached: 0 kB' 'Active: 9399256 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000832 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527596 kB' 'Mapped: 182704 kB' 'Shmem: 8476352 kB' 'KReclaimable: 556652 kB' 'Slab: 1210464 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653812 kB' 'KernelStack: 22144 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10471664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218988 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.073 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.074 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.074 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40486060 kB' 'MemAvailable: 44482988 kB' 'Buffers: 6064 kB' 'Cached: 12558272 kB' 'SwapCached: 0 kB' 'Active: 9399096 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000672 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527416 kB' 'Mapped: 182704 kB' 'Shmem: 8476352 kB' 'KReclaimable: 556652 kB' 'Slab: 1210464 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 653812 kB' 'KernelStack: 22000 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10468816 kB' 
'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218876 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.075 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 
06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.076 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:26.077 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:07:26.077 nr_hugepages=1536 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:26.077 resv_hugepages=0 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:26.077 surplus_hugepages=0 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:26.077 anon_hugepages=0 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40485580 kB' 'MemAvailable: 44482508 kB' 'Buffers: 6064 kB' 'Cached: 12558332 kB' 'SwapCached: 0 kB' 'Active: 9398888 kB' 'Inactive: 3689560 kB' 'Active(anon): 9000464 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527220 kB' 'Mapped: 182712 kB' 'Shmem: 8476412 kB' 'KReclaimable: 556652 kB' 'Slab: 1210720 kB' 'SReclaimable: 556652 kB' 'SUnreclaim: 654068 kB' 'KernelStack: 22048 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10468836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218860 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.077 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.078 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.078 06:23:39
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.078 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.078 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # 
get_meminfo HugePages_Surp 0 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 25289892 kB' 'MemUsed: 7349248 kB' 'SwapCached: 0 kB' 'Active: 4443332 kB' 'Inactive: 231284 kB' 'Active(anon): 4310284 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4305112 kB' 'Mapped: 87908 kB' 'AnonPages: 372632 kB' 'Shmem: 3940780 kB' 'KernelStack: 12248 kB' 'PageTables: 5316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224364 kB' 'Slab: 529744 kB' 'SReclaimable: 224364 kB' 'SUnreclaim: 305380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.079 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.079 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:26.080 06:23:39 
setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:07:26.080 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 15195688 kB' 'MemUsed: 12460392 kB' 'SwapCached: 0 kB' 'Active: 4955932 kB' 'Inactive: 3458276 kB' 'Active(anon): 4690556 kB' 'Inactive(anon): 0 kB' 'Active(file): 265376 kB' 'Inactive(file): 3458276 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8259308 kB' 'Mapped: 94804 kB' 'AnonPages: 154964 kB' 'Shmem: 4535656 kB' 'KernelStack: 9800 kB' 'PageTables: 3068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 332288 kB' 'Slab: 680976 kB' 'SReclaimable: 332288 kB' 'SUnreclaim: 348688 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:26.081 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:07:26.082 node0=512 expecting 512 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:07:26.082 node1=1024 expecting 1024 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:07:26.082 00:07:26.082 real 0m4.004s 00:07:26.082 user 0m1.327s 00:07:26.082 sys 0m2.652s 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:26.082 06:23:39 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:07:26.082 ************************************ 00:07:26.082 END TEST custom_alloc 00:07:26.082 ************************************ 00:07:26.082 06:23:39 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:07:26.082 06:23:39 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:26.082 06:23:39 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:26.082 06:23:39 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:26.082 ************************************ 00:07:26.082 START TEST no_shrink_alloc 00:07:26.082 ************************************ 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 
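The no_shrink_alloc case that starts above asks get_test_nr_hugepages for 2097152 kB spread over node 0 only, which with the 2048 kB default hugepage size works out to the 1024 pages recorded in nodes_test[0]. A minimal bash sketch of that arithmetic (an illustration assuming a 2 MiB default hugepage size, not the actual setup/hugepages.sh code):

    # Convert a requested size in kB into a per-node hugepage count.
    size_kb=2097152                                                         # requested total (2 GiB)
    default_hugepages_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo) # typically 2048
    nr_hugepages=$(( size_kb / default_hugepages_kb ))                      # 2097152 / 2048 = 1024
    nodes_test=()
    nodes_test[0]=$nr_hugepages                                             # only node 0 was requested
    echo "node0=${nodes_test[0]}"

With those numbers the echo prints node0=1024, matching the nr_hugepages=1024 value the test records before it moves on to the setup output step.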
00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:07:26.082 06:23:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:30.276 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:30.276 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:30.276 06:23:43 
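The verify_nr_hugepages step above first checks the "always [madvise] never" policy string (presumably read from /sys/kernel/mm/transparent_hugepage/enabled) to confirm that "[never]" is not the selected option, and then calls get_meminfo, which scans /proc/meminfo key by key; the long run of "continue" lines that follows is that scan skipping every field until it reaches the requested one. A minimal sketch of the same pattern (an illustration under the assumption that the system-wide /proc/meminfo is used rather than a per-node file, not the verbatim setup/common.sh implementation):

    # Scan /proc/meminfo for one key, skipping all other keys with `continue`.
    get=AnonHugePages
    mapfile -t mem < /proc/meminfo
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # every non-matching key is skipped
        echo "$val"                        # value in kB, e.g. 0
        break
    done

The same scan is repeated later in this log for HugePages_Surp and HugePages_Rsvd, which is why the field-by-field trace appears three times with only the compared key changing.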
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41517380 kB' 'MemAvailable: 45514284 kB' 'Buffers: 6064 kB' 'Cached: 12558436 kB' 'SwapCached: 0 kB' 'Active: 9401292 kB' 'Inactive: 3689560 kB' 'Active(anon): 9002868 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529600 kB' 'Mapped: 182740 kB' 'Shmem: 8476516 kB' 'KReclaimable: 556628 kB' 'Slab: 1210960 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654332 kB' 'KernelStack: 22272 kB' 'PageTables: 9068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10472952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218956 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.276 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.277 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41519960 kB' 'MemAvailable: 45516864 kB' 'Buffers: 6064 kB' 'Cached: 12558440 kB' 'SwapCached: 0 kB' 'Active: 9401940 kB' 'Inactive: 3689560 kB' 'Active(anon): 9003516 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530288 kB' 'Mapped: 182728 kB' 'Shmem: 8476520 kB' 'KReclaimable: 556628 kB' 'Slab: 1211024 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654396 kB' 'KernelStack: 22448 kB' 'PageTables: 9356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10472968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 219036 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 
'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 
06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 
06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.278 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.279 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41523480 kB' 'MemAvailable: 45520384 kB' 'Buffers: 6064 kB' 'Cached: 12558460 kB' 'SwapCached: 0 kB' 'Active: 9401420 kB' 'Inactive: 3689560 kB' 'Active(anon): 9002996 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529728 kB' 'Mapped: 182652 kB' 'Shmem: 8476540 kB' 'KReclaimable: 556628 kB' 'Slab: 1211116 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654488 kB' 'KernelStack: 22320 kB' 'PageTables: 9648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10472992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218876 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.280 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.280 06:23:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
(xtrace loop repeats here for each /proc/meminfo field in turn -- MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped -- each failing the [[ $var == HugePages_Rsvd ]] test and hitting continue)
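The loop being traced is only doing a field lookup: split each /proc/meminfo line on ': ', keep the first two fields, and stop when the requested key matches. A minimal standalone sketch of that lookup, with a hypothetical helper name (get_meminfo_value is illustrative, not the actual setup/common.sh function):

    #!/usr/bin/env bash
    # Look up a single key in /proc/meminfo, mirroring the traced
    # IFS=': ' / read -r var val _ / [[ $var == key ]] cycle.
    get_meminfo_value() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$key" ]]; then
                echo "$val"    # count for HugePages_*, size in kB for most other fields
                return 0
            fi
        done < /proc/meminfo
        return 1               # key not present
    }

    resv=$(get_meminfo_value HugePages_Rsvd)     # 0 in the run below
    total=$(get_meminfo_value HugePages_Total)   # 1024 in the run below

The per-node variant in the real script reads /sys/devices/system/node/node<N>/meminfo instead and strips the leading "Node <N>" prefix before splitting, which is why the trace shows the mem array being rewritten with "${mem[@]#Node +([0-9]) }".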
00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.281 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:07:30.282 nr_hugepages=1024 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:30.282 resv_hugepages=0 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:30.282 surplus_hugepages=0 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:30.282 anon_hugepages=0 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41523504 kB' 'MemAvailable: 45520408 kB' 'Buffers: 6064 kB' 'Cached: 12558480 kB' 'SwapCached: 0 kB' 'Active: 9402188 kB' 'Inactive: 3689560 kB' 'Active(anon): 9003764 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530488 kB' 'Mapped: 182720 kB' 'Shmem: 8476560 kB' 'KReclaimable: 556628 kB' 'Slab: 1210956 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654328 kB' 'KernelStack: 22496 kB' 'PageTables: 10080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10473012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 219004 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:30.282 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.282 06:23:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
(xtrace loop repeats here for the same /proc/meminfo fields as before, MemFree through ShmemPmdMapped, each failing the [[ $var == HugePages_Total ]] test and hitting continue)
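Once HugePages_Total is found (next entries), hugepages.sh compares it against nr_hugepages + surplus + reserved and then walks the NUMA nodes to attribute the pages per node. A rough sketch of that check, assuming the default 2048 kB hugepage size seen in this run; the variable names and the sysfs nr_hugepages path are illustrative rather than the exact setup/hugepages.sh code:

    #!/usr/bin/env bash
    # Verify the global hugepage count, then report the per-node split.
    expected=1024

    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    if (( total != expected )); then
        echo "expected $expected hugepages, kernel reports $total" >&2
        exit 1
    fi

    for node_dir in /sys/devices/system/node/node[0-9]*; do
        node=${node_dir##*node}
        # per-node count for 2 MiB pages
        pages=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
        echo "node${node}=${pages}"
    done

With two nodes and all 1024 pages on node 0, this prints node0=1024 and node1=0, which is what the trace confirms before echoing 'node0=1024 expecting 1024'.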
00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.283 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # 
nodes_sys[${node##*node}]=0 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24241400 kB' 'MemUsed: 8397740 kB' 'SwapCached: 0 kB' 'Active: 4444960 kB' 'Inactive: 231284 kB' 'Active(anon): 4311912 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4305124 kB' 'Mapped: 87916 kB' 'AnonPages: 374304 kB' 'Shmem: 3940792 kB' 'KernelStack: 12680 kB' 'PageTables: 6264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224364 kB' 'Slab: 529860 kB' 'SReclaimable: 224364 kB' 'SUnreclaim: 305496 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:30.284 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.284 06:23:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
(xtrace loop repeats here over the node0 meminfo fields -- MemUsed, SwapCached, the Active/Inactive breakdowns, Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free -- each failing the [[ $var == HugePages_Surp ]] test and hitting continue)
00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- #
IFS=': ' 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:07:30.285 node0=1024 expecting 1024 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:07:30.285 06:23:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:34.482 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:34.482 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:07:34.482 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local 
surp 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41537312 kB' 'MemAvailable: 45534216 kB' 'Buffers: 6064 kB' 'Cached: 12558596 kB' 'SwapCached: 0 kB' 'Active: 9401388 kB' 'Inactive: 3689560 kB' 'Active(anon): 9002964 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529520 kB' 'Mapped: 182692 kB' 'Shmem: 8476676 kB' 'KReclaimable: 556628 kB' 'Slab: 1211004 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654376 kB' 'KernelStack: 22016 kB' 'PageTables: 8272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10473756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:34.482 06:23:47 
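The INFO line above is scripts/setup.sh concluding that the 1024 pages already present cover the NRHUGE=512 request, so the existing allocation is left in place before verify_nr_hugepages runs below. For reference, the same state can be checked by hand; a minimal sketch assuming 2048 kB pages and NUMA node 0 (illustrative commands, not part of the test scripts):

  cat /proc/sys/vm/nr_hugepages                                               # system-wide huge page count (1024 in this run)
  cat /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages  # per-node view of the same allocation
  grep -E 'HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo                    # the HugePages_* fields the trace below reads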
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:34.482 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41537312 kB' 'MemAvailable: 45534216 kB' 'Buffers: 6064 kB' 'Cached: 12558596 kB' 'SwapCached: 0 kB' 'Active: 9401388 kB' 'Inactive: 3689560 kB' 'Active(anon): 9002964 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529520 kB' 'Mapped: 182692 kB' 'Shmem: 8476676 kB' 'KReclaimable: 556628 kB' 'Slab: 1211004 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654376 kB' 'KernelStack: 22016 kB' 'PageTables: 8272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10473756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218812 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
[... per-field get_meminfo trace omitted: MemTotal through HardwareCorrupted each fail the AnonHugePages comparison and hit 'continue' ...]
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
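get_meminfo, traced above and below, scans /proc/meminfo pair by pair and echoes the value of the single field it was asked for; here it returned 0 for AnonHugePages, and HugePages_Surp and HugePages_Rsvd are read next. A stand-alone equivalent for reference only (not the setup/common.sh implementation):

  awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo    # 0 in the snapshot above
  awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo   # 0 as well, matching surp=0 below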
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:34.484 06:23:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41540692 kB' 'MemAvailable: 45537596 kB' 'Buffers: 6064 kB' 'Cached: 12558596 kB' 'SwapCached: 0 kB' 'Active: 9401544 kB' 'Inactive: 3689560 kB' 'Active(anon): 9003120 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529652 kB' 'Mapped: 182728 kB' 'Shmem: 8476676 kB' 'KReclaimable: 556628 kB' 'Slab: 1211060 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654432 kB' 'KernelStack: 22048 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10473776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218780 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
[... per-field get_meminfo trace omitted: MemTotal through HugePages_Rsvd each fail the HugePages_Surp comparison and hit 'continue' ...]
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41539972 kB' 'MemAvailable: 45536876 kB' 'Buffers: 6064 kB' 'Cached: 
12558600 kB' 'SwapCached: 0 kB' 'Active: 9401964 kB' 'Inactive: 3689560 kB' 'Active(anon): 9003540 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530572 kB' 'Mapped: 183232 kB' 'Shmem: 8476680 kB' 'KReclaimable: 556628 kB' 'Slab: 1211060 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654432 kB' 'KernelStack: 22176 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10474888 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218908 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.486 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:07:34.487 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:34.487 ... (the same @31 read / @32 compare-and-continue pair repeats for every remaining /proc/meminfo key until HugePages_Rsvd is reached) ...
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:07:34.749 nr_hugepages=1024
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:07:34.749 resv_hugepages=0
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:07:34.749 surplus_hugepages=0
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:07:34.749 anon_hugepages=0
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:34.749 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41533196 kB' 'MemAvailable: 45530100 kB' 'Buffers: 6064 kB' 'Cached: 12558640 kB' 'SwapCached: 0 kB' 'Active: 9406292 kB' 'Inactive: 3689560 kB' 'Active(anon): 9007868 kB' 'Inactive(anon): 0 kB' 'Active(file): 398424 kB' 'Inactive(file): 3689560 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534344 kB' 'Mapped: 183232 kB' 'Shmem: 8476720 kB' 'KReclaimable: 556628 kB' 'Slab: 1211060 kB' 'SReclaimable: 556628 kB' 'SUnreclaim: 654432 kB' 'KernelStack: 22096 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10478320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218876 kB' 'VmallocChunk: 0 kB' 'Percpu: 112000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3444084 kB' 'DirectMap2M: 19310592 kB' 'DirectMap1G: 46137344 kB'
00:07:34.750 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:07:34.750 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:34.750 ... (the same @31 read / @32 compare-and-continue pair repeats for every /proc/meminfo key until HugePages_Total is reached) ...
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
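The get_meminfo calls traced above all follow the same pattern: pick /proc/meminfo (or the per-node copy under /sys/devices/system/node) as the source, split each line on ': ', and walk the keys until the requested one is found. A minimal bash sketch of that lookup follows; it is reconstructed from the trace rather than copied from the SPDK source, and the name get_meminfo_sketch is made up here.

    shopt -s extglob   # needed for the +([0-9]) pattern used below
    # Minimal reconstruction of the meminfo lookup seen in the trace above.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        local -a mem
        local line var val _
        # A per-node query (e.g. node 0) reads the sysfs copy instead of /proc/meminfo.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node meminfo lines are prefixed with "Node <n> "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "${val:-0}"
                return 0
            fi
        done
        echo 0
    }
    # On this runner: get_meminfo_sketch HugePages_Total -> 1024,
    #                 get_meminfo_sketch HugePages_Rsvd  -> 0

The value returned for HugePages_Total (1024) matches nr_hugepages, which is exactly the check hugepages.sh@110 performs next in the trace.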
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:34.751 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24246696 kB' 'MemUsed: 8392444 kB' 'SwapCached: 0 kB' 'Active: 4446428 kB' 'Inactive: 231284 kB' 'Active(anon): 4313380 kB' 'Inactive(anon): 0 kB' 'Active(file): 133048 kB' 'Inactive(file): 231284 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4305180 kB' 'Mapped: 87924 kB' 'AnonPages: 375688 kB' 'Shmem: 3940848 kB' 'KernelStack: 12552 kB' 'PageTables: 5816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224364 kB' 'Slab: 529908 kB' 'SReclaimable: 224364 kB' 'SUnreclaim: 305544 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:07:34.752 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:34.752 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:34.752 ... (the same @31 read / @32 compare-and-continue pair repeats for every node0 meminfo key until HugePages_Surp is reached) ...
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:07:34.753 node0=1024 expecting 1024
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:07:34.753
00:07:34.753 real 0m8.708s
00:07:34.753 user 0m3.214s
00:07:34.753 sys 0m5.630s
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:34.753 06:23:48 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x
00:07:34.753 ************************************
00:07:34.753 END TEST no_shrink_alloc
00:07:34.753 ************************************
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:07:34.753 06:23:48 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:07:34.753
00:07:34.753 real 0m33.122s
00:07:34.753 user 0m11.273s
00:07:34.753 sys 0m20.317s
00:07:34.753 06:23:48 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:34.753 06:23:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:07:34.753 ************************************
00:07:34.753 END TEST hugepages
00:07:34.753 ************************************
00:07:34.753 06:23:48 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh
00:07:34.753 06:23:48 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:07:34.753 06:23:48 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable
00:07:34.753 06:23:48 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:07:34.753 ************************************
00:07:34.753 START TEST driver
00:07:34.753 ************************************
00:07:34.753 06:23:48 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh
00:07:35.012 * Looking for test storage...
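Before the driver tests start, clear_hp (setup/hugepages.sh@217, traced just above) walks every NUMA node and resets every hugepage size back to zero. A small sketch of that cleanup follows; the redirection target is assumed, because `set -x` only shows the bare `echo 0`, not where it is written.

    # Sketch of the per-node hugepage cleanup traced above (assumed redirection target).
    clear_hp_sketch() {
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*; do
                # Drop every reserved page of this size on this node (needs root).
                echo 0 > "$hp/nr_hugepages"
            done
        done
        # Exported as in the trace; consumed by later scripts/setup.sh invocations.
        export CLEAR_HUGE=yes
    }

On this runner the inner loop runs twice per node, presumably once for the 2048 kB and once for the 1048576 kB page size, which matches the two echo 0 entries per node in the trace.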
00:07:35.012 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:07:35.012 06:23:48 setup.sh.driver -- setup/driver.sh@68 -- # setup reset
00:07:35.012 06:23:48 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:07:35.012 06:23:48 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:07:41.575 06:23:54 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:07:41.575 06:23:54 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:07:41.575 06:23:54 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable
00:07:41.575 06:23:54 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:07:41.575 ************************************
00:07:41.575 START TEST guess_driver
00:07:41.575 ************************************
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 256 > 0 ))
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:07:41.575 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:07:41.575 Looking for driver=vfio-pci
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]]
00:07:41.575 06:23:54 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:07:44.861 06:23:58 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:07:44.861 06:23:58 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:07:44.861 06:23:58 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:07:44.861 ... (the same @58/@61/@57 marker check repeats for each remaining line of the setup.sh config output, through 00:07:45.120) ...
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 ))
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:07:47.060 06:24:00 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:07:52.328
00:07:52.328 real 0m11.684s
00:07:52.328 user 0m2.945s
00:07:52.328 sys 0m5.916s
00:07:52.328 06:24:05 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:52.328 06:24:05 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x
00:07:52.328 ************************************
00:07:52.328 END TEST guess_driver
00:07:52.328 ************************************
00:07:52.328
00:07:52.328 real 0m17.573s
00:07:52.328 user 0m4.628s
00:07:52.328 sys 0m9.295s
00:07:52.328 06:24:05 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable
06:24:05 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:07:52.328 ************************************ 00:07:52.328 END TEST driver 00:07:52.328 ************************************ 00:07:52.328 06:24:05 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:07:52.328 06:24:05 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:52.328 06:24:05 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:52.328 06:24:05 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:07:52.586 ************************************ 00:07:52.586 START TEST devices 00:07:52.586 ************************************ 00:07:52.586 06:24:05 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:07:52.586 * Looking for test storage... 00:07:52.586 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:07:52.586 06:24:05 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:07:52.586 06:24:05 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:07:52.586 06:24:05 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:07:52.586 06:24:05 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:07:57.859 06:24:10 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:07:57.859 No valid GPT data, bailing 
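Before any mount test runs, devices.sh has to decide whether nvme0n1 is actually free to use: zoned namespaces are skipped, a disk whose partition table cannot be read (the "No valid GPT data, bailing" above) is treated as idle, and anything smaller than min_disk_size (3 GiB) is rejected. A rough equivalent of that gate, using plain blkid in place of spdk-gpt.py and the same sysfs paths seen in the trace:

    #!/usr/bin/env bash
    dev=nvme0n1
    min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472 bytes, as set at devices.sh@198

    # Zoned namespaces are never used for the setup tests; a missing
    # queue/zoned file is treated as "none" (non-zoned).
    zoned=$(cat /sys/block/$dev/queue/zoned 2>/dev/null || echo none)
    [[ $zoned == none ]] || { echo "$dev is zoned, skipping"; exit 0; }

    # A readable partition table means the disk is in use by something else.
    if [[ -n $(blkid -s PTTYPE -o value /dev/$dev) ]]; then
        echo "$dev is in use, skipping"
        exit 0
    fi

    # /sys/block/<dev>/size is reported in 512-byte sectors.
    size=$(( $(cat /sys/block/$dev/size) * 512 ))
    (( size >= min_disk_size )) && echo "$dev is usable ($size bytes)"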
00:07:57.859 06:24:10 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:07:57.859 06:24:10 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:07:57.859 06:24:10 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:07:57.859 06:24:10 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:57.859 06:24:10 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:07:57.859 ************************************ 00:07:57.859 START TEST nvme_mount 00:07:57.859 ************************************ 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- 
setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:07:57.859 06:24:10 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:07:58.119 Creating new GPT entries in memory. 00:07:58.119 GPT data structures destroyed! You may now partition the disk using fdisk or 00:07:58.119 other utilities. 00:07:58.119 06:24:11 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:07:58.119 06:24:11 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:58.119 06:24:11 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:07:58.119 06:24:11 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:07:58.119 06:24:11 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:07:59.499 Creating new GPT entries in memory. 00:07:59.499 The operation has completed successfully. 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1020763 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:07:59.499 06:24:12 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.786 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:02.787 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:08:03.046 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:08:03.046 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:08:03.306 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:08:03.306 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:08:03.306 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:08:03.306 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@68 
-- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:08:03.306 06:24:16 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:08:07.500 06:24:20 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.694 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:08:11.695 06:24:24 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:11.695 06:24:25 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:08:11.695 06:24:25 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:08:11.695 06:24:25 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:08:11.695 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:08:11.695 00:08:11.695 real 0m14.360s 00:08:11.695 user 0m4.085s 00:08:11.695 sys 0m8.156s 00:08:11.695 06:24:25 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:11.695 06:24:25 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:08:11.695 ************************************ 00:08:11.695 END TEST nvme_mount 00:08:11.695 ************************************ 00:08:11.695 06:24:25 setup.sh.devices -- 
setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:08:11.695 06:24:25 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:11.695 06:24:25 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:11.695 06:24:25 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:08:11.695 ************************************ 00:08:11.695 START TEST dm_mount 00:08:11.695 ************************************ 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:08:11.695 06:24:25 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:08:12.631 Creating new GPT entries in memory. 00:08:12.631 GPT data structures destroyed! You may now partition the disk using fdisk or 00:08:12.631 other utilities. 00:08:12.631 06:24:26 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:08:12.631 06:24:26 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:08:12.631 06:24:26 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:08:12.631 06:24:26 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:08:12.631 06:24:26 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:08:13.602 Creating new GPT entries in memory. 00:08:13.602 The operation has completed successfully. 
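The dm_mount steps below repeat the partitioning dance, but with two 1 GiB partitions (sectors 2048-2099199 and 2099200-4196351) that are then stitched into a single device-mapper node called nvme_dm_test before being formatted and mounted. A condensed sketch of those steps, with the disk and partition bounds taken from the trace; the linear dm table is an assumption about how the two partitions get concatenated, and the mount point is a placeholder:

    #!/usr/bin/env bash
    disk=/dev/nvme0n1
    sgdisk "$disk" --zap-all
    flock "$disk" sgdisk "$disk" --new=1:2048:2099199      # partition 1, ~1 GiB
    flock "$disk" sgdisk "$disk" --new=2:2099200:4196351   # partition 2, ~1 GiB

    # Concatenate both partitions into one linear device-mapper target.
    p1=$(blockdev --getsz "${disk}p1")    # sizes in 512-byte sectors
    p2=$(blockdev --getsz "${disk}p2")
    printf '%s\n' "0 $p1 linear ${disk}p1 0" "$p1 $p2 linear ${disk}p2 0" \
        | dmsetup create nvme_dm_test

    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mount /dev/mapper/nvme_dm_test /path/to/dm_mount   # placeholder mount point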
00:08:13.602 06:24:27 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:08:13.602 06:24:27 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:08:13.602 06:24:27 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:08:13.602 06:24:27 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:08:13.602 06:24:27 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:08:14.981 The operation has completed successfully. 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1025927 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-2 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-2 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-2 ]] 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-2 ]] 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:08:14.981 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:08:14.982 
06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:08:14.982 06:24:28 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2 '' '' 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:08:19.172 06:24:32 
setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:08:19.172 06:24:32 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-2,holder@nvme0n1p2:dm-2, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\2\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\2* ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:08:23.359 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:08:23.359 00:08:23.359 real 0m11.690s 00:08:23.359 user 0m2.941s 00:08:23.359 sys 0m5.863s 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:23.359 06:24:36 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:08:23.359 ************************************ 00:08:23.359 END TEST dm_mount 00:08:23.359 ************************************ 00:08:23.359 06:24:36 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:08:23.359 06:24:36 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:08:23.359 06:24:36 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:08:23.359 06:24:36 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:08:23.359 06:24:36 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:08:23.359 06:24:36 
setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:08:23.359 06:24:36 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:08:23.619 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:08:23.619 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:08:23.619 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:08:23.619 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:08:23.619 06:24:37 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:08:23.619 00:08:23.619 real 0m31.211s 00:08:23.619 user 0m8.724s 00:08:23.619 sys 0m17.406s 00:08:23.619 06:24:37 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:23.619 06:24:37 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:08:23.619 ************************************ 00:08:23.619 END TEST devices 00:08:23.619 ************************************ 00:08:23.619 00:08:23.619 real 1m52.167s 00:08:23.619 user 0m34.012s 00:08:23.619 sys 1m5.750s 00:08:23.619 06:24:37 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:23.619 06:24:37 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:08:23.619 ************************************ 00:08:23.619 END TEST setup.sh 00:08:23.619 ************************************ 00:08:23.879 06:24:37 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:08:28.071 Hugepages 00:08:28.071 node hugesize free / total 00:08:28.071 node0 1048576kB 0 / 0 00:08:28.071 node0 2048kB 1024 / 1024 00:08:28.071 node1 1048576kB 0 / 0 00:08:28.071 node1 2048kB 1024 / 1024 00:08:28.071 00:08:28.071 Type BDF Vendor Device NUMA Driver Device Block devices 00:08:28.071 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:08:28.071 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:08:28.071 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:08:28.071 06:24:41 -- spdk/autotest.sh@130 -- # uname -s 00:08:28.071 06:24:41 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:08:28.071 06:24:41 -- spdk/autotest.sh@132 -- # 
nvme_namespace_revert 00:08:28.071 06:24:41 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:08:32.258 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:32.258 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:34.162 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:08:34.162 06:24:47 -- common/autotest_common.sh@1532 -- # sleep 1 00:08:35.100 06:24:48 -- common/autotest_common.sh@1533 -- # bdfs=() 00:08:35.100 06:24:48 -- common/autotest_common.sh@1533 -- # local bdfs 00:08:35.100 06:24:48 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:08:35.100 06:24:48 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:08:35.100 06:24:48 -- common/autotest_common.sh@1513 -- # bdfs=() 00:08:35.100 06:24:48 -- common/autotest_common.sh@1513 -- # local bdfs 00:08:35.100 06:24:48 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:08:35.100 06:24:48 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:08:35.100 06:24:48 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:08:35.100 06:24:48 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:08:35.100 06:24:48 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:d8:00.0 00:08:35.100 06:24:48 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:08:39.356 Waiting for block devices as requested 00:08:39.356 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:39.356 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:39.356 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:39.615 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:39.615 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:39.615 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:39.874 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:39.874 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:40.132 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:40.132 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:40.132 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:40.390 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:40.390 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:40.390 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:40.648 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:40.648 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:40.648 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:40.908 06:24:54 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:08:40.908 06:24:54 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 
00:08:40.908 06:24:54 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1502 -- # grep 0000:d8:00.0/nvme/nvme 00:08:40.908 06:24:54 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:08:40.908 06:24:54 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:08:40.908 06:24:54 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1545 -- # grep oacs 00:08:40.908 06:24:54 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:08:40.908 06:24:54 -- common/autotest_common.sh@1545 -- # oacs=' 0xe' 00:08:40.908 06:24:54 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:08:40.908 06:24:54 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:08:40.908 06:24:54 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:08:40.908 06:24:54 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:08:40.908 06:24:54 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:08:40.908 06:24:54 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:08:40.908 06:24:54 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:08:40.908 06:24:54 -- common/autotest_common.sh@1557 -- # continue 00:08:40.908 06:24:54 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:08:40.908 06:24:54 -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:40.908 06:24:54 -- common/autotest_common.sh@10 -- # set +x 00:08:40.908 06:24:54 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:08:40.908 06:24:54 -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:40.908 06:24:54 -- common/autotest_common.sh@10 -- # set +x 00:08:40.908 06:24:54 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:08:45.095 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:45.095 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:45.096 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:47.001 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:08:47.001 06:25:00 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:08:47.001 06:25:00 -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:47.001 06:25:00 -- common/autotest_common.sh@10 -- # set +x 00:08:47.001 06:25:00 -- spdk/autotest.sh@144 -- # 
opal_revert_cleanup 00:08:47.001 06:25:00 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:08:47.001 06:25:00 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:08:47.001 06:25:00 -- common/autotest_common.sh@1577 -- # bdfs=() 00:08:47.001 06:25:00 -- common/autotest_common.sh@1577 -- # local bdfs 00:08:47.001 06:25:00 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:08:47.001 06:25:00 -- common/autotest_common.sh@1513 -- # bdfs=() 00:08:47.001 06:25:00 -- common/autotest_common.sh@1513 -- # local bdfs 00:08:47.002 06:25:00 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:08:47.002 06:25:00 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:08:47.002 06:25:00 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:08:47.002 06:25:00 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:08:47.002 06:25:00 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:d8:00.0 00:08:47.002 06:25:00 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:08:47.002 06:25:00 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:08:47.002 06:25:00 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:08:47.002 06:25:00 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:08:47.002 06:25:00 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:08:47.002 06:25:00 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:d8:00.0 00:08:47.002 06:25:00 -- common/autotest_common.sh@1592 -- # [[ -z 0000:d8:00.0 ]] 00:08:47.002 06:25:00 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=1037206 00:08:47.002 06:25:00 -- common/autotest_common.sh@1598 -- # waitforlisten 1037206 00:08:47.002 06:25:00 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:47.002 06:25:00 -- common/autotest_common.sh@831 -- # '[' -z 1037206 ']' 00:08:47.002 06:25:00 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:47.002 06:25:00 -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:47.002 06:25:00 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:47.002 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:47.002 06:25:00 -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:47.002 06:25:00 -- common/autotest_common.sh@10 -- # set +x 00:08:47.002 [2024-07-25 06:25:00.404294] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:08:47.002 [2024-07-25 06:25:00.404355] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1037206 ] 00:08:47.002 [2024-07-25 06:25:00.529102] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.261 [2024-07-25 06:25:00.573765] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.829 06:25:01 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:47.829 06:25:01 -- common/autotest_common.sh@864 -- # return 0 00:08:47.829 06:25:01 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:08:47.829 06:25:01 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:08:47.829 06:25:01 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:08:51.121 nvme0n1 00:08:51.121 06:25:04 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:08:51.121 [2024-07-25 06:25:04.573287] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:08:51.121 request: 00:08:51.121 { 00:08:51.121 "nvme_ctrlr_name": "nvme0", 00:08:51.121 "password": "test", 00:08:51.121 "method": "bdev_nvme_opal_revert", 00:08:51.121 "req_id": 1 00:08:51.121 } 00:08:51.121 Got JSON-RPC error response 00:08:51.121 response: 00:08:51.121 { 00:08:51.121 "code": -32602, 00:08:51.121 "message": "Invalid parameters" 00:08:51.121 } 00:08:51.121 06:25:04 -- common/autotest_common.sh@1604 -- # true 00:08:51.121 06:25:04 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:08:51.121 06:25:04 -- common/autotest_common.sh@1608 -- # killprocess 1037206 00:08:51.121 06:25:04 -- common/autotest_common.sh@950 -- # '[' -z 1037206 ']' 00:08:51.121 06:25:04 -- common/autotest_common.sh@954 -- # kill -0 1037206 00:08:51.121 06:25:04 -- common/autotest_common.sh@955 -- # uname 00:08:51.121 06:25:04 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:51.121 06:25:04 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1037206 00:08:51.121 06:25:04 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:51.121 06:25:04 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:51.121 06:25:04 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1037206' 00:08:51.121 killing process with pid 1037206 00:08:51.121 06:25:04 -- common/autotest_common.sh@969 -- # kill 1037206 00:08:51.121 06:25:04 -- common/autotest_common.sh@974 -- # wait 1037206 00:08:53.656 06:25:07 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:08:53.656 06:25:07 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:08:53.656 06:25:07 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:08:53.656 06:25:07 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:08:53.656 06:25:07 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:08:54.594 Restarting all devices. 
00:09:01.167 lstat() error: No such file or directory 00:09:01.167 QAT Error: No GENERAL section found 00:09:01.167 Failed to configure qat_dev0 00:09:01.167 lstat() error: No such file or directory 00:09:01.167 QAT Error: No GENERAL section found 00:09:01.167 Failed to configure qat_dev1 00:09:01.167 lstat() error: No such file or directory 00:09:01.167 QAT Error: No GENERAL section found 00:09:01.167 Failed to configure qat_dev2 00:09:01.167 lstat() error: No such file or directory 00:09:01.167 QAT Error: No GENERAL section found 00:09:01.167 Failed to configure qat_dev3 00:09:01.167 lstat() error: No such file or directory 00:09:01.167 QAT Error: No GENERAL section found 00:09:01.167 Failed to configure qat_dev4 00:09:01.167 enable sriov 00:09:01.167 Checking status of all devices. 00:09:01.167 There is 5 QAT acceleration device(s) in the system: 00:09:01.167 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:1a:00.0, #accel: 5 #engines: 10 state: down 00:09:01.167 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:1c:00.0, #accel: 5 #engines: 10 state: down 00:09:01.167 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:1e:00.0, #accel: 5 #engines: 10 state: down 00:09:01.167 qat_dev3 - type: c6xx, inst_id: 3, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:09:01.167 qat_dev4 - type: c6xx, inst_id: 4, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:09:01.167 0000:1a:00.0 set to 16 VFs 00:09:01.735 0000:1c:00.0 set to 16 VFs 00:09:02.673 0000:1e:00.0 set to 16 VFs 00:09:03.302 0000:3d:00.0 set to 16 VFs 00:09:04.240 0000:3f:00.0 set to 16 VFs 00:09:06.768 Properly configured the qat device with driver uio_pci_generic. 00:09:06.768 06:25:19 -- spdk/autotest.sh@162 -- # timing_enter lib 00:09:06.768 06:25:19 -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:06.768 06:25:19 -- common/autotest_common.sh@10 -- # set +x 00:09:06.768 06:25:19 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:09:06.768 06:25:19 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:09:06.768 06:25:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:06.768 06:25:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:06.768 06:25:19 -- common/autotest_common.sh@10 -- # set +x 00:09:06.768 ************************************ 00:09:06.768 START TEST env 00:09:06.768 ************************************ 00:09:06.768 06:25:19 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:09:06.768 * Looking for test storage... 
00:09:06.768 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:09:06.768 06:25:20 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:09:06.768 06:25:20 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:06.768 06:25:20 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:06.768 06:25:20 env -- common/autotest_common.sh@10 -- # set +x 00:09:06.768 ************************************ 00:09:06.768 START TEST env_memory 00:09:06.768 ************************************ 00:09:06.768 06:25:20 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:09:06.768 00:09:06.768 00:09:06.768 CUnit - A unit testing framework for C - Version 2.1-3 00:09:06.768 http://cunit.sourceforge.net/ 00:09:06.768 00:09:06.768 00:09:06.768 Suite: memory 00:09:06.768 Test: alloc and free memory map ...[2024-07-25 06:25:20.163160] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:09:06.768 passed 00:09:06.768 Test: mem map translation ...[2024-07-25 06:25:20.190066] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:09:06.768 [2024-07-25 06:25:20.190087] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:09:06.768 [2024-07-25 06:25:20.190143] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:09:06.768 [2024-07-25 06:25:20.190155] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:09:06.768 passed 00:09:06.768 Test: mem map registration ...[2024-07-25 06:25:20.243211] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:09:06.768 [2024-07-25 06:25:20.243232] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:09:06.768 passed 00:09:06.768 Test: mem map adjacent registrations ...passed 00:09:06.768 00:09:06.768 Run Summary: Type Total Ran Passed Failed Inactive 00:09:06.768 suites 1 1 n/a 0 0 00:09:06.768 tests 4 4 4 0 0 00:09:06.768 asserts 152 152 152 0 n/a 00:09:06.768 00:09:06.768 Elapsed time = 0.186 seconds 00:09:06.768 00:09:06.768 real 0m0.199s 00:09:06.768 user 0m0.187s 00:09:06.768 sys 0m0.011s 00:09:06.768 06:25:20 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:06.768 06:25:20 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:09:06.768 ************************************ 00:09:06.768 END TEST env_memory 00:09:06.768 ************************************ 00:09:07.027 06:25:20 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:09:07.027 06:25:20 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:07.027 06:25:20 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:07.027 06:25:20 env 
-- common/autotest_common.sh@10 -- # set +x 00:09:07.027 ************************************ 00:09:07.027 START TEST env_vtophys 00:09:07.027 ************************************ 00:09:07.027 06:25:20 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:09:07.027 EAL: lib.eal log level changed from notice to debug 00:09:07.027 EAL: Detected lcore 0 as core 0 on socket 0 00:09:07.027 EAL: Detected lcore 1 as core 1 on socket 0 00:09:07.027 EAL: Detected lcore 2 as core 2 on socket 0 00:09:07.027 EAL: Detected lcore 3 as core 3 on socket 0 00:09:07.027 EAL: Detected lcore 4 as core 4 on socket 0 00:09:07.027 EAL: Detected lcore 5 as core 5 on socket 0 00:09:07.027 EAL: Detected lcore 6 as core 6 on socket 0 00:09:07.027 EAL: Detected lcore 7 as core 8 on socket 0 00:09:07.027 EAL: Detected lcore 8 as core 9 on socket 0 00:09:07.027 EAL: Detected lcore 9 as core 10 on socket 0 00:09:07.027 EAL: Detected lcore 10 as core 11 on socket 0 00:09:07.027 EAL: Detected lcore 11 as core 12 on socket 0 00:09:07.027 EAL: Detected lcore 12 as core 13 on socket 0 00:09:07.027 EAL: Detected lcore 13 as core 14 on socket 0 00:09:07.027 EAL: Detected lcore 14 as core 16 on socket 0 00:09:07.027 EAL: Detected lcore 15 as core 17 on socket 0 00:09:07.027 EAL: Detected lcore 16 as core 18 on socket 0 00:09:07.027 EAL: Detected lcore 17 as core 19 on socket 0 00:09:07.027 EAL: Detected lcore 18 as core 20 on socket 0 00:09:07.027 EAL: Detected lcore 19 as core 21 on socket 0 00:09:07.027 EAL: Detected lcore 20 as core 22 on socket 0 00:09:07.027 EAL: Detected lcore 21 as core 24 on socket 0 00:09:07.027 EAL: Detected lcore 22 as core 25 on socket 0 00:09:07.027 EAL: Detected lcore 23 as core 26 on socket 0 00:09:07.027 EAL: Detected lcore 24 as core 27 on socket 0 00:09:07.027 EAL: Detected lcore 25 as core 28 on socket 0 00:09:07.027 EAL: Detected lcore 26 as core 29 on socket 0 00:09:07.027 EAL: Detected lcore 27 as core 30 on socket 0 00:09:07.027 EAL: Detected lcore 28 as core 0 on socket 1 00:09:07.027 EAL: Detected lcore 29 as core 1 on socket 1 00:09:07.027 EAL: Detected lcore 30 as core 2 on socket 1 00:09:07.027 EAL: Detected lcore 31 as core 3 on socket 1 00:09:07.027 EAL: Detected lcore 32 as core 4 on socket 1 00:09:07.028 EAL: Detected lcore 33 as core 5 on socket 1 00:09:07.028 EAL: Detected lcore 34 as core 6 on socket 1 00:09:07.028 EAL: Detected lcore 35 as core 8 on socket 1 00:09:07.028 EAL: Detected lcore 36 as core 9 on socket 1 00:09:07.028 EAL: Detected lcore 37 as core 10 on socket 1 00:09:07.028 EAL: Detected lcore 38 as core 11 on socket 1 00:09:07.028 EAL: Detected lcore 39 as core 12 on socket 1 00:09:07.028 EAL: Detected lcore 40 as core 13 on socket 1 00:09:07.028 EAL: Detected lcore 41 as core 14 on socket 1 00:09:07.028 EAL: Detected lcore 42 as core 16 on socket 1 00:09:07.028 EAL: Detected lcore 43 as core 17 on socket 1 00:09:07.028 EAL: Detected lcore 44 as core 18 on socket 1 00:09:07.028 EAL: Detected lcore 45 as core 19 on socket 1 00:09:07.028 EAL: Detected lcore 46 as core 20 on socket 1 00:09:07.028 EAL: Detected lcore 47 as core 21 on socket 1 00:09:07.028 EAL: Detected lcore 48 as core 22 on socket 1 00:09:07.028 EAL: Detected lcore 49 as core 24 on socket 1 00:09:07.028 EAL: Detected lcore 50 as core 25 on socket 1 00:09:07.028 EAL: Detected lcore 51 as core 26 on socket 1 00:09:07.028 EAL: Detected lcore 52 as core 27 on socket 1 00:09:07.028 EAL: Detected lcore 53 as core 28 on socket 1 
00:09:07.028 EAL: Detected lcore 54 as core 29 on socket 1 00:09:07.028 EAL: Detected lcore 55 as core 30 on socket 1 00:09:07.028 EAL: Detected lcore 56 as core 0 on socket 0 00:09:07.028 EAL: Detected lcore 57 as core 1 on socket 0 00:09:07.028 EAL: Detected lcore 58 as core 2 on socket 0 00:09:07.028 EAL: Detected lcore 59 as core 3 on socket 0 00:09:07.028 EAL: Detected lcore 60 as core 4 on socket 0 00:09:07.028 EAL: Detected lcore 61 as core 5 on socket 0 00:09:07.028 EAL: Detected lcore 62 as core 6 on socket 0 00:09:07.028 EAL: Detected lcore 63 as core 8 on socket 0 00:09:07.028 EAL: Detected lcore 64 as core 9 on socket 0 00:09:07.028 EAL: Detected lcore 65 as core 10 on socket 0 00:09:07.028 EAL: Detected lcore 66 as core 11 on socket 0 00:09:07.028 EAL: Detected lcore 67 as core 12 on socket 0 00:09:07.028 EAL: Detected lcore 68 as core 13 on socket 0 00:09:07.028 EAL: Detected lcore 69 as core 14 on socket 0 00:09:07.028 EAL: Detected lcore 70 as core 16 on socket 0 00:09:07.028 EAL: Detected lcore 71 as core 17 on socket 0 00:09:07.028 EAL: Detected lcore 72 as core 18 on socket 0 00:09:07.028 EAL: Detected lcore 73 as core 19 on socket 0 00:09:07.028 EAL: Detected lcore 74 as core 20 on socket 0 00:09:07.028 EAL: Detected lcore 75 as core 21 on socket 0 00:09:07.028 EAL: Detected lcore 76 as core 22 on socket 0 00:09:07.028 EAL: Detected lcore 77 as core 24 on socket 0 00:09:07.028 EAL: Detected lcore 78 as core 25 on socket 0 00:09:07.028 EAL: Detected lcore 79 as core 26 on socket 0 00:09:07.028 EAL: Detected lcore 80 as core 27 on socket 0 00:09:07.028 EAL: Detected lcore 81 as core 28 on socket 0 00:09:07.028 EAL: Detected lcore 82 as core 29 on socket 0 00:09:07.028 EAL: Detected lcore 83 as core 30 on socket 0 00:09:07.028 EAL: Detected lcore 84 as core 0 on socket 1 00:09:07.028 EAL: Detected lcore 85 as core 1 on socket 1 00:09:07.028 EAL: Detected lcore 86 as core 2 on socket 1 00:09:07.028 EAL: Detected lcore 87 as core 3 on socket 1 00:09:07.028 EAL: Detected lcore 88 as core 4 on socket 1 00:09:07.028 EAL: Detected lcore 89 as core 5 on socket 1 00:09:07.028 EAL: Detected lcore 90 as core 6 on socket 1 00:09:07.028 EAL: Detected lcore 91 as core 8 on socket 1 00:09:07.028 EAL: Detected lcore 92 as core 9 on socket 1 00:09:07.028 EAL: Detected lcore 93 as core 10 on socket 1 00:09:07.028 EAL: Detected lcore 94 as core 11 on socket 1 00:09:07.028 EAL: Detected lcore 95 as core 12 on socket 1 00:09:07.028 EAL: Detected lcore 96 as core 13 on socket 1 00:09:07.028 EAL: Detected lcore 97 as core 14 on socket 1 00:09:07.028 EAL: Detected lcore 98 as core 16 on socket 1 00:09:07.028 EAL: Detected lcore 99 as core 17 on socket 1 00:09:07.028 EAL: Detected lcore 100 as core 18 on socket 1 00:09:07.028 EAL: Detected lcore 101 as core 19 on socket 1 00:09:07.028 EAL: Detected lcore 102 as core 20 on socket 1 00:09:07.028 EAL: Detected lcore 103 as core 21 on socket 1 00:09:07.028 EAL: Detected lcore 104 as core 22 on socket 1 00:09:07.028 EAL: Detected lcore 105 as core 24 on socket 1 00:09:07.028 EAL: Detected lcore 106 as core 25 on socket 1 00:09:07.028 EAL: Detected lcore 107 as core 26 on socket 1 00:09:07.028 EAL: Detected lcore 108 as core 27 on socket 1 00:09:07.028 EAL: Detected lcore 109 as core 28 on socket 1 00:09:07.028 EAL: Detected lcore 110 as core 29 on socket 1 00:09:07.028 EAL: Detected lcore 111 as core 30 on socket 1 00:09:07.028 EAL: Maximum logical cores by configuration: 128 00:09:07.028 EAL: Detected CPU lcores: 112 00:09:07.028 EAL: Detected NUMA 
nodes: 2 00:09:07.028 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:09:07.028 EAL: Detected shared linkage of DPDK 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_auxiliary.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_mlx5.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_qat.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24.0 00:09:07.028 EAL: pmd.net.i40e.init log level changed from disabled to notice 00:09:07.028 EAL: pmd.net.i40e.driver log level changed from disabled to notice 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_ipsec_mb.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_mlx5.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_isal.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_mlx5.so.24.0 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_auxiliary.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_mlx5.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_common_qat.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_ipsec_mb.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_crypto_mlx5.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_isal.so 00:09:07.028 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_compress_mlx5.so 00:09:07.028 EAL: No shared files mode enabled, IPC will be disabled 00:09:07.028 EAL: No shared files mode enabled, IPC is disabled 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.0 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.1 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.2 wants IOVA as 'PA' 00:09:07.028 EAL: 
PCI driver qat for device 0000:1a:01.3 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.4 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.5 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.6 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:01.7 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.0 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.1 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.2 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.3 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.4 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.5 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.6 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1a:02.7 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.0 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.1 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.2 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.3 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.4 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.5 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.6 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:01.7 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.0 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.1 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.2 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.3 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.4 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.5 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.6 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1c:02.7 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.0 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.1 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.2 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.3 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.4 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.5 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.6 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:01.7 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.0 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.1 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.2 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.3 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.4 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.5 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.6 wants IOVA as 'PA' 00:09:07.028 EAL: PCI driver qat for device 0000:1e:02.7 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 
00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:09:07.029 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:09:07.029 EAL: Bus pci wants IOVA as 'PA' 00:09:07.029 EAL: Bus auxiliary wants IOVA as 'DC' 00:09:07.029 EAL: Bus vdev wants IOVA as 'DC' 00:09:07.029 EAL: Selected IOVA mode 'PA' 00:09:07.029 EAL: Probing VFIO support... 00:09:07.029 EAL: IOMMU type 1 (Type 1) is supported 00:09:07.029 EAL: IOMMU type 7 (sPAPR) is not supported 00:09:07.029 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:09:07.029 EAL: VFIO support initialized 00:09:07.029 EAL: Ask a virtual area of 0x2e000 bytes 00:09:07.029 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:09:07.029 EAL: Setting up physically contiguous memory... 
00:09:07.029 EAL: Setting maximum number of open files to 524288 00:09:07.029 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:09:07.029 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:09:07.029 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:09:07.029 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:09:07.029 EAL: Ask a virtual area of 0x61000 bytes 00:09:07.029 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:09:07.029 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:09:07.029 EAL: Ask a virtual area of 0x400000000 bytes 00:09:07.029 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:09:07.029 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:09:07.029 EAL: Hugepages will be freed exactly as allocated. 00:09:07.029 EAL: No shared files mode enabled, IPC is disabled 00:09:07.029 EAL: No shared files mode enabled, IPC is disabled 00:09:07.029 EAL: TSC frequency is ~2500000 KHz 00:09:07.029 EAL: Main lcore 0 is ready (tid=7f07f8658b00;cpuset=[0]) 00:09:07.029 EAL: Trying to obtain current memory policy. 00:09:07.029 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.029 EAL: Restoring previous memory policy: 0 00:09:07.029 EAL: request: mp_malloc_sync 00:09:07.029 EAL: No shared files mode enabled, IPC is disabled 00:09:07.029 EAL: Heap on socket 0 was expanded by 2MB 00:09:07.029 EAL: PCI device 0000:1a:01.0 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001000000 00:09:07.029 EAL: PCI memory mapped at 0x202001001000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.1 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001002000 00:09:07.029 EAL: PCI memory mapped at 0x202001003000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.2 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001004000 00:09:07.029 EAL: PCI memory mapped at 0x202001005000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.3 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001006000 00:09:07.029 EAL: PCI memory mapped at 0x202001007000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.4 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001008000 00:09:07.029 EAL: PCI memory mapped at 0x202001009000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.5 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x20200100a000 00:09:07.029 EAL: PCI memory mapped at 0x20200100b000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.6 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x20200100c000 00:09:07.029 EAL: PCI memory mapped at 0x20200100d000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:01.7 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x20200100e000 00:09:07.029 EAL: PCI memory mapped at 0x20200100f000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:02.0 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001010000 00:09:07.029 EAL: PCI memory mapped at 0x202001011000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:02.1 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 
EAL: PCI memory mapped at 0x202001012000 00:09:07.029 EAL: PCI memory mapped at 0x202001013000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:02.2 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001014000 00:09:07.029 EAL: PCI memory mapped at 0x202001015000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:02.3 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001016000 00:09:07.029 EAL: PCI memory mapped at 0x202001017000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:09:07.029 EAL: PCI device 0000:1a:02.4 on NUMA socket 0 00:09:07.029 EAL: probe driver: 8086:37c9 qat 00:09:07.029 EAL: PCI memory mapped at 0x202001018000 00:09:07.029 EAL: PCI memory mapped at 0x202001019000 00:09:07.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:09:07.030 EAL: PCI device 0000:1a:02.5 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200101a000 00:09:07.030 EAL: PCI memory mapped at 0x20200101b000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:09:07.030 EAL: PCI device 0000:1a:02.6 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200101c000 00:09:07.030 EAL: PCI memory mapped at 0x20200101d000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:09:07.030 EAL: PCI device 0000:1a:02.7 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200101e000 00:09:07.030 EAL: PCI memory mapped at 0x20200101f000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.0 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001020000 00:09:07.030 EAL: PCI memory mapped at 0x202001021000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.1 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001022000 00:09:07.030 EAL: PCI memory mapped at 0x202001023000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.2 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001024000 00:09:07.030 EAL: PCI memory mapped at 0x202001025000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.3 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001026000 00:09:07.030 EAL: PCI memory mapped at 0x202001027000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.4 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001028000 00:09:07.030 EAL: PCI memory mapped at 0x202001029000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.5 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 
00:09:07.030 EAL: PCI memory mapped at 0x20200102a000 00:09:07.030 EAL: PCI memory mapped at 0x20200102b000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.6 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200102c000 00:09:07.030 EAL: PCI memory mapped at 0x20200102d000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:01.7 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200102e000 00:09:07.030 EAL: PCI memory mapped at 0x20200102f000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.0 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001030000 00:09:07.030 EAL: PCI memory mapped at 0x202001031000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.1 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001032000 00:09:07.030 EAL: PCI memory mapped at 0x202001033000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.2 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001034000 00:09:07.030 EAL: PCI memory mapped at 0x202001035000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.3 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001036000 00:09:07.030 EAL: PCI memory mapped at 0x202001037000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.4 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001038000 00:09:07.030 EAL: PCI memory mapped at 0x202001039000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.5 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200103a000 00:09:07.030 EAL: PCI memory mapped at 0x20200103b000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.6 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200103c000 00:09:07.030 EAL: PCI memory mapped at 0x20200103d000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:09:07.030 EAL: PCI device 0000:1c:02.7 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200103e000 00:09:07.030 EAL: PCI memory mapped at 0x20200103f000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.0 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001040000 00:09:07.030 EAL: PCI memory mapped at 0x202001041000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.1 on NUMA socket 0 00:09:07.030 EAL: probe driver: 
8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001042000 00:09:07.030 EAL: PCI memory mapped at 0x202001043000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.2 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001044000 00:09:07.030 EAL: PCI memory mapped at 0x202001045000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.3 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001046000 00:09:07.030 EAL: PCI memory mapped at 0x202001047000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.4 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001048000 00:09:07.030 EAL: PCI memory mapped at 0x202001049000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.5 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200104a000 00:09:07.030 EAL: PCI memory mapped at 0x20200104b000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.6 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200104c000 00:09:07.030 EAL: PCI memory mapped at 0x20200104d000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:01.7 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200104e000 00:09:07.030 EAL: PCI memory mapped at 0x20200104f000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.0 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001050000 00:09:07.030 EAL: PCI memory mapped at 0x202001051000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.1 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001052000 00:09:07.030 EAL: PCI memory mapped at 0x202001053000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.2 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001054000 00:09:07.030 EAL: PCI memory mapped at 0x202001055000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.3 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001056000 00:09:07.030 EAL: PCI memory mapped at 0x202001057000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.4 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001058000 00:09:07.030 EAL: PCI memory mapped at 0x202001059000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.5 on NUMA socket 0 00:09:07.030 EAL: 
probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200105a000 00:09:07.030 EAL: PCI memory mapped at 0x20200105b000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.6 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200105c000 00:09:07.030 EAL: PCI memory mapped at 0x20200105d000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:09:07.030 EAL: PCI device 0000:1e:02.7 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x20200105e000 00:09:07.030 EAL: PCI memory mapped at 0x20200105f000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:09:07.030 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:09:07.030 EAL: probe driver: 8086:37c9 qat 00:09:07.030 EAL: PCI memory mapped at 0x202001060000 00:09:07.030 EAL: PCI memory mapped at 0x202001061000 00:09:07.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:09:07.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.030 EAL: PCI memory unmapped at 0x202001060000 00:09:07.030 EAL: PCI memory unmapped at 0x202001061000 00:09:07.031 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001062000 00:09:07.031 EAL: PCI memory mapped at 0x202001063000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001062000 00:09:07.031 EAL: PCI memory unmapped at 0x202001063000 00:09:07.031 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001064000 00:09:07.031 EAL: PCI memory mapped at 0x202001065000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001064000 00:09:07.031 EAL: PCI memory unmapped at 0x202001065000 00:09:07.031 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001066000 00:09:07.031 EAL: PCI memory mapped at 0x202001067000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001066000 00:09:07.031 EAL: PCI memory unmapped at 0x202001067000 00:09:07.031 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001068000 00:09:07.031 EAL: PCI memory mapped at 0x202001069000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001068000 00:09:07.031 EAL: PCI memory unmapped at 0x202001069000 00:09:07.031 EAL: Requested device 0000:3d:01.4 
cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x20200106a000 00:09:07.031 EAL: PCI memory mapped at 0x20200106b000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x20200106a000 00:09:07.031 EAL: PCI memory unmapped at 0x20200106b000 00:09:07.031 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x20200106c000 00:09:07.031 EAL: PCI memory mapped at 0x20200106d000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x20200106c000 00:09:07.031 EAL: PCI memory unmapped at 0x20200106d000 00:09:07.031 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x20200106e000 00:09:07.031 EAL: PCI memory mapped at 0x20200106f000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x20200106e000 00:09:07.031 EAL: PCI memory unmapped at 0x20200106f000 00:09:07.031 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001070000 00:09:07.031 EAL: PCI memory mapped at 0x202001071000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001070000 00:09:07.031 EAL: PCI memory unmapped at 0x202001071000 00:09:07.031 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001072000 00:09:07.031 EAL: PCI memory mapped at 0x202001073000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001072000 00:09:07.031 EAL: PCI memory unmapped at 0x202001073000 00:09:07.031 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001074000 00:09:07.031 EAL: PCI memory mapped at 0x202001075000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001074000 00:09:07.031 EAL: PCI memory unmapped at 0x202001075000 00:09:07.031 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001076000 00:09:07.031 EAL: PCI memory mapped at 0x202001077000 00:09:07.031 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001076000 00:09:07.031 EAL: PCI memory unmapped at 0x202001077000 00:09:07.031 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001078000 00:09:07.031 EAL: PCI memory mapped at 0x202001079000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001078000 00:09:07.031 EAL: PCI memory unmapped at 0x202001079000 00:09:07.031 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x20200107a000 00:09:07.031 EAL: PCI memory mapped at 0x20200107b000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x20200107a000 00:09:07.031 EAL: PCI memory unmapped at 0x20200107b000 00:09:07.031 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x20200107c000 00:09:07.031 EAL: PCI memory mapped at 0x20200107d000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x20200107c000 00:09:07.031 EAL: PCI memory unmapped at 0x20200107d000 00:09:07.031 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:07.031 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x20200107e000 00:09:07.031 EAL: PCI memory mapped at 0x20200107f000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x20200107e000 00:09:07.031 EAL: PCI memory unmapped at 0x20200107f000 00:09:07.031 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:07.031 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001080000 00:09:07.031 EAL: PCI memory mapped at 0x202001081000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001080000 00:09:07.031 EAL: PCI memory unmapped at 0x202001081000 00:09:07.031 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:07.031 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001082000 00:09:07.031 EAL: PCI memory mapped at 0x202001083000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001082000 00:09:07.031 EAL: PCI memory unmapped at 0x202001083000 
00:09:07.031 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:07.031 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001084000 00:09:07.031 EAL: PCI memory mapped at 0x202001085000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001084000 00:09:07.031 EAL: PCI memory unmapped at 0x202001085000 00:09:07.031 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:07.031 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001086000 00:09:07.031 EAL: PCI memory mapped at 0x202001087000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001086000 00:09:07.031 EAL: PCI memory unmapped at 0x202001087000 00:09:07.031 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:07.031 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:09:07.031 EAL: probe driver: 8086:37c9 qat 00:09:07.031 EAL: PCI memory mapped at 0x202001088000 00:09:07.031 EAL: PCI memory mapped at 0x202001089000 00:09:07.031 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:09:07.031 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.031 EAL: PCI memory unmapped at 0x202001088000 00:09:07.032 EAL: PCI memory unmapped at 0x202001089000 00:09:07.032 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x20200108a000 00:09:07.032 EAL: PCI memory mapped at 0x20200108b000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x20200108a000 00:09:07.032 EAL: PCI memory unmapped at 0x20200108b000 00:09:07.032 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x20200108c000 00:09:07.032 EAL: PCI memory mapped at 0x20200108d000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x20200108c000 00:09:07.032 EAL: PCI memory unmapped at 0x20200108d000 00:09:07.032 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x20200108e000 00:09:07.032 EAL: PCI memory mapped at 0x20200108f000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x20200108e000 00:09:07.032 EAL: PCI memory unmapped at 0x20200108f000 00:09:07.032 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x202001090000 00:09:07.032 EAL: PCI memory 
mapped at 0x202001091000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x202001090000 00:09:07.032 EAL: PCI memory unmapped at 0x202001091000 00:09:07.032 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x202001092000 00:09:07.032 EAL: PCI memory mapped at 0x202001093000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x202001092000 00:09:07.032 EAL: PCI memory unmapped at 0x202001093000 00:09:07.032 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x202001094000 00:09:07.032 EAL: PCI memory mapped at 0x202001095000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x202001094000 00:09:07.032 EAL: PCI memory unmapped at 0x202001095000 00:09:07.032 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x202001096000 00:09:07.032 EAL: PCI memory mapped at 0x202001097000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x202001096000 00:09:07.032 EAL: PCI memory unmapped at 0x202001097000 00:09:07.032 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x202001098000 00:09:07.032 EAL: PCI memory mapped at 0x202001099000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x202001098000 00:09:07.032 EAL: PCI memory unmapped at 0x202001099000 00:09:07.032 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x20200109a000 00:09:07.032 EAL: PCI memory mapped at 0x20200109b000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x20200109a000 00:09:07.032 EAL: PCI memory unmapped at 0x20200109b000 00:09:07.032 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x20200109c000 00:09:07.032 EAL: PCI memory mapped at 0x20200109d000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:09:07.032 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.032 EAL: PCI memory unmapped at 0x20200109c000 
00:09:07.032 EAL: PCI memory unmapped at 0x20200109d000 00:09:07.032 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:07.032 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:09:07.032 EAL: probe driver: 8086:37c9 qat 00:09:07.032 EAL: PCI memory mapped at 0x20200109e000 00:09:07.032 EAL: PCI memory mapped at 0x20200109f000 00:09:07.032 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:09:07.033 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:07.033 EAL: PCI memory unmapped at 0x20200109e000 00:09:07.033 EAL: PCI memory unmapped at 0x20200109f000 00:09:07.033 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:07.033 EAL: PCI device 0000:41:00.0 on NUMA socket 0 00:09:07.033 EAL: probe driver: 8086:37d2 net_i40e 00:09:07.033 EAL: Not managed by a supported kernel driver, skipped 00:09:07.033 EAL: PCI device 0000:41:00.1 on NUMA socket 0 00:09:07.033 EAL: probe driver: 8086:37d2 net_i40e 00:09:07.033 EAL: Not managed by a supported kernel driver, skipped 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: No PCI address specified using 'addr=' in: bus=pci 00:09:07.033 EAL: Mem event callback 'spdk:(nil)' registered 00:09:07.033 00:09:07.033 00:09:07.033 CUnit - A unit testing framework for C - Version 2.1-3 00:09:07.033 http://cunit.sourceforge.net/ 00:09:07.033 00:09:07.033 00:09:07.033 Suite: components_suite 00:09:07.033 Test: vtophys_malloc_test ...passed 00:09:07.033 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:09:07.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.033 EAL: Restoring previous memory policy: 4 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was expanded by 4MB 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was shrunk by 4MB 00:09:07.033 EAL: Trying to obtain current memory policy. 00:09:07.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.033 EAL: Restoring previous memory policy: 4 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was expanded by 6MB 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was shrunk by 6MB 00:09:07.033 EAL: Trying to obtain current memory policy. 00:09:07.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.033 EAL: Restoring previous memory policy: 4 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was expanded by 10MB 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was shrunk by 10MB 00:09:07.033 EAL: Trying to obtain current memory policy. 
00:09:07.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.033 EAL: Restoring previous memory policy: 4 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was expanded by 18MB 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was shrunk by 18MB 00:09:07.033 EAL: Trying to obtain current memory policy. 00:09:07.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.033 EAL: Restoring previous memory policy: 4 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was expanded by 34MB 00:09:07.033 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.033 EAL: request: mp_malloc_sync 00:09:07.033 EAL: No shared files mode enabled, IPC is disabled 00:09:07.033 EAL: Heap on socket 0 was shrunk by 34MB 00:09:07.033 EAL: Trying to obtain current memory policy. 00:09:07.033 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.291 EAL: Restoring previous memory policy: 4 00:09:07.292 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.292 EAL: request: mp_malloc_sync 00:09:07.292 EAL: No shared files mode enabled, IPC is disabled 00:09:07.292 EAL: Heap on socket 0 was expanded by 66MB 00:09:07.292 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.292 EAL: request: mp_malloc_sync 00:09:07.292 EAL: No shared files mode enabled, IPC is disabled 00:09:07.292 EAL: Heap on socket 0 was shrunk by 66MB 00:09:07.292 EAL: Trying to obtain current memory policy. 00:09:07.292 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.292 EAL: Restoring previous memory policy: 4 00:09:07.292 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.292 EAL: request: mp_malloc_sync 00:09:07.292 EAL: No shared files mode enabled, IPC is disabled 00:09:07.292 EAL: Heap on socket 0 was expanded by 130MB 00:09:07.292 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.292 EAL: request: mp_malloc_sync 00:09:07.292 EAL: No shared files mode enabled, IPC is disabled 00:09:07.292 EAL: Heap on socket 0 was shrunk by 130MB 00:09:07.292 EAL: Trying to obtain current memory policy. 00:09:07.292 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.292 EAL: Restoring previous memory policy: 4 00:09:07.292 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.292 EAL: request: mp_malloc_sync 00:09:07.292 EAL: No shared files mode enabled, IPC is disabled 00:09:07.292 EAL: Heap on socket 0 was expanded by 258MB 00:09:07.292 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.292 EAL: request: mp_malloc_sync 00:09:07.292 EAL: No shared files mode enabled, IPC is disabled 00:09:07.292 EAL: Heap on socket 0 was shrunk by 258MB 00:09:07.292 EAL: Trying to obtain current memory policy. 
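[editor's aside] The env_vtophys run above exercises DPDK's dynamic memory subsystem: the "Mem event callback 'spdk:(nil)' registered" line earlier is the callback registration, and each "Heap on socket 0 was expanded/shrunk by N MB" pair is EAL growing and trimming the per-socket heap around progressively larger allocations. As an illustrative sketch only (not SPDK's actual env_vtophys test; the callback name and size ladder are arbitrary), a standalone DPDK program that produces the same kind of callback traffic could look like this:

/* Minimal sketch, assuming a DPDK build with dynamic memory enabled:
 * register a memory event callback, then allocate and free growing
 * buffers so the per-socket heap is expanded and shrunk as in the log. */
#include <stdio.h>
#include <rte_eal.h>
#include <rte_malloc.h>
#include <rte_memory.h>

static void
mem_event_cb(enum rte_mem_event type, const void *addr, size_t len, void *arg)
{
	(void)arg;
	printf("mem event %s: addr %p len %zu\n",
	       type == RTE_MEM_EVENT_ALLOC ? "alloc" : "free", addr, len);
}

int
main(int argc, char **argv)
{
	if (rte_eal_init(argc, argv) < 0)
		return 1;

	/* Equivalent of the "Mem event callback ... registered" log line. */
	rte_mem_event_callback_register("example", mem_event_cb, NULL);

	/* Growing allocations force heap expansion; freeing each buffer lets
	 * EAL shrink the heap again, which is what emits the
	 * "expanded/shrunk by N MB" messages above. */
	for (size_t mb = 4; mb <= 1024; mb *= 2) {
		void *p = rte_malloc(NULL, mb << 20, 0);
		if (p != NULL)
			rte_free(p);
	}

	rte_eal_cleanup();
	return 0;
}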
00:09:07.292 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.550 EAL: Restoring previous memory policy: 4 00:09:07.550 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.550 EAL: request: mp_malloc_sync 00:09:07.550 EAL: No shared files mode enabled, IPC is disabled 00:09:07.550 EAL: Heap on socket 0 was expanded by 514MB 00:09:07.550 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.550 EAL: request: mp_malloc_sync 00:09:07.550 EAL: No shared files mode enabled, IPC is disabled 00:09:07.550 EAL: Heap on socket 0 was shrunk by 514MB 00:09:07.550 EAL: Trying to obtain current memory policy. 00:09:07.550 EAL: Setting policy MPOL_PREFERRED for socket 0 00:09:07.808 EAL: Restoring previous memory policy: 4 00:09:07.808 EAL: Calling mem event callback 'spdk:(nil)' 00:09:07.808 EAL: request: mp_malloc_sync 00:09:07.808 EAL: No shared files mode enabled, IPC is disabled 00:09:07.808 EAL: Heap on socket 0 was expanded by 1026MB 00:09:08.066 EAL: Calling mem event callback 'spdk:(nil)' 00:09:08.066 EAL: request: mp_malloc_sync 00:09:08.066 EAL: No shared files mode enabled, IPC is disabled 00:09:08.066 EAL: Heap on socket 0 was shrunk by 1026MB 00:09:08.066 passed 00:09:08.066 00:09:08.066 Run Summary: Type Total Ran Passed Failed Inactive 00:09:08.066 suites 1 1 n/a 0 0 00:09:08.066 tests 2 2 2 0 0 00:09:08.066 asserts 6380 6380 6380 0 n/a 00:09:08.066 00:09:08.066 Elapsed time = 1.014 seconds 00:09:08.066 EAL: No shared files mode enabled, IPC is disabled 00:09:08.066 EAL: No shared files mode enabled, IPC is disabled 00:09:08.066 EAL: No shared files mode enabled, IPC is disabled 00:09:08.066 00:09:08.066 real 0m1.221s 00:09:08.066 user 0m0.679s 00:09:08.066 sys 0m0.508s 00:09:08.066 06:25:21 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:08.066 06:25:21 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:09:08.066 ************************************ 00:09:08.066 END TEST env_vtophys 00:09:08.066 ************************************ 00:09:08.324 06:25:21 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:09:08.324 06:25:21 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:08.324 06:25:21 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:08.324 06:25:21 env -- common/autotest_common.sh@10 -- # set +x 00:09:08.324 ************************************ 00:09:08.324 START TEST env_pci 00:09:08.324 ************************************ 00:09:08.325 06:25:21 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:09:08.325 00:09:08.325 00:09:08.325 CUnit - A unit testing framework for C - Version 2.1-3 00:09:08.325 http://cunit.sourceforge.net/ 00:09:08.325 00:09:08.325 00:09:08.325 Suite: pci 00:09:08.325 Test: pci_hook ...[2024-07-25 06:25:21.720800] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1041011 has claimed it 00:09:08.325 EAL: Cannot find device (10000:00:01.0) 00:09:08.325 EAL: Failed to attach device on primary process 00:09:08.325 passed 00:09:08.325 00:09:08.325 Run Summary: Type Total Ran Passed Failed Inactive 00:09:08.325 suites 1 1 n/a 0 0 00:09:08.325 tests 1 1 1 0 0 00:09:08.325 asserts 25 25 25 0 n/a 00:09:08.325 00:09:08.325 Elapsed time = 0.046 seconds 00:09:08.325 00:09:08.325 real 0m0.075s 00:09:08.325 user 0m0.019s 
00:09:08.325 sys 0m0.055s 00:09:08.325 06:25:21 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:08.325 06:25:21 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:09:08.325 ************************************ 00:09:08.325 END TEST env_pci 00:09:08.325 ************************************ 00:09:08.325 06:25:21 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:09:08.325 06:25:21 env -- env/env.sh@15 -- # uname 00:09:08.325 06:25:21 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:09:08.325 06:25:21 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:09:08.325 06:25:21 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:09:08.325 06:25:21 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:08.325 06:25:21 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:08.325 06:25:21 env -- common/autotest_common.sh@10 -- # set +x 00:09:08.325 ************************************ 00:09:08.325 START TEST env_dpdk_post_init 00:09:08.325 ************************************ 00:09:08.325 06:25:21 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:09:08.585 EAL: Detected CPU lcores: 112 00:09:08.585 EAL: Detected NUMA nodes: 2 00:09:08.585 EAL: Detected shared linkage of DPDK 00:09:08.585 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:09:08.585 EAL: Selected IOVA mode 'PA' 00:09:08.585 EAL: VFIO support initialized 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:09:08.585 CRYPTODEV: Creating 
cryptodev 0000:1a:01.4_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:09:08.585 
CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 
0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.585 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:09:08.585 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.585 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue 
pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating 
cryptodev 0000:1e:01.4_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:09:08.586 
CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:08.586 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:09:08.586 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:09:08.586 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.586 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:08.586 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:09:08.587 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:02.1 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:08.587 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:09:08.587 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.587 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:08.587 TELEMETRY: No legacy callbacks, legacy socket not created 00:09:08.587 EAL: Using IOMMU type 1 (Type 1) 00:09:08.587 EAL: Ignore mapping IO port bar(1) 00:09:08.587 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:09:08.587 EAL: Ignore mapping IO port bar(1) 00:09:08.587 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:09:08.587 EAL: Ignore mapping IO port bar(1) 00:09:08.587 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:09:08.846 EAL: Ignore mapping IO port bar(1) 00:09:08.846 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:09:08.846 EAL: Ignore mapping IO port bar(1) 00:09:08.846 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:09:08.846 EAL: Ignore mapping IO port bar(1) 00:09:08.846 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:09:08.846 EAL: Ignore mapping IO port bar(1) 00:09:08.846 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:09:08.846 EAL: Ignore mapping IO port bar(1) 00:09:08.846 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached 
maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:09:08.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.846 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:08.846 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 
00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:08.847 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:09:08.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:08.847 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:09:08.847 EAL: 
Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:09:08.847 EAL: Ignore mapping IO port bar(1) 00:09:08.847 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:09:09.782 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:09:13.965 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:09:13.965 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001120000 00:09:13.965 Starting DPDK initialization... 00:09:13.965 Starting SPDK post initialization... 00:09:13.965 SPDK NVMe probe 00:09:13.965 Attaching to 0000:d8:00.0 00:09:13.965 Attached to 0000:d8:00.0 00:09:13.965 Cleaning up... 00:09:13.965 00:09:13.965 real 0m5.433s 00:09:13.965 user 0m3.954s 00:09:13.965 sys 0m0.537s 00:09:13.965 06:25:27 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:13.965 06:25:27 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:09:13.965 ************************************ 00:09:13.965 END TEST env_dpdk_post_init 00:09:13.965 ************************************ 00:09:13.965 06:25:27 env -- env/env.sh@26 -- # uname 00:09:13.965 06:25:27 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:09:13.965 06:25:27 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:09:13.966 06:25:27 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:13.966 06:25:27 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:13.966 06:25:27 env -- common/autotest_common.sh@10 -- # set +x 00:09:13.966 ************************************ 00:09:13.966 START TEST env_mem_callbacks 00:09:13.966 ************************************ 00:09:13.966 06:25:27 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:09:13.966 EAL: Detected CPU lcores: 112 00:09:13.966 EAL: Detected NUMA nodes: 2 00:09:13.966 EAL: Detected shared linkage of DPDK 00:09:13.966 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:09:13.966 EAL: Selected IOVA mode 'PA' 00:09:13.966 EAL: VFIO support initialized 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: 
Creating cryptodev 0000:1a:01.2_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 
00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters 
- name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.966 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:09:13.966 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:09:13.966 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max 
queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 
00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:09:13.967 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:09:13.967 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:09:13.967 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.967 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:09:13.967 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.967 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:09:13.967 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.967 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:09:13.967 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.967 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:09:13.967 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.967 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:09:13.967 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.967 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:13.967 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:09:13.967 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.6 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:13.968 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:09:13.968 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:13.968 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:13.968 TELEMETRY: No legacy callbacks, legacy socket not created 00:09:13.968 00:09:13.968 00:09:13.968 CUnit - A unit testing framework for C - Version 2.1-3 00:09:13.968 http://cunit.sourceforge.net/ 00:09:13.968 00:09:13.968 00:09:13.968 Suite: memory 00:09:13.968 Test: test ... 
00:09:13.968 register 0x200000200000 2097152 00:09:13.968 malloc 3145728 00:09:13.968 register 0x200000400000 4194304 00:09:13.968 buf 0x200000500000 len 3145728 PASSED 00:09:13.968 malloc 64 00:09:13.968 buf 0x2000004fff40 len 64 PASSED 00:09:13.968 malloc 4194304 00:09:13.968 register 0x200000800000 6291456 00:09:13.968 buf 0x200000a00000 len 4194304 PASSED 00:09:13.968 free 0x200000500000 3145728 00:09:13.968 free 0x2000004fff40 64 00:09:13.968 unregister 0x200000400000 4194304 PASSED 00:09:13.968 free 0x200000a00000 4194304 00:09:13.968 unregister 0x200000800000 6291456 PASSED 00:09:13.968 malloc 8388608 00:09:13.968 register 0x200000400000 10485760 00:09:13.968 buf 0x200000600000 len 8388608 PASSED 00:09:13.968 free 0x200000600000 8388608 00:09:13.968 unregister 0x200000400000 10485760 PASSED 00:09:13.968 passed 00:09:13.968 00:09:13.968 Run Summary: Type Total Ran Passed Failed Inactive 00:09:13.968 suites 1 1 n/a 0 0 00:09:13.968 tests 1 1 1 0 0 00:09:13.968 asserts 15 15 15 0 n/a 00:09:13.968 00:09:13.968 Elapsed time = 0.006 seconds 00:09:13.968 00:09:13.968 real 0m0.115s 00:09:13.968 user 0m0.032s 00:09:13.968 sys 0m0.082s 00:09:13.968 06:25:27 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:13.968 06:25:27 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:09:13.968 ************************************ 00:09:13.968 END TEST env_mem_callbacks 00:09:13.968 ************************************ 00:09:14.227 00:09:14.227 real 0m7.557s 00:09:14.227 user 0m5.057s 00:09:14.227 sys 0m1.561s 00:09:14.227 06:25:27 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:14.227 06:25:27 env -- common/autotest_common.sh@10 -- # set +x 00:09:14.227 ************************************ 00:09:14.227 END TEST env 00:09:14.227 ************************************ 00:09:14.227 06:25:27 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:09:14.227 06:25:27 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:14.227 06:25:27 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:14.227 06:25:27 -- common/autotest_common.sh@10 -- # set +x 00:09:14.227 ************************************ 00:09:14.227 START TEST rpc 00:09:14.227 ************************************ 00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:09:14.227 * Looking for test storage... 00:09:14.227 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:09:14.227 06:25:27 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1042183 00:09:14.227 06:25:27 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:09:14.227 06:25:27 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:09:14.227 06:25:27 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1042183 00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@831 -- # '[' -z 1042183 ']' 00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:14.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
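For orientation, the rpc suite that starts here follows the standard SPDK bring-up: launch spdk_tgt with the bdev tracepoint group enabled, then wait for its JSON-RPC socket before issuing any commands. A minimal sketch of that bring-up, assuming the workspace layout shown in this log (the polling loop and plain kill trap are simplifications; the suite's own waitforlisten helper is more thorough and retries up to 100 times):

  # Illustrative bring-up only -- the suite does this via rpc.sh and waitforlisten.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path as in this log
  SOCK=/var/tmp/spdk.sock                                # default JSON-RPC listen socket

  "$SPDK/build/bin/spdk_tgt" -e bdev &                   # -e bdev: enable the bdev tracepoint group
  spdk_pid=$!
  trap 'kill $spdk_pid' SIGINT SIGTERM EXIT

  # Wait until the target is listening instead of sleeping blindly.
  until [ -S "$SOCK" ]; do sleep 0.1; done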
00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:14.227 06:25:27 rpc -- common/autotest_common.sh@10 -- # set +x 00:09:14.487 [2024-07-25 06:25:27.818346] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:09:14.487 [2024-07-25 06:25:27.818405] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1042183 ] 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:14.487 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:14.487 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:14.487 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:14.487 [2024-07-25 06:25:27.954837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.487 [2024-07-25 06:25:27.998751] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:09:14.487 [2024-07-25 06:25:27.998798] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1042183' to capture a snapshot of events at runtime. 00:09:14.487 [2024-07-25 06:25:27.998811] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:09:14.487 [2024-07-25 06:25:27.998823] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:09:14.487 [2024-07-25 06:25:27.998833] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1042183 for offline analysis/debug. 
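Once the target is up (reactor started, trace shared memory created), the integrity test below drives it purely over JSON-RPC. A hedged sketch of the same sequence using scripts/rpc.py instead of the suite's rpc_cmd wrapper, with method names and arguments taken from the xtrace that follows (the rpc() helper and the jq calls are added here only for brevity):

  # Illustrative only -- mirrors what rpc_integrity and rpc_trace_cmd_test exercise.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  rpc() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock "$@"; }

  rpc bdev_get_bdevs | jq length                      # 0 bdevs on a fresh target
  rpc bdev_malloc_create 8 512                        # 8 MiB malloc bdev, 512 B blocks -> Malloc0
  rpc bdev_passthru_create -b Malloc0 -p Passthru0    # stack a passthru bdev on Malloc0
  rpc bdev_get_bdevs | jq length                      # now 2 bdevs, as the test asserts
  rpc bdev_passthru_delete Passthru0
  rpc bdev_malloc_delete Malloc0
  rpc trace_get_info | jq -r .tpoint_group_mask       # expect 0x8, the bdev group enabled by -e bdev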
00:09:14.487 [2024-07-25 06:25:27.998863] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.422 06:25:28 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:15.422 06:25:28 rpc -- common/autotest_common.sh@864 -- # return 0 00:09:15.422 06:25:28 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:09:15.422 06:25:28 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:09:15.422 06:25:28 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:09:15.422 06:25:28 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:09:15.422 06:25:28 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:15.422 06:25:28 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:15.422 06:25:28 rpc -- common/autotest_common.sh@10 -- # set +x 00:09:15.422 ************************************ 00:09:15.422 START TEST rpc_integrity 00:09:15.422 ************************************ 00:09:15.422 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:09:15.422 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:09:15.423 { 00:09:15.423 "name": "Malloc0", 00:09:15.423 "aliases": [ 00:09:15.423 "6e6a18e6-d344-430f-80ce-a4f3cdffa62f" 00:09:15.423 ], 00:09:15.423 "product_name": "Malloc disk", 00:09:15.423 "block_size": 512, 00:09:15.423 "num_blocks": 16384, 00:09:15.423 "uuid": "6e6a18e6-d344-430f-80ce-a4f3cdffa62f", 00:09:15.423 "assigned_rate_limits": { 00:09:15.423 "rw_ios_per_sec": 0, 00:09:15.423 "rw_mbytes_per_sec": 0, 00:09:15.423 "r_mbytes_per_sec": 0, 00:09:15.423 "w_mbytes_per_sec": 0 00:09:15.423 }, 00:09:15.423 
"claimed": false, 00:09:15.423 "zoned": false, 00:09:15.423 "supported_io_types": { 00:09:15.423 "read": true, 00:09:15.423 "write": true, 00:09:15.423 "unmap": true, 00:09:15.423 "flush": true, 00:09:15.423 "reset": true, 00:09:15.423 "nvme_admin": false, 00:09:15.423 "nvme_io": false, 00:09:15.423 "nvme_io_md": false, 00:09:15.423 "write_zeroes": true, 00:09:15.423 "zcopy": true, 00:09:15.423 "get_zone_info": false, 00:09:15.423 "zone_management": false, 00:09:15.423 "zone_append": false, 00:09:15.423 "compare": false, 00:09:15.423 "compare_and_write": false, 00:09:15.423 "abort": true, 00:09:15.423 "seek_hole": false, 00:09:15.423 "seek_data": false, 00:09:15.423 "copy": true, 00:09:15.423 "nvme_iov_md": false 00:09:15.423 }, 00:09:15.423 "memory_domains": [ 00:09:15.423 { 00:09:15.423 "dma_device_id": "system", 00:09:15.423 "dma_device_type": 1 00:09:15.423 }, 00:09:15.423 { 00:09:15.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:15.423 "dma_device_type": 2 00:09:15.423 } 00:09:15.423 ], 00:09:15.423 "driver_specific": {} 00:09:15.423 } 00:09:15.423 ]' 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.423 [2024-07-25 06:25:28.884188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:09:15.423 [2024-07-25 06:25:28.884227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:15.423 [2024-07-25 06:25:28.884246] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25da490 00:09:15.423 [2024-07-25 06:25:28.884257] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:15.423 [2024-07-25 06:25:28.885621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:15.423 [2024-07-25 06:25:28.885647] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:09:15.423 Passthru0 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:09:15.423 { 00:09:15.423 "name": "Malloc0", 00:09:15.423 "aliases": [ 00:09:15.423 "6e6a18e6-d344-430f-80ce-a4f3cdffa62f" 00:09:15.423 ], 00:09:15.423 "product_name": "Malloc disk", 00:09:15.423 "block_size": 512, 00:09:15.423 "num_blocks": 16384, 00:09:15.423 "uuid": "6e6a18e6-d344-430f-80ce-a4f3cdffa62f", 00:09:15.423 "assigned_rate_limits": { 00:09:15.423 "rw_ios_per_sec": 0, 00:09:15.423 "rw_mbytes_per_sec": 0, 00:09:15.423 "r_mbytes_per_sec": 0, 00:09:15.423 "w_mbytes_per_sec": 0 00:09:15.423 }, 00:09:15.423 "claimed": true, 00:09:15.423 "claim_type": "exclusive_write", 00:09:15.423 "zoned": false, 00:09:15.423 "supported_io_types": { 00:09:15.423 "read": true, 00:09:15.423 "write": true, 00:09:15.423 "unmap": true, 00:09:15.423 "flush": true, 
00:09:15.423 "reset": true, 00:09:15.423 "nvme_admin": false, 00:09:15.423 "nvme_io": false, 00:09:15.423 "nvme_io_md": false, 00:09:15.423 "write_zeroes": true, 00:09:15.423 "zcopy": true, 00:09:15.423 "get_zone_info": false, 00:09:15.423 "zone_management": false, 00:09:15.423 "zone_append": false, 00:09:15.423 "compare": false, 00:09:15.423 "compare_and_write": false, 00:09:15.423 "abort": true, 00:09:15.423 "seek_hole": false, 00:09:15.423 "seek_data": false, 00:09:15.423 "copy": true, 00:09:15.423 "nvme_iov_md": false 00:09:15.423 }, 00:09:15.423 "memory_domains": [ 00:09:15.423 { 00:09:15.423 "dma_device_id": "system", 00:09:15.423 "dma_device_type": 1 00:09:15.423 }, 00:09:15.423 { 00:09:15.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:15.423 "dma_device_type": 2 00:09:15.423 } 00:09:15.423 ], 00:09:15.423 "driver_specific": {} 00:09:15.423 }, 00:09:15.423 { 00:09:15.423 "name": "Passthru0", 00:09:15.423 "aliases": [ 00:09:15.423 "38317d37-dfb9-5834-9ca3-44fb888b6e93" 00:09:15.423 ], 00:09:15.423 "product_name": "passthru", 00:09:15.423 "block_size": 512, 00:09:15.423 "num_blocks": 16384, 00:09:15.423 "uuid": "38317d37-dfb9-5834-9ca3-44fb888b6e93", 00:09:15.423 "assigned_rate_limits": { 00:09:15.423 "rw_ios_per_sec": 0, 00:09:15.423 "rw_mbytes_per_sec": 0, 00:09:15.423 "r_mbytes_per_sec": 0, 00:09:15.423 "w_mbytes_per_sec": 0 00:09:15.423 }, 00:09:15.423 "claimed": false, 00:09:15.423 "zoned": false, 00:09:15.423 "supported_io_types": { 00:09:15.423 "read": true, 00:09:15.423 "write": true, 00:09:15.423 "unmap": true, 00:09:15.423 "flush": true, 00:09:15.423 "reset": true, 00:09:15.423 "nvme_admin": false, 00:09:15.423 "nvme_io": false, 00:09:15.423 "nvme_io_md": false, 00:09:15.423 "write_zeroes": true, 00:09:15.423 "zcopy": true, 00:09:15.423 "get_zone_info": false, 00:09:15.423 "zone_management": false, 00:09:15.423 "zone_append": false, 00:09:15.423 "compare": false, 00:09:15.423 "compare_and_write": false, 00:09:15.423 "abort": true, 00:09:15.423 "seek_hole": false, 00:09:15.423 "seek_data": false, 00:09:15.423 "copy": true, 00:09:15.423 "nvme_iov_md": false 00:09:15.423 }, 00:09:15.423 "memory_domains": [ 00:09:15.423 { 00:09:15.423 "dma_device_id": "system", 00:09:15.423 "dma_device_type": 1 00:09:15.423 }, 00:09:15.423 { 00:09:15.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:15.423 "dma_device_type": 2 00:09:15.423 } 00:09:15.423 ], 00:09:15.423 "driver_specific": { 00:09:15.423 "passthru": { 00:09:15.423 "name": "Passthru0", 00:09:15.423 "base_bdev_name": "Malloc0" 00:09:15.423 } 00:09:15.423 } 00:09:15.423 } 00:09:15.423 ]' 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.423 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.423 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.682 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:09:15.682 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.682 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 06:25:28 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.682 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:09:15.682 06:25:28 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:09:15.682 06:25:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:09:15.682 00:09:15.682 real 0m0.294s 00:09:15.682 user 0m0.186s 00:09:15.682 sys 0m0.054s 00:09:15.682 06:25:29 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:15.682 06:25:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 ************************************ 00:09:15.682 END TEST rpc_integrity 00:09:15.682 ************************************ 00:09:15.682 06:25:29 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:09:15.682 06:25:29 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:15.682 06:25:29 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:15.682 06:25:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 ************************************ 00:09:15.682 START TEST rpc_plugins 00:09:15.682 ************************************ 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:09:15.682 { 00:09:15.682 "name": "Malloc1", 00:09:15.682 "aliases": [ 00:09:15.682 "3f9b83a1-29bb-4b03-9f07-97917905ef72" 00:09:15.682 ], 00:09:15.682 "product_name": "Malloc disk", 00:09:15.682 "block_size": 4096, 00:09:15.682 "num_blocks": 256, 00:09:15.682 "uuid": "3f9b83a1-29bb-4b03-9f07-97917905ef72", 00:09:15.682 "assigned_rate_limits": { 00:09:15.682 "rw_ios_per_sec": 0, 00:09:15.682 "rw_mbytes_per_sec": 0, 00:09:15.682 "r_mbytes_per_sec": 0, 00:09:15.682 "w_mbytes_per_sec": 0 00:09:15.682 }, 00:09:15.682 "claimed": false, 00:09:15.682 "zoned": false, 00:09:15.682 "supported_io_types": { 00:09:15.682 "read": true, 00:09:15.682 "write": true, 00:09:15.682 "unmap": true, 00:09:15.682 "flush": true, 00:09:15.682 "reset": true, 00:09:15.682 "nvme_admin": false, 00:09:15.682 "nvme_io": false, 00:09:15.682 "nvme_io_md": false, 00:09:15.682 "write_zeroes": true, 00:09:15.682 "zcopy": true, 00:09:15.682 "get_zone_info": false, 00:09:15.682 "zone_management": false, 00:09:15.682 "zone_append": false, 00:09:15.682 "compare": false, 00:09:15.682 "compare_and_write": false, 00:09:15.682 "abort": true, 00:09:15.682 "seek_hole": false, 00:09:15.682 "seek_data": false, 00:09:15.682 "copy": true, 00:09:15.682 "nvme_iov_md": false 
00:09:15.682 }, 00:09:15.682 "memory_domains": [ 00:09:15.682 { 00:09:15.682 "dma_device_id": "system", 00:09:15.682 "dma_device_type": 1 00:09:15.682 }, 00:09:15.682 { 00:09:15.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:15.682 "dma_device_type": 2 00:09:15.682 } 00:09:15.682 ], 00:09:15.682 "driver_specific": {} 00:09:15.682 } 00:09:15.682 ]' 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:09:15.682 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:09:15.682 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:09:15.941 06:25:29 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:09:15.941 00:09:15.941 real 0m0.148s 00:09:15.941 user 0m0.094s 00:09:15.941 sys 0m0.025s 00:09:15.941 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:15.941 06:25:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:09:15.941 ************************************ 00:09:15.941 END TEST rpc_plugins 00:09:15.941 ************************************ 00:09:15.941 06:25:29 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:09:15.941 06:25:29 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:15.941 06:25:29 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:15.941 06:25:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:09:15.941 ************************************ 00:09:15.941 START TEST rpc_trace_cmd_test 00:09:15.941 ************************************ 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:09:15.941 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1042183", 00:09:15.941 "tpoint_group_mask": "0x8", 00:09:15.941 "iscsi_conn": { 00:09:15.941 "mask": "0x2", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "scsi": { 00:09:15.941 "mask": "0x4", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "bdev": { 00:09:15.941 "mask": "0x8", 00:09:15.941 "tpoint_mask": "0xffffffffffffffff" 00:09:15.941 }, 00:09:15.941 "nvmf_rdma": { 00:09:15.941 "mask": "0x10", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "nvmf_tcp": { 00:09:15.941 "mask": "0x20", 00:09:15.941 
"tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "ftl": { 00:09:15.941 "mask": "0x40", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "blobfs": { 00:09:15.941 "mask": "0x80", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "dsa": { 00:09:15.941 "mask": "0x200", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "thread": { 00:09:15.941 "mask": "0x400", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "nvme_pcie": { 00:09:15.941 "mask": "0x800", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "iaa": { 00:09:15.941 "mask": "0x1000", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "nvme_tcp": { 00:09:15.941 "mask": "0x2000", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "bdev_nvme": { 00:09:15.941 "mask": "0x4000", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 }, 00:09:15.941 "sock": { 00:09:15.941 "mask": "0x8000", 00:09:15.941 "tpoint_mask": "0x0" 00:09:15.941 } 00:09:15.941 }' 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:09:15.941 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:09:16.199 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:09:16.199 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:09:16.199 06:25:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:09:16.199 00:09:16.199 real 0m0.234s 00:09:16.199 user 0m0.188s 00:09:16.199 sys 0m0.040s 00:09:16.199 06:25:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:16.199 06:25:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:09:16.199 ************************************ 00:09:16.199 END TEST rpc_trace_cmd_test 00:09:16.199 ************************************ 00:09:16.199 06:25:29 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:09:16.199 06:25:29 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:09:16.199 06:25:29 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:09:16.199 06:25:29 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:16.199 06:25:29 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:16.199 06:25:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:09:16.199 ************************************ 00:09:16.199 START TEST rpc_daemon_integrity 00:09:16.199 ************************************ 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:09:16.199 
06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.199 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.200 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.200 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:09:16.200 { 00:09:16.200 "name": "Malloc2", 00:09:16.200 "aliases": [ 00:09:16.200 "2b09038c-a4cb-4271-92fb-ac1fac5e9a98" 00:09:16.200 ], 00:09:16.200 "product_name": "Malloc disk", 00:09:16.200 "block_size": 512, 00:09:16.200 "num_blocks": 16384, 00:09:16.200 "uuid": "2b09038c-a4cb-4271-92fb-ac1fac5e9a98", 00:09:16.200 "assigned_rate_limits": { 00:09:16.200 "rw_ios_per_sec": 0, 00:09:16.200 "rw_mbytes_per_sec": 0, 00:09:16.200 "r_mbytes_per_sec": 0, 00:09:16.200 "w_mbytes_per_sec": 0 00:09:16.200 }, 00:09:16.200 "claimed": false, 00:09:16.200 "zoned": false, 00:09:16.200 "supported_io_types": { 00:09:16.200 "read": true, 00:09:16.200 "write": true, 00:09:16.200 "unmap": true, 00:09:16.200 "flush": true, 00:09:16.200 "reset": true, 00:09:16.200 "nvme_admin": false, 00:09:16.200 "nvme_io": false, 00:09:16.200 "nvme_io_md": false, 00:09:16.200 "write_zeroes": true, 00:09:16.200 "zcopy": true, 00:09:16.200 "get_zone_info": false, 00:09:16.200 "zone_management": false, 00:09:16.200 "zone_append": false, 00:09:16.200 "compare": false, 00:09:16.200 "compare_and_write": false, 00:09:16.200 "abort": true, 00:09:16.200 "seek_hole": false, 00:09:16.200 "seek_data": false, 00:09:16.200 "copy": true, 00:09:16.200 "nvme_iov_md": false 00:09:16.200 }, 00:09:16.200 "memory_domains": [ 00:09:16.200 { 00:09:16.200 "dma_device_id": "system", 00:09:16.200 "dma_device_type": 1 00:09:16.200 }, 00:09:16.200 { 00:09:16.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:16.200 "dma_device_type": 2 00:09:16.200 } 00:09:16.200 ], 00:09:16.200 "driver_specific": {} 00:09:16.200 } 00:09:16.200 ]' 00:09:16.200 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:09:16.458 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:09:16.458 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.459 [2024-07-25 06:25:29.794737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:09:16.459 [2024-07-25 06:25:29.794777] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:16.459 [2024-07-25 06:25:29.794796] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25de400 00:09:16.459 [2024-07-25 06:25:29.794812] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:09:16.459 [2024-07-25 06:25:29.796082] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:16.459 [2024-07-25 06:25:29.796110] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:09:16.459 Passthru0 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:09:16.459 { 00:09:16.459 "name": "Malloc2", 00:09:16.459 "aliases": [ 00:09:16.459 "2b09038c-a4cb-4271-92fb-ac1fac5e9a98" 00:09:16.459 ], 00:09:16.459 "product_name": "Malloc disk", 00:09:16.459 "block_size": 512, 00:09:16.459 "num_blocks": 16384, 00:09:16.459 "uuid": "2b09038c-a4cb-4271-92fb-ac1fac5e9a98", 00:09:16.459 "assigned_rate_limits": { 00:09:16.459 "rw_ios_per_sec": 0, 00:09:16.459 "rw_mbytes_per_sec": 0, 00:09:16.459 "r_mbytes_per_sec": 0, 00:09:16.459 "w_mbytes_per_sec": 0 00:09:16.459 }, 00:09:16.459 "claimed": true, 00:09:16.459 "claim_type": "exclusive_write", 00:09:16.459 "zoned": false, 00:09:16.459 "supported_io_types": { 00:09:16.459 "read": true, 00:09:16.459 "write": true, 00:09:16.459 "unmap": true, 00:09:16.459 "flush": true, 00:09:16.459 "reset": true, 00:09:16.459 "nvme_admin": false, 00:09:16.459 "nvme_io": false, 00:09:16.459 "nvme_io_md": false, 00:09:16.459 "write_zeroes": true, 00:09:16.459 "zcopy": true, 00:09:16.459 "get_zone_info": false, 00:09:16.459 "zone_management": false, 00:09:16.459 "zone_append": false, 00:09:16.459 "compare": false, 00:09:16.459 "compare_and_write": false, 00:09:16.459 "abort": true, 00:09:16.459 "seek_hole": false, 00:09:16.459 "seek_data": false, 00:09:16.459 "copy": true, 00:09:16.459 "nvme_iov_md": false 00:09:16.459 }, 00:09:16.459 "memory_domains": [ 00:09:16.459 { 00:09:16.459 "dma_device_id": "system", 00:09:16.459 "dma_device_type": 1 00:09:16.459 }, 00:09:16.459 { 00:09:16.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:16.459 "dma_device_type": 2 00:09:16.459 } 00:09:16.459 ], 00:09:16.459 "driver_specific": {} 00:09:16.459 }, 00:09:16.459 { 00:09:16.459 "name": "Passthru0", 00:09:16.459 "aliases": [ 00:09:16.459 "7d575f4c-db61-55db-89c5-1fb15950097b" 00:09:16.459 ], 00:09:16.459 "product_name": "passthru", 00:09:16.459 "block_size": 512, 00:09:16.459 "num_blocks": 16384, 00:09:16.459 "uuid": "7d575f4c-db61-55db-89c5-1fb15950097b", 00:09:16.459 "assigned_rate_limits": { 00:09:16.459 "rw_ios_per_sec": 0, 00:09:16.459 "rw_mbytes_per_sec": 0, 00:09:16.459 "r_mbytes_per_sec": 0, 00:09:16.459 "w_mbytes_per_sec": 0 00:09:16.459 }, 00:09:16.459 "claimed": false, 00:09:16.459 "zoned": false, 00:09:16.459 "supported_io_types": { 00:09:16.459 "read": true, 00:09:16.459 "write": true, 00:09:16.459 "unmap": true, 00:09:16.459 "flush": true, 00:09:16.459 "reset": true, 00:09:16.459 "nvme_admin": false, 00:09:16.459 "nvme_io": false, 00:09:16.459 "nvme_io_md": false, 00:09:16.459 "write_zeroes": true, 00:09:16.459 "zcopy": true, 00:09:16.459 "get_zone_info": false, 00:09:16.459 "zone_management": false, 00:09:16.459 "zone_append": false, 00:09:16.459 "compare": false, 00:09:16.459 "compare_and_write": false, 
00:09:16.459 "abort": true, 00:09:16.459 "seek_hole": false, 00:09:16.459 "seek_data": false, 00:09:16.459 "copy": true, 00:09:16.459 "nvme_iov_md": false 00:09:16.459 }, 00:09:16.459 "memory_domains": [ 00:09:16.459 { 00:09:16.459 "dma_device_id": "system", 00:09:16.459 "dma_device_type": 1 00:09:16.459 }, 00:09:16.459 { 00:09:16.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:16.459 "dma_device_type": 2 00:09:16.459 } 00:09:16.459 ], 00:09:16.459 "driver_specific": { 00:09:16.459 "passthru": { 00:09:16.459 "name": "Passthru0", 00:09:16.459 "base_bdev_name": "Malloc2" 00:09:16.459 } 00:09:16.459 } 00:09:16.459 } 00:09:16.459 ]' 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:09:16.459 00:09:16.459 real 0m0.289s 00:09:16.459 user 0m0.185s 00:09:16.459 sys 0m0.055s 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:16.459 06:25:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:09:16.459 ************************************ 00:09:16.459 END TEST rpc_daemon_integrity 00:09:16.459 ************************************ 00:09:16.459 06:25:29 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:09:16.459 06:25:29 rpc -- rpc/rpc.sh@84 -- # killprocess 1042183 00:09:16.459 06:25:29 rpc -- common/autotest_common.sh@950 -- # '[' -z 1042183 ']' 00:09:16.459 06:25:29 rpc -- common/autotest_common.sh@954 -- # kill -0 1042183 00:09:16.459 06:25:29 rpc -- common/autotest_common.sh@955 -- # uname 00:09:16.459 06:25:29 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:16.459 06:25:29 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1042183 00:09:16.718 06:25:30 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:16.718 06:25:30 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:16.718 06:25:30 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1042183' 00:09:16.718 killing process with pid 1042183 00:09:16.718 06:25:30 
rpc -- common/autotest_common.sh@969 -- # kill 1042183 00:09:16.718 06:25:30 rpc -- common/autotest_common.sh@974 -- # wait 1042183 00:09:16.977 00:09:16.977 real 0m2.727s 00:09:16.977 user 0m3.472s 00:09:16.977 sys 0m0.893s 00:09:16.977 06:25:30 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:16.977 06:25:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:09:16.977 ************************************ 00:09:16.977 END TEST rpc 00:09:16.977 ************************************ 00:09:16.977 06:25:30 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:09:16.977 06:25:30 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:16.977 06:25:30 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:16.977 06:25:30 -- common/autotest_common.sh@10 -- # set +x 00:09:16.977 ************************************ 00:09:16.977 START TEST skip_rpc 00:09:16.977 ************************************ 00:09:16.977 06:25:30 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:09:17.236 * Looking for test storage... 00:09:17.236 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:09:17.236 06:25:30 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:09:17.236 06:25:30 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:09:17.236 06:25:30 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:09:17.236 06:25:30 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:17.236 06:25:30 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:17.236 06:25:30 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:17.236 ************************************ 00:09:17.236 START TEST skip_rpc 00:09:17.236 ************************************ 00:09:17.236 06:25:30 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:09:17.236 06:25:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1042886 00:09:17.236 06:25:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:09:17.236 06:25:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:09:17.236 06:25:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:09:17.236 [2024-07-25 06:25:30.657996] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
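The skip_rpc case starting here boots the target with --no-rpc-server and then expects any RPC call to fail. Stripped of the test harness, the check amounts to roughly the following sketch; the flags and the 5-second sleep are taken from the output, while the harness helpers (NOT, killprocess) are replaced by plain shell and paths are shown relative to the SPDK tree:

  ./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  spdk_pid=$!
  sleep 5                                       # the test sleeps instead of waiting for an RPC socket
  if ./scripts/rpc.py spdk_get_version; then    # must fail: no RPC server was started
      echo "unexpected: spdk_get_version succeeded" >&2
      exit 1
  fi
  kill "$spdk_pid"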
00:09:17.236 [2024-07-25 06:25:30.658052] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1042886 ] 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.236 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:17.236 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:17.237 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:17.237 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:17.237 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:17.495 [2024-07-25 06:25:30.795207] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.495 [2024-07-25 06:25:30.839077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:22.762 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1042886 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 1042886 ']' 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 1042886 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o 
comm= 1042886 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1042886' 00:09:22.763 killing process with pid 1042886 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 1042886 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 1042886 00:09:22.763 00:09:22.763 real 0m5.381s 00:09:22.763 user 0m5.033s 00:09:22.763 sys 0m0.365s 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:22.763 06:25:35 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:22.763 ************************************ 00:09:22.763 END TEST skip_rpc 00:09:22.763 ************************************ 00:09:22.763 06:25:36 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:09:22.763 06:25:36 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:22.763 06:25:36 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:22.763 06:25:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:22.763 ************************************ 00:09:22.763 START TEST skip_rpc_with_json 00:09:22.763 ************************************ 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1043880 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1043880 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 1043880 ']' 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:22.763 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:09:22.763 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:09:22.763 [2024-07-25 06:25:36.115137] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
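skip_rpc_with_json, whose startup begins here, saves the live configuration to JSON and then restarts the target from that file. The essential flow, with the commands as they appear further below and paths shortened to the SPDK tree, looks roughly like this sketch:

  ./scripts/rpc.py nvmf_create_transport -t tcp         # the test first shows nvmf_get_transports failing, then creates the transport
  ./scripts/rpc.py save_config > config.json            # snapshot every subsystem's configuration
  # stop the first target here (the harness uses killprocess $spdk_pid)
  ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --json config.json > log.txt 2>&1 &
  sleep 5
  grep -q 'TCP Transport Init' log.txt                  # the TCP transport must be recreated from the saved JSON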
00:09:22.763 [2024-07-25 06:25:36.115200] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1043880 ] 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.6 cannot be used 00:09:22.763 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:22.763 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:22.763 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:22.763 [2024-07-25 06:25:36.252454] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.763 [2024-07-25 06:25:36.297465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:09:23.771 [2024-07-25 06:25:36.992999] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:09:23.771 request: 00:09:23.771 { 00:09:23.771 "trtype": "tcp", 00:09:23.771 "method": "nvmf_get_transports", 00:09:23.771 "req_id": 1 00:09:23.771 } 00:09:23.771 Got JSON-RPC error response 00:09:23.771 response: 00:09:23.771 { 00:09:23.771 "code": -19, 00:09:23.771 "message": "No such device" 00:09:23.771 } 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:23.771 06:25:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:09:23.771 [2024-07-25 06:25:37.001133] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:23.771 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:23.771 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:09:23.771 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:23.771 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:09:23.771 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:23.771 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:09:23.771 { 00:09:23.771 "subsystems": [ 00:09:23.771 { 00:09:23.771 "subsystem": "keyring", 00:09:23.771 "config": [] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "iobuf", 00:09:23.771 "config": [ 00:09:23.771 { 00:09:23.771 "method": "iobuf_set_options", 00:09:23.771 "params": { 00:09:23.771 "small_pool_count": 8192, 00:09:23.771 "large_pool_count": 1024, 00:09:23.771 "small_bufsize": 8192, 00:09:23.771 "large_bufsize": 135168 00:09:23.771 } 00:09:23.771 } 00:09:23.771 ] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "sock", 00:09:23.771 "config": [ 00:09:23.771 { 00:09:23.771 "method": "sock_set_default_impl", 00:09:23.771 "params": { 00:09:23.771 "impl_name": "posix" 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "sock_impl_set_options", 00:09:23.771 "params": { 00:09:23.771 "impl_name": "ssl", 00:09:23.771 "recv_buf_size": 4096, 00:09:23.771 "send_buf_size": 4096, 00:09:23.771 "enable_recv_pipe": true, 00:09:23.771 "enable_quickack": false, 00:09:23.771 "enable_placement_id": 0, 00:09:23.771 "enable_zerocopy_send_server": true, 00:09:23.771 "enable_zerocopy_send_client": false, 00:09:23.771 "zerocopy_threshold": 0, 00:09:23.771 "tls_version": 0, 00:09:23.771 "enable_ktls": false 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "sock_impl_set_options", 00:09:23.771 "params": { 00:09:23.771 "impl_name": "posix", 00:09:23.771 "recv_buf_size": 2097152, 00:09:23.771 "send_buf_size": 2097152, 00:09:23.771 "enable_recv_pipe": true, 00:09:23.771 "enable_quickack": false, 00:09:23.771 "enable_placement_id": 0, 00:09:23.771 "enable_zerocopy_send_server": true, 00:09:23.771 "enable_zerocopy_send_client": false, 00:09:23.771 "zerocopy_threshold": 0, 00:09:23.771 "tls_version": 0, 00:09:23.771 "enable_ktls": false 00:09:23.771 } 00:09:23.771 } 00:09:23.771 ] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "vmd", 00:09:23.771 "config": [] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "accel", 00:09:23.771 "config": [ 00:09:23.771 { 00:09:23.771 "method": "accel_set_options", 00:09:23.771 "params": { 00:09:23.771 "small_cache_size": 128, 00:09:23.771 "large_cache_size": 16, 00:09:23.771 "task_count": 2048, 00:09:23.771 "sequence_count": 2048, 00:09:23.771 "buf_count": 2048 00:09:23.771 } 00:09:23.771 } 00:09:23.771 ] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "bdev", 00:09:23.771 "config": [ 00:09:23.771 { 00:09:23.771 "method": "bdev_set_options", 00:09:23.771 "params": { 00:09:23.771 "bdev_io_pool_size": 65535, 00:09:23.771 "bdev_io_cache_size": 256, 00:09:23.771 "bdev_auto_examine": true, 00:09:23.771 "iobuf_small_cache_size": 128, 00:09:23.771 "iobuf_large_cache_size": 16 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "bdev_raid_set_options", 00:09:23.771 "params": { 00:09:23.771 "process_window_size_kb": 1024, 00:09:23.771 "process_max_bandwidth_mb_sec": 0 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "bdev_iscsi_set_options", 00:09:23.771 "params": { 00:09:23.771 "timeout_sec": 30 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "bdev_nvme_set_options", 00:09:23.771 "params": { 00:09:23.771 "action_on_timeout": "none", 00:09:23.771 "timeout_us": 0, 00:09:23.771 "timeout_admin_us": 0, 00:09:23.771 "keep_alive_timeout_ms": 10000, 00:09:23.771 "arbitration_burst": 0, 00:09:23.771 "low_priority_weight": 0, 00:09:23.771 "medium_priority_weight": 0, 00:09:23.771 
"high_priority_weight": 0, 00:09:23.771 "nvme_adminq_poll_period_us": 10000, 00:09:23.771 "nvme_ioq_poll_period_us": 0, 00:09:23.771 "io_queue_requests": 0, 00:09:23.771 "delay_cmd_submit": true, 00:09:23.771 "transport_retry_count": 4, 00:09:23.771 "bdev_retry_count": 3, 00:09:23.771 "transport_ack_timeout": 0, 00:09:23.771 "ctrlr_loss_timeout_sec": 0, 00:09:23.771 "reconnect_delay_sec": 0, 00:09:23.771 "fast_io_fail_timeout_sec": 0, 00:09:23.771 "disable_auto_failback": false, 00:09:23.771 "generate_uuids": false, 00:09:23.771 "transport_tos": 0, 00:09:23.771 "nvme_error_stat": false, 00:09:23.771 "rdma_srq_size": 0, 00:09:23.771 "io_path_stat": false, 00:09:23.771 "allow_accel_sequence": false, 00:09:23.771 "rdma_max_cq_size": 0, 00:09:23.771 "rdma_cm_event_timeout_ms": 0, 00:09:23.771 "dhchap_digests": [ 00:09:23.771 "sha256", 00:09:23.771 "sha384", 00:09:23.771 "sha512" 00:09:23.771 ], 00:09:23.771 "dhchap_dhgroups": [ 00:09:23.771 "null", 00:09:23.771 "ffdhe2048", 00:09:23.771 "ffdhe3072", 00:09:23.771 "ffdhe4096", 00:09:23.771 "ffdhe6144", 00:09:23.771 "ffdhe8192" 00:09:23.771 ] 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "bdev_nvme_set_hotplug", 00:09:23.771 "params": { 00:09:23.771 "period_us": 100000, 00:09:23.771 "enable": false 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "bdev_wait_for_examine" 00:09:23.771 } 00:09:23.771 ] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "scsi", 00:09:23.771 "config": null 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "scheduler", 00:09:23.771 "config": [ 00:09:23.771 { 00:09:23.771 "method": "framework_set_scheduler", 00:09:23.771 "params": { 00:09:23.771 "name": "static" 00:09:23.771 } 00:09:23.771 } 00:09:23.771 ] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "vhost_scsi", 00:09:23.771 "config": [] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "vhost_blk", 00:09:23.771 "config": [] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "ublk", 00:09:23.771 "config": [] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "nbd", 00:09:23.771 "config": [] 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "subsystem": "nvmf", 00:09:23.771 "config": [ 00:09:23.771 { 00:09:23.771 "method": "nvmf_set_config", 00:09:23.771 "params": { 00:09:23.771 "discovery_filter": "match_any", 00:09:23.771 "admin_cmd_passthru": { 00:09:23.771 "identify_ctrlr": false 00:09:23.771 } 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "nvmf_set_max_subsystems", 00:09:23.771 "params": { 00:09:23.771 "max_subsystems": 1024 00:09:23.771 } 00:09:23.771 }, 00:09:23.771 { 00:09:23.771 "method": "nvmf_set_crdt", 00:09:23.771 "params": { 00:09:23.772 "crdt1": 0, 00:09:23.772 "crdt2": 0, 00:09:23.772 "crdt3": 0 00:09:23.772 } 00:09:23.772 }, 00:09:23.772 { 00:09:23.772 "method": "nvmf_create_transport", 00:09:23.772 "params": { 00:09:23.772 "trtype": "TCP", 00:09:23.772 "max_queue_depth": 128, 00:09:23.772 "max_io_qpairs_per_ctrlr": 127, 00:09:23.772 "in_capsule_data_size": 4096, 00:09:23.772 "max_io_size": 131072, 00:09:23.772 "io_unit_size": 131072, 00:09:23.772 "max_aq_depth": 128, 00:09:23.772 "num_shared_buffers": 511, 00:09:23.772 "buf_cache_size": 4294967295, 00:09:23.772 "dif_insert_or_strip": false, 00:09:23.772 "zcopy": false, 00:09:23.772 "c2h_success": true, 00:09:23.772 "sock_priority": 0, 00:09:23.772 "abort_timeout_sec": 1, 00:09:23.772 "ack_timeout": 0, 00:09:23.772 "data_wr_pool_size": 0 00:09:23.772 } 00:09:23.772 } 00:09:23.772 ] 00:09:23.772 }, 
00:09:23.772 { 00:09:23.772 "subsystem": "iscsi", 00:09:23.772 "config": [ 00:09:23.772 { 00:09:23.772 "method": "iscsi_set_options", 00:09:23.772 "params": { 00:09:23.772 "node_base": "iqn.2016-06.io.spdk", 00:09:23.772 "max_sessions": 128, 00:09:23.772 "max_connections_per_session": 2, 00:09:23.772 "max_queue_depth": 64, 00:09:23.772 "default_time2wait": 2, 00:09:23.772 "default_time2retain": 20, 00:09:23.772 "first_burst_length": 8192, 00:09:23.772 "immediate_data": true, 00:09:23.772 "allow_duplicated_isid": false, 00:09:23.772 "error_recovery_level": 0, 00:09:23.772 "nop_timeout": 60, 00:09:23.772 "nop_in_interval": 30, 00:09:23.772 "disable_chap": false, 00:09:23.772 "require_chap": false, 00:09:23.772 "mutual_chap": false, 00:09:23.772 "chap_group": 0, 00:09:23.772 "max_large_datain_per_connection": 64, 00:09:23.772 "max_r2t_per_connection": 4, 00:09:23.772 "pdu_pool_size": 36864, 00:09:23.772 "immediate_data_pool_size": 16384, 00:09:23.772 "data_out_pool_size": 2048 00:09:23.772 } 00:09:23.772 } 00:09:23.772 ] 00:09:23.772 } 00:09:23.772 ] 00:09:23.772 } 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1043880 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1043880 ']' 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1043880 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1043880 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1043880' 00:09:23.772 killing process with pid 1043880 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1043880 00:09:23.772 06:25:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1043880 00:09:24.030 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1044128 00:09:24.030 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:09:24.030 06:25:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1044128 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1044128 ']' 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1044128 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1044128 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- 
common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1044128' 00:09:29.293 killing process with pid 1044128 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1044128 00:09:29.293 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1044128 00:09:29.552 06:25:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:09:29.553 00:09:29.553 real 0m6.861s 00:09:29.553 user 0m6.583s 00:09:29.553 sys 0m0.792s 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:09:29.553 ************************************ 00:09:29.553 END TEST skip_rpc_with_json 00:09:29.553 ************************************ 00:09:29.553 06:25:42 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:09:29.553 06:25:42 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:29.553 06:25:42 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:29.553 06:25:42 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:29.553 ************************************ 00:09:29.553 START TEST skip_rpc_with_delay 00:09:29.553 ************************************ 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:29.553 06:25:42 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:09:29.553 [2024-07-25 06:25:43.067779] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:09:29.553 [2024-07-25 06:25:43.067872] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:29.553 00:09:29.553 real 0m0.091s 00:09:29.553 user 0m0.058s 00:09:29.553 sys 0m0.032s 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:29.553 06:25:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:09:29.553 ************************************ 00:09:29.553 END TEST skip_rpc_with_delay 00:09:29.553 ************************************ 00:09:29.812 06:25:43 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:09:29.812 06:25:43 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:09:29.812 06:25:43 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:09:29.812 06:25:43 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:29.812 06:25:43 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:29.812 06:25:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:29.812 ************************************ 00:09:29.812 START TEST exit_on_failed_rpc_init 00:09:29.812 ************************************ 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1045088 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1045088 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 1045088 ']' 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:29.812 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:29.812 06:25:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:09:29.812 [2024-07-25 06:25:43.248684] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
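The skip_rpc_with_delay case that finished just above reduces to a single negative check: spdk_tgt must refuse --wait-for-rpc when no RPC server is going to be started. Written out without the NOT helper, with the flags exactly as logged and the binary path relative to the SPDK tree:

  if ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo "spdk_tgt should have rejected --wait-for-rpc without an RPC server" >&2
      exit 1
  fi
  # expected error: Cannot use '--wait-for-rpc' if no RPC server is going to be started.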
00:09:29.812 [2024-07-25 06:25:43.248744] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1045088 ] 00:09:29.812 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device cannot be used (this message pair repeats for every QAT VF from 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7) 00:09:30.071 [2024-07-25 06:25:43.386405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.071 [2024-07-25 06:25:43.428239] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:30.639 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:30.640 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:09:30.640 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:09:30.898 [2024-07-25 06:25:44.209809] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:09:30.898 [2024-07-25 06:25:44.209870] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1045351 ] 00:09:30.898 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device cannot be used (this message pair repeats for every QAT VF from 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7) 00:09:30.898 [2024-07-25 06:25:44.333982] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.898 [2024-07-25 06:25:44.377500] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:30.898 [2024-07-25 06:25:44.377581] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
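The error above is the point of the exit_on_failed_rpc_init case: the first spdk_tgt already owns the default RPC socket /var/tmp/spdk.sock, so the second instance cannot bind it and the app stops with a non-zero status. A rough standalone sketch of the same collision (the harness itself polls with waitforlisten and wraps the second launch in NOT rather than sleeping; passing -r would give the second instance its own socket and avoid the conflict):

  SPDK_TGT=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
  "$SPDK_TGT" -m 0x1 &           # first target, listens on /var/tmp/spdk.sock
  first_pid=$!
  sleep 2                        # crude wait for the socket to appear
  "$SPDK_TGT" -m 0x2             # second target: rpc listen fails, exits non-zero
  echo "second instance exit code: $?"
  kill -SIGINT "$first_pid"
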
00:09:30.898 [2024-07-25 06:25:44.377596] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:09:30.898 [2024-07-25 06:25:44.377607] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:09:30.898 06:25:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1045088 00:09:31.156 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 1045088 ']' 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 1045088 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1045088 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1045088' 00:09:31.157 killing process with pid 1045088 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 1045088 00:09:31.157 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 1045088 00:09:31.415 00:09:31.415 real 0m1.644s 00:09:31.415 user 0m1.818s 00:09:31.415 sys 0m0.597s 00:09:31.415 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:31.415 06:25:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:09:31.415 ************************************ 00:09:31.415 END TEST exit_on_failed_rpc_init 00:09:31.415 ************************************ 00:09:31.415 06:25:44 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:09:31.415 00:09:31.415 real 0m14.418s 00:09:31.415 user 0m13.653s 00:09:31.415 sys 0m2.104s 00:09:31.415 06:25:44 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:31.415 06:25:44 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:31.415 ************************************ 00:09:31.415 END TEST skip_rpc 00:09:31.415 ************************************ 00:09:31.415 06:25:44 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:09:31.415 06:25:44 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:31.415 06:25:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:31.415 06:25:44 -- 
common/autotest_common.sh@10 -- # set +x 00:09:31.415 ************************************ 00:09:31.415 START TEST rpc_client 00:09:31.415 ************************************ 00:09:31.415 06:25:44 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:09:31.674 * Looking for test storage... 00:09:31.674 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:09:31.674 06:25:45 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:09:31.674 OK 00:09:31.674 06:25:45 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:09:31.674 00:09:31.674 real 0m0.142s 00:09:31.674 user 0m0.062s 00:09:31.674 sys 0m0.090s 00:09:31.674 06:25:45 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:31.674 06:25:45 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:09:31.674 ************************************ 00:09:31.674 END TEST rpc_client 00:09:31.674 ************************************ 00:09:31.674 06:25:45 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:09:31.674 06:25:45 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:31.674 06:25:45 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:31.674 06:25:45 -- common/autotest_common.sh@10 -- # set +x 00:09:31.674 ************************************ 00:09:31.674 START TEST json_config 00:09:31.674 ************************************ 00:09:31.674 06:25:45 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:09:31.933 06:25:45 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@7 -- # uname -s 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:09:31.933 06:25:45 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:09:31.934 06:25:45 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:31.934 06:25:45 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:31.934 06:25:45 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:31.934 06:25:45 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:31.934 06:25:45 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:31.934 06:25:45 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:31.934 06:25:45 json_config -- paths/export.sh@5 -- # export PATH 00:09:31.934 06:25:45 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@47 -- # : 0 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:09:31.934 06:25:45 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST 
+ SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:09:31.934 INFO: JSON configuration test init 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:31.934 06:25:45 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:09:31.934 06:25:45 json_config -- json_config/common.sh@9 -- # local app=target 00:09:31.934 06:25:45 json_config -- json_config/common.sh@10 -- # shift 00:09:31.934 06:25:45 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:09:31.934 06:25:45 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:09:31.934 06:25:45 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:09:31.934 06:25:45 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:31.934 06:25:45 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:31.934 06:25:45 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1045640 00:09:31.934 06:25:45 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:09:31.934 Waiting for target to run... 
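json_config_test_start_app is launching spdk_tgt with --wait-for-rpc (pid 1045640 recorded above); the waitforlisten step that follows blocks until the RPC socket answers. A simplified bash sketch of that wait (socket path and the 100-retry budget are taken from this trace; using rpc_get_methods as the liveness probe is an assumption about the helper's internals):

  RPC_PY=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk_tgt.sock
  for ((i = 0; i < 100; i++)); do
      if "$RPC_PY" -s "$sock" rpc_get_methods >/dev/null 2>&1; then
          echo "target is listening on $sock"
          break
      fi
      sleep 0.5
  done
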
00:09:31.934 06:25:45 json_config -- json_config/common.sh@25 -- # waitforlisten 1045640 /var/tmp/spdk_tgt.sock 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@831 -- # '[' -z 1045640 ']' 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:09:31.934 06:25:45 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:31.934 06:25:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:31.934 [2024-07-25 06:25:45.359612] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:09:31.934 [2024-07-25 06:25:45.359679] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1045640 ] 00:09:32.193 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device cannot be used (this message pair repeats for every QAT VF from 0000:3d:01.0 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7) 00:09:32.193 [2024-07-25 06:25:45.717265] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.193 [2024-07-25 06:25:45.743233] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.759 06:25:46 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:32.759 06:25:46 json_config -- common/autotest_common.sh@864 -- # return 0 00:09:32.759 06:25:46 json_config -- json_config/common.sh@26 -- # echo '' 00:09:32.759 00:09:32.759 06:25:46 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:09:32.759 06:25:46 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:09:32.759 06:25:46 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:32.759 06:25:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:32.759 06:25:46 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:09:32.759 06:25:46 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:09:32.759 06:25:46 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:09:33.017 06:25:46 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:09:33.017 06:25:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:09:33.275 [2024-07-25 06:25:46.690081] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:09:33.275 06:25:46 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:09:33.275 06:25:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:09:33.533 [2024-07-25 06:25:46.914662] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:09:33.533 06:25:46 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:09:33.533 06:25:46 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:33.533 06:25:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:33.533 06:25:46 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:09:33.533 06:25:46 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:09:33.533 06:25:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:09:33.790 [2024-07-25 06:25:47.207747] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:09:39.056 06:25:52 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:39.056 06:25:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:09:39.056 06:25:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:09:39.056 06:25:52 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@48 -- # local get_types 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@51 -- # sort 00:09:39.315 
06:25:52 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:09:39.315 06:25:52 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:39.315 06:25:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@59 -- # return 0 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:09:39.315 06:25:52 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:39.315 06:25:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:09:39.315 06:25:52 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:09:39.315 06:25:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:09:39.573 06:25:52 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:09:39.574 06:25:52 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:39.574 06:25:52 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:39.574 06:25:52 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:09:39.574 06:25:52 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:09:39.574 06:25:52 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:09:39.574 06:25:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:09:39.832 Nvme0n1p0 Nvme0n1p1 00:09:39.832 06:25:53 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:09:39.832 06:25:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:09:39.832 [2024-07-25 06:25:53.354623] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:09:39.832 [2024-07-25 06:25:53.354669] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:09:39.832 00:09:39.832 06:25:53 
json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:09:39.832 06:25:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:09:40.089 Malloc3 00:09:40.089 06:25:53 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:09:40.089 06:25:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:09:40.347 [2024-07-25 06:25:53.807902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:40.347 [2024-07-25 06:25:53.807945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:40.347 [2024-07-25 06:25:53.807966] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2796740 00:09:40.347 [2024-07-25 06:25:53.807979] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:40.347 [2024-07-25 06:25:53.809385] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:40.347 [2024-07-25 06:25:53.809411] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:09:40.348 PTBdevFromMalloc3 00:09:40.348 06:25:53 json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:09:40.348 06:25:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:09:40.606 Null0 00:09:40.606 06:25:54 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:09:40.606 06:25:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:09:40.865 Malloc0 00:09:40.865 06:25:54 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:09:40.865 06:25:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:09:41.124 Malloc1 00:09:41.124 06:25:54 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:09:41.124 06:25:54 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:09:41.382 102400+0 records in 00:09:41.382 102400+0 records out 00:09:41.382 104857600 bytes (105 MB, 100 MiB) copied, 0.284321 s, 369 MB/s 00:09:41.382 06:25:54 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:09:41.382 06:25:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:09:41.641 aio_disk 00:09:41.641 06:25:55 json_config -- json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:09:41.641 06:25:55 
json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:09:41.641 06:25:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:09:45.857 55b8aa78-2725-4d73-9d6f-92d5b561e9a5 00:09:45.857 06:25:59 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:09:45.857 06:25:59 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:09:45.857 06:25:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:09:45.857 06:25:59 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:09:45.857 06:25:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:09:46.115 06:25:59 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:09:46.115 06:25:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:09:46.373 06:25:59 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:09:46.373 06:25:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:09:46.632 06:26:00 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:09:46.632 06:26:00 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:09:46.632 06:26:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:09:46.890 MallocForCryptoBdev 00:09:46.890 06:26:00 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:09:46.890 06:26:00 json_config -- json_config/json_config.sh@163 -- # wc -l 00:09:46.890 06:26:00 json_config -- json_config/json_config.sh@163 -- # [[ 5 -eq 0 ]] 00:09:46.890 06:26:00 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:09:46.890 06:26:00 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:09:46.890 06:26:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:09:47.148 [2024-07-25 06:26:00.553726] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:09:47.148 CryptoMallocBdev 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@173 -- # 
expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:975b150d-40a7-4acc-be95-2589d099af2e bdev_register:45598aaa-bde8-479d-8118-d41400e1df60 bdev_register:8c205ad0-6d7b-43c1-90ee-9f411b1dc220 bdev_register:0520dee5-fc6a-4fe8-8be0-77d44b9b855f bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:975b150d-40a7-4acc-be95-2589d099af2e bdev_register:45598aaa-bde8-479d-8118-d41400e1df60 bdev_register:8c205ad0-6d7b-43c1-90ee-9f411b1dc220 bdev_register:0520dee5-fc6a-4fe8-8be0-77d44b9b855f bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@75 -- # sort 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@76 -- # sort 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:09:47.148 06:26:00 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:09:47.148 06:26:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:09:47.407 06:26:00 json_config -- 
json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Null0 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:975b150d-40a7-4acc-be95-2589d099af2e 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:45598aaa-bde8-479d-8118-d41400e1df60 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:8c205ad0-6d7b-43c1-90ee-9f411b1dc220 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- 
json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:0520dee5-fc6a-4fe8-8be0-77d44b9b855f 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:0520dee5-fc6a-4fe8-8be0-77d44b9b855f bdev_register:45598aaa-bde8-479d-8118-d41400e1df60 bdev_register:8c205ad0-6d7b-43c1-90ee-9f411b1dc220 bdev_register:975b150d-40a7-4acc-be95-2589d099af2e bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\5\2\0\d\e\e\5\-\f\c\6\a\-\4\f\e\8\-\8\b\e\0\-\7\7\d\4\4\b\9\b\8\5\5\f\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\5\5\9\8\a\a\a\-\b\d\e\8\-\4\7\9\d\-\8\1\1\8\-\d\4\1\4\0\0\e\1\d\f\6\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\c\2\0\5\a\d\0\-\6\d\7\b\-\4\3\c\1\-\9\0\e\e\-\9\f\4\1\1\b\1\d\c\2\2\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\7\5\b\1\5\0\d\-\4\0\a\7\-\4\a\c\c\-\b\e\9\5\-\2\5\8\9\d\0\9\9\a\f\2\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@90 -- # cat 00:09:47.407 06:26:00 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:0520dee5-fc6a-4fe8-8be0-77d44b9b855f bdev_register:45598aaa-bde8-479d-8118-d41400e1df60 bdev_register:8c205ad0-6d7b-43c1-90ee-9f411b1dc220 bdev_register:975b150d-40a7-4acc-be95-2589d099af2e bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:09:47.407 Expected events matched: 00:09:47.407 bdev_register:0520dee5-fc6a-4fe8-8be0-77d44b9b855f 00:09:47.407 
bdev_register:45598aaa-bde8-479d-8118-d41400e1df60 00:09:47.407 bdev_register:8c205ad0-6d7b-43c1-90ee-9f411b1dc220 00:09:47.407 bdev_register:975b150d-40a7-4acc-be95-2589d099af2e 00:09:47.407 bdev_register:aio_disk 00:09:47.407 bdev_register:CryptoMallocBdev 00:09:47.407 bdev_register:Malloc0 00:09:47.407 bdev_register:Malloc0p0 00:09:47.407 bdev_register:Malloc0p1 00:09:47.407 bdev_register:Malloc0p2 00:09:47.408 bdev_register:Malloc1 00:09:47.408 bdev_register:Malloc3 00:09:47.408 bdev_register:MallocForCryptoBdev 00:09:47.408 bdev_register:Null0 00:09:47.408 bdev_register:Nvme0n1 00:09:47.408 bdev_register:Nvme0n1p0 00:09:47.408 bdev_register:Nvme0n1p1 00:09:47.408 bdev_register:PTBdevFromMalloc3 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:09:47.408 06:26:00 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:47.408 06:26:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:09:47.408 06:26:00 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:47.408 06:26:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:09:47.408 06:26:00 json_config -- json_config/json_config.sh@304 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:09:47.408 06:26:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:09:47.666 MallocBdevForConfigChangeCheck 00:09:47.666 06:26:01 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:09:47.666 06:26:01 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:47.666 06:26:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:47.666 06:26:01 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:09:47.666 06:26:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:48.233 06:26:01 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:09:48.233 INFO: shutting down applications... 
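For reference, the config-change sentinel created just above can be reproduced by hand roughly as follows; the socket path, size in MB and block size come straight from the trace, while redirecting save_config into a file is only an assumption about how the snapshot gets kept:

    # register an 8 MB malloc bdev (512-byte blocks) whose presence a later diff should notice
    scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck
    # snapshot the live target configuration for the comparison that follows
    scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config > /tmp/running_config.json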
00:09:48.233 06:26:01 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:09:48.233 06:26:01 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:09:48.233 06:26:01 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:09:48.233 06:26:01 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:09:48.233 [2024-07-25 06:26:01.745386] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:09:51.516 Calling clear_iscsi_subsystem 00:09:51.516 Calling clear_nvmf_subsystem 00:09:51.516 Calling clear_nbd_subsystem 00:09:51.516 Calling clear_ublk_subsystem 00:09:51.516 Calling clear_vhost_blk_subsystem 00:09:51.516 Calling clear_vhost_scsi_subsystem 00:09:51.516 Calling clear_bdev_subsystem 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@347 -- # count=100 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@349 -- # break 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:09:51.516 06:26:04 json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:09:51.516 06:26:04 json_config -- json_config/common.sh@31 -- # local app=target 00:09:51.516 06:26:04 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:09:51.516 06:26:04 json_config -- json_config/common.sh@35 -- # [[ -n 1045640 ]] 00:09:51.516 06:26:04 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1045640 00:09:51.516 06:26:04 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:09:51.516 06:26:04 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:09:51.516 06:26:04 json_config -- json_config/common.sh@41 -- # kill -0 1045640 00:09:51.516 06:26:04 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:09:51.775 06:26:05 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:09:51.775 06:26:05 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:09:51.775 06:26:05 json_config -- json_config/common.sh@41 -- # kill -0 1045640 00:09:51.775 06:26:05 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:09:51.775 06:26:05 json_config -- json_config/common.sh@43 -- # break 00:09:51.775 06:26:05 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:09:51.775 06:26:05 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:09:51.775 SPDK target shutdown done 00:09:51.775 06:26:05 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:09:51.775 INFO: relaunching applications... 
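The shutdown just traced (kill -SIGINT followed by kill -0 polling) amounts to roughly the loop below; the 30 x 0.5 s budget mirrors the (( i < 30 )) and sleep 0.5 steps in the log, and the variable name is illustrative:

    kill -SIGINT "$target_pid"                        # ask spdk_tgt to shut down cleanly
    for ((i = 0; i < 30; i++)); do
        kill -0 "$target_pid" 2>/dev/null || break    # process gone: 'SPDK target shutdown done'
        sleep 0.5                                     # still running, wait and re-check
    done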
00:09:51.775 06:26:05 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:51.775 06:26:05 json_config -- json_config/common.sh@9 -- # local app=target 00:09:51.775 06:26:05 json_config -- json_config/common.sh@10 -- # shift 00:09:51.775 06:26:05 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:09:51.775 06:26:05 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:09:51.775 06:26:05 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:09:51.775 06:26:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:51.775 06:26:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:51.775 06:26:05 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1049079 00:09:51.775 06:26:05 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:09:51.775 Waiting for target to run... 00:09:51.775 06:26:05 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:51.775 06:26:05 json_config -- json_config/common.sh@25 -- # waitforlisten 1049079 /var/tmp/spdk_tgt.sock 00:09:51.775 06:26:05 json_config -- common/autotest_common.sh@831 -- # '[' -z 1049079 ']' 00:09:51.775 06:26:05 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:09:51.775 06:26:05 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:51.775 06:26:05 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:09:51.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:09:51.775 06:26:05 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:51.775 06:26:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:51.775 [2024-07-25 06:26:05.265659] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:09:51.775 [2024-07-25 06:26:05.265727] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1049079 ] 00:09:52.341 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.341 EAL: Requested device 0000:3d:01.0 cannot be used 00:09:52.341 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.341 EAL: Requested device 0000:3d:01.1 cannot be used 00:09:52.341 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.341 EAL: Requested device 0000:3d:01.2 cannot be used 00:09:52.341 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.341 EAL: Requested device 0000:3d:01.3 cannot be used 00:09:52.341 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.341 EAL: Requested device 0000:3d:01.4 cannot be used 00:09:52.341 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.341 EAL: Requested device 0000:3d:01.5 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:01.6 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:01.7 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.0 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.1 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.2 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.3 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.4 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.5 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.6 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3d:02.7 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.0 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.1 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.2 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.3 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.4 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.5 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.6 cannot be used 
00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:01.7 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.0 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.1 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.2 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.3 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.4 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.5 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.6 cannot be used 00:09:52.342 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:09:52.342 EAL: Requested device 0000:3f:02.7 cannot be used 00:09:52.342 [2024-07-25 06:26:05.791684] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.342 [2024-07-25 06:26:05.825171] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.342 [2024-07-25 06:26:05.879270] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:09:52.342 [2024-07-25 06:26:05.887305] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:09:52.342 [2024-07-25 06:26:05.895323] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:09:52.600 [2024-07-25 06:26:05.976202] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:09:55.132 [2024-07-25 06:26:08.253700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:55.132 [2024-07-25 06:26:08.253751] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:55.132 [2024-07-25 06:26:08.253765] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:55.132 [2024-07-25 06:26:08.261716] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:09:55.132 [2024-07-25 06:26:08.261746] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:09:55.132 [2024-07-25 06:26:08.269732] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:09:55.132 [2024-07-25 06:26:08.269757] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:09:55.132 [2024-07-25 06:26:08.277765] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:09:55.132 [2024-07-25 06:26:08.277793] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:09:55.132 [2024-07-25 06:26:08.277805] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:57.663 [2024-07-25 06:26:11.174530] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:57.663 [2024-07-25 06:26:11.174573] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:57.663 [2024-07-25 06:26:11.174589] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210a6b0 00:09:57.663 [2024-07-25 06:26:11.174601] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:57.663 [2024-07-25 06:26:11.174866] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:57.663 [2024-07-25 06:26:11.174882] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:09:57.922 06:26:11 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:57.922 06:26:11 json_config -- common/autotest_common.sh@864 -- # return 0 00:09:57.922 06:26:11 json_config -- json_config/common.sh@26 -- # echo '' 00:09:57.922 00:09:57.922 06:26:11 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:09:57.922 06:26:11 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:09:57.922 INFO: Checking if target configuration is the same... 00:09:57.922 06:26:11 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:57.922 06:26:11 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:09:57.922 06:26:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:57.922 + '[' 2 -ne 2 ']' 00:09:57.922 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:09:57.922 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:09:57.922 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:57.922 +++ basename /dev/fd/62 00:09:57.922 ++ mktemp /tmp/62.XXX 00:09:57.922 + tmp_file_1=/tmp/62.8GE 00:09:57.922 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:57.922 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:09:57.922 + tmp_file_2=/tmp/spdk_tgt_config.json.DPr 00:09:57.922 + ret=0 00:09:57.922 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:58.180 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:58.438 + diff -u /tmp/62.8GE /tmp/spdk_tgt_config.json.DPr 00:09:58.438 + echo 'INFO: JSON config files are the same' 00:09:58.438 INFO: JSON config files are the same 00:09:58.438 + rm /tmp/62.8GE /tmp/spdk_tgt_config.json.DPr 00:09:58.438 + exit 0 00:09:58.438 06:26:11 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:09:58.438 06:26:11 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:09:58.438 INFO: changing configuration and checking if this can be detected... 
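The "configuration is the same" check above is a normalize-and-diff; a rough equivalent, assuming config_filter.py reads JSON on stdin the way json_diff.sh drives it here, with illustrative temp-file names:

    scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config \
        | test/json_config/config_filter.py -method sort > /tmp/live.sorted     # running config, key-sorted
    test/json_config/config_filter.py -method sort < spdk_tgt_config.json > /tmp/saved.sorted
    diff -u /tmp/live.sorted /tmp/saved.sorted && echo 'INFO: JSON config files are the same'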
00:09:58.438 06:26:11 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:09:58.438 06:26:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:09:58.695 06:26:12 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:58.695 06:26:12 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:09:58.695 06:26:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:58.695 + '[' 2 -ne 2 ']' 00:09:58.695 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:09:58.695 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:09:58.695 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:58.695 +++ basename /dev/fd/62 00:09:58.695 ++ mktemp /tmp/62.XXX 00:09:58.695 + tmp_file_1=/tmp/62.TMm 00:09:58.695 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:58.695 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:09:58.695 + tmp_file_2=/tmp/spdk_tgt_config.json.yId 00:09:58.695 + ret=0 00:09:58.695 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:58.954 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:58.954 + diff -u /tmp/62.TMm /tmp/spdk_tgt_config.json.yId 00:09:58.954 + ret=1 00:09:58.954 + echo '=== Start of file: /tmp/62.TMm ===' 00:09:58.954 + cat /tmp/62.TMm 00:09:58.954 + echo '=== End of file: /tmp/62.TMm ===' 00:09:58.954 + echo '' 00:09:58.954 + echo '=== Start of file: /tmp/spdk_tgt_config.json.yId ===' 00:09:58.954 + cat /tmp/spdk_tgt_config.json.yId 00:09:58.954 + echo '=== End of file: /tmp/spdk_tgt_config.json.yId ===' 00:09:58.954 + echo '' 00:09:58.954 + rm /tmp/62.TMm /tmp/spdk_tgt_config.json.yId 00:09:58.954 + exit 1 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:09:58.954 INFO: configuration change detected. 
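The detection pass then repeats the same diff after removing the sentinel, so a non-zero diff status is the expected result; in outline, under the same assumptions as the sketch above:

    scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck
    scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config \
        | test/json_config/config_filter.py -method sort > /tmp/live.sorted
    diff -u /tmp/live.sorted /tmp/saved.sorted || echo 'INFO: configuration change detected.'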
00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:09:58.954 06:26:12 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:58.954 06:26:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@321 -- # [[ -n 1049079 ]] 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:09:58.954 06:26:12 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:58.954 06:26:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:09:58.954 06:26:12 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:09:58.954 06:26:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:09:59.212 06:26:12 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:09:59.212 06:26:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:09:59.469 06:26:12 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:09:59.469 06:26:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:09:59.726 06:26:13 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:09:59.726 06:26:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:09:59.984 06:26:13 json_config -- json_config/json_config.sh@197 -- # uname -s 00:09:59.984 06:26:13 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:09:59.984 06:26:13 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:09:59.984 06:26:13 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:09:59.984 06:26:13 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:59.984 06:26:13 json_config -- json_config/json_config.sh@327 -- # killprocess 1049079 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@950 -- # '[' -z 1049079 ']' 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@954 -- # kill -0 1049079 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@955 -- # uname 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1049079 00:09:59.984 06:26:13 json_config -- 
common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1049079' 00:09:59.984 killing process with pid 1049079 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@969 -- # kill 1049079 00:09:59.984 06:26:13 json_config -- common/autotest_common.sh@974 -- # wait 1049079 00:10:02.511 06:26:15 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:10:02.511 06:26:15 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:10:02.511 06:26:15 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:10:02.511 06:26:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:02.511 06:26:16 json_config -- json_config/json_config.sh@332 -- # return 0 00:10:02.511 06:26:16 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:10:02.511 INFO: Success 00:10:02.511 00:10:02.511 real 0m30.849s 00:10:02.511 user 0m35.283s 00:10:02.511 sys 0m4.009s 00:10:02.511 06:26:16 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:02.511 06:26:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:02.511 ************************************ 00:10:02.511 END TEST json_config 00:10:02.511 ************************************ 00:10:02.511 06:26:16 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:10:02.511 06:26:16 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:02.511 06:26:16 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:02.511 06:26:16 -- common/autotest_common.sh@10 -- # set +x 00:10:02.770 ************************************ 00:10:02.770 START TEST json_config_extra_key 00:10:02.770 ************************************ 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@17 -- # 
NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:10:02.770 06:26:16 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:10:02.770 06:26:16 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:02.770 06:26:16 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:02.770 06:26:16 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:02.770 06:26:16 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:02.770 06:26:16 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:02.770 06:26:16 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:10:02.770 06:26:16 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:10:02.770 06:26:16 
json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:10:02.770 06:26:16 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:10:02.770 INFO: launching applications... 00:10:02.770 06:26:16 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1051145 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:10:02.770 Waiting for target to run... 
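The extra_key variant drives the same helpers with a different start-up JSON; the bookkeeping traced above is plain bash associative arrays, roughly as below, where $rootdir stands in for the workspace path shown in the log:

    declare -A app_pid=([target]='')
    declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
    declare -A app_params=([target]='-m 0x1 -s 1024')
    declare -A configs_path=([target]="$rootdir/test/json_config/extra_key.json")
    # launch line mirrors the spdk_tgt invocation in the trace
    "$rootdir/build/bin/spdk_tgt" ${app_params[target]} -r "${app_socket[target]}" --json "${configs_path[target]}" &
    app_pid[target]=$!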
00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1051145 /var/tmp/spdk_tgt.sock 00:10:02.770 06:26:16 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 1051145 ']' 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:10:02.770 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:02.770 06:26:16 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:10:02.770 [2024-07-25 06:26:16.268921] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:02.770 [2024-07-25 06:26:16.268990] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1051145 ] 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested 
device 0000:3d:02.5 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:03.028 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.028 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:03.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.029 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:03.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.029 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:03.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.029 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:03.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.029 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:03.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.029 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:03.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.029 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:03.287 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.287 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:03.287 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:03.287 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:03.287 [2024-07-25 06:26:16.641616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:03.287 [2024-07-25 06:26:16.667427] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:03.889 06:26:17 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:03.889 06:26:17 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:10:03.889 00:10:03.889 06:26:17 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:10:03.889 INFO: shutting down applications... 
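waitforlisten above keeps retrying until the freshly started target answers on its RPC socket or the budget runs out; one plausible shape of that poll, with the 100 retries from max_retries in the trace (the probe command and sleep interval are assumptions, the real helper may differ):

    waitforlisten() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk_tgt.sock} i
        for ((i = 100; i > 0; i--)); do
            kill -0 "$pid" 2>/dev/null || return 1                          # target died during start-up
            scripts/rpc.py -s "$rpc_addr" rpc_get_methods &>/dev/null && return 0
            sleep 0.5
        done
        return 1
    }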
00:10:03.889 06:26:17 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1051145 ]] 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1051145 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1051145 00:10:03.889 06:26:17 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1051145 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@43 -- # break 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:10:04.147 06:26:17 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:10:04.147 SPDK target shutdown done 00:10:04.147 06:26:17 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:10:04.147 Success 00:10:04.147 00:10:04.147 real 0m1.573s 00:10:04.147 user 0m1.185s 00:10:04.147 sys 0m0.486s 00:10:04.147 06:26:17 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:04.147 06:26:17 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:10:04.147 ************************************ 00:10:04.147 END TEST json_config_extra_key 00:10:04.147 ************************************ 00:10:04.405 06:26:17 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:10:04.405 06:26:17 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:04.405 06:26:17 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:04.405 06:26:17 -- common/autotest_common.sh@10 -- # set +x 00:10:04.405 ************************************ 00:10:04.405 START TEST alias_rpc 00:10:04.405 ************************************ 00:10:04.405 06:26:17 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:10:04.405 * Looking for test storage... 
00:10:04.405 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:10:04.405 06:26:17 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:10:04.405 06:26:17 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1051496 00:10:04.405 06:26:17 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:04.405 06:26:17 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1051496 00:10:04.405 06:26:17 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 1051496 ']' 00:10:04.405 06:26:17 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:04.405 06:26:17 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:04.406 06:26:17 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:04.406 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:04.406 06:26:17 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:04.406 06:26:17 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:04.664 [2024-07-25 06:26:17.975678] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:04.664 [2024-07-25 06:26:17.975814] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1051496 ] 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number 
of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.664 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:04.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:04.665 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:04.665 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:04.665 [2024-07-25 06:26:18.194690] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:04.923 [2024-07-25 06:26:18.238095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.489 06:26:18 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:05.489 06:26:18 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:10:05.489 06:26:18 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:10:05.747 06:26:19 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1051496 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 1051496 ']' 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 1051496 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = 
Linux ']' 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1051496 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1051496' 00:10:05.748 killing process with pid 1051496 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@969 -- # kill 1051496 00:10:05.748 06:26:19 alias_rpc -- common/autotest_common.sh@974 -- # wait 1051496 00:10:06.006 00:10:06.006 real 0m1.694s 00:10:06.006 user 0m1.797s 00:10:06.006 sys 0m0.600s 00:10:06.006 06:26:19 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:06.006 06:26:19 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:06.006 ************************************ 00:10:06.006 END TEST alias_rpc 00:10:06.006 ************************************ 00:10:06.006 06:26:19 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:10:06.006 06:26:19 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:10:06.006 06:26:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:06.006 06:26:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:06.006 06:26:19 -- common/autotest_common.sh@10 -- # set +x 00:10:06.006 ************************************ 00:10:06.006 START TEST spdkcli_tcp 00:10:06.006 ************************************ 00:10:06.006 06:26:19 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:10:06.265 * Looking for test storage... 00:10:06.265 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1051935 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1051935 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 1051935 ']' 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:06.265 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
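For reference, the killprocess helper traced in the alias_rpc teardown boils down to roughly this; the uname and '[' reactor_0 = sudo ']' branch for sudo-owned processes is omitted from the sketch:

    killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                              # nothing to do if it already exited
        local process_name
        process_name=$(ps --no-headers -o comm= "$pid")        # reactor_0 in the log above
        echo "killing process with pid $pid"
        kill "$pid" && wait "$pid"                              # default SIGTERM, then reap
    }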
00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:06.265 06:26:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:10:06.265 06:26:19 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:10:06.265 [2024-07-25 06:26:19.696592] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:06.265 [2024-07-25 06:26:19.696654] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1051935 ] 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:06.265 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:06.265 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:06.265 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:06.524 [2024-07-25 06:26:19.833158] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:06.524 [2024-07-25 06:26:19.879253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:06.524 [2024-07-25 06:26:19.879260] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.090 06:26:20 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:07.090 06:26:20 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:10:07.090 06:26:20 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1051953 00:10:07.090 06:26:20 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:10:07.090 06:26:20 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:10:07.349 [ 00:10:07.349 "bdev_malloc_delete", 00:10:07.349 "bdev_malloc_create", 00:10:07.349 "bdev_null_resize", 00:10:07.349 "bdev_null_delete", 00:10:07.349 "bdev_null_create", 00:10:07.349 "bdev_nvme_cuse_unregister", 00:10:07.349 "bdev_nvme_cuse_register", 00:10:07.349 "bdev_opal_new_user", 00:10:07.349 "bdev_opal_set_lock_state", 00:10:07.349 "bdev_opal_delete", 00:10:07.349 "bdev_opal_get_info", 00:10:07.349 "bdev_opal_create", 00:10:07.349 "bdev_nvme_opal_revert", 00:10:07.349 "bdev_nvme_opal_init", 00:10:07.349 "bdev_nvme_send_cmd", 00:10:07.349 "bdev_nvme_get_path_iostat", 00:10:07.349 "bdev_nvme_get_mdns_discovery_info", 00:10:07.349 "bdev_nvme_stop_mdns_discovery", 00:10:07.349 "bdev_nvme_start_mdns_discovery", 00:10:07.349 "bdev_nvme_set_multipath_policy", 00:10:07.349 "bdev_nvme_set_preferred_path", 00:10:07.349 "bdev_nvme_get_io_paths", 00:10:07.349 "bdev_nvme_remove_error_injection", 00:10:07.349 "bdev_nvme_add_error_injection", 00:10:07.349 "bdev_nvme_get_discovery_info", 00:10:07.349 
"bdev_nvme_stop_discovery", 00:10:07.349 "bdev_nvme_start_discovery", 00:10:07.349 "bdev_nvme_get_controller_health_info", 00:10:07.349 "bdev_nvme_disable_controller", 00:10:07.349 "bdev_nvme_enable_controller", 00:10:07.349 "bdev_nvme_reset_controller", 00:10:07.349 "bdev_nvme_get_transport_statistics", 00:10:07.349 "bdev_nvme_apply_firmware", 00:10:07.349 "bdev_nvme_detach_controller", 00:10:07.349 "bdev_nvme_get_controllers", 00:10:07.349 "bdev_nvme_attach_controller", 00:10:07.349 "bdev_nvme_set_hotplug", 00:10:07.349 "bdev_nvme_set_options", 00:10:07.349 "bdev_passthru_delete", 00:10:07.349 "bdev_passthru_create", 00:10:07.349 "bdev_lvol_set_parent_bdev", 00:10:07.349 "bdev_lvol_set_parent", 00:10:07.349 "bdev_lvol_check_shallow_copy", 00:10:07.349 "bdev_lvol_start_shallow_copy", 00:10:07.349 "bdev_lvol_grow_lvstore", 00:10:07.349 "bdev_lvol_get_lvols", 00:10:07.349 "bdev_lvol_get_lvstores", 00:10:07.349 "bdev_lvol_delete", 00:10:07.349 "bdev_lvol_set_read_only", 00:10:07.349 "bdev_lvol_resize", 00:10:07.349 "bdev_lvol_decouple_parent", 00:10:07.349 "bdev_lvol_inflate", 00:10:07.349 "bdev_lvol_rename", 00:10:07.349 "bdev_lvol_clone_bdev", 00:10:07.349 "bdev_lvol_clone", 00:10:07.349 "bdev_lvol_snapshot", 00:10:07.349 "bdev_lvol_create", 00:10:07.349 "bdev_lvol_delete_lvstore", 00:10:07.349 "bdev_lvol_rename_lvstore", 00:10:07.349 "bdev_lvol_create_lvstore", 00:10:07.349 "bdev_raid_set_options", 00:10:07.349 "bdev_raid_remove_base_bdev", 00:10:07.349 "bdev_raid_add_base_bdev", 00:10:07.349 "bdev_raid_delete", 00:10:07.349 "bdev_raid_create", 00:10:07.349 "bdev_raid_get_bdevs", 00:10:07.349 "bdev_error_inject_error", 00:10:07.349 "bdev_error_delete", 00:10:07.349 "bdev_error_create", 00:10:07.349 "bdev_split_delete", 00:10:07.349 "bdev_split_create", 00:10:07.349 "bdev_delay_delete", 00:10:07.350 "bdev_delay_create", 00:10:07.350 "bdev_delay_update_latency", 00:10:07.350 "bdev_zone_block_delete", 00:10:07.350 "bdev_zone_block_create", 00:10:07.350 "blobfs_create", 00:10:07.350 "blobfs_detect", 00:10:07.350 "blobfs_set_cache_size", 00:10:07.350 "bdev_crypto_delete", 00:10:07.350 "bdev_crypto_create", 00:10:07.350 "bdev_compress_delete", 00:10:07.350 "bdev_compress_create", 00:10:07.350 "bdev_compress_get_orphans", 00:10:07.350 "bdev_aio_delete", 00:10:07.350 "bdev_aio_rescan", 00:10:07.350 "bdev_aio_create", 00:10:07.350 "bdev_ftl_set_property", 00:10:07.350 "bdev_ftl_get_properties", 00:10:07.350 "bdev_ftl_get_stats", 00:10:07.350 "bdev_ftl_unmap", 00:10:07.350 "bdev_ftl_unload", 00:10:07.350 "bdev_ftl_delete", 00:10:07.350 "bdev_ftl_load", 00:10:07.350 "bdev_ftl_create", 00:10:07.350 "bdev_virtio_attach_controller", 00:10:07.350 "bdev_virtio_scsi_get_devices", 00:10:07.350 "bdev_virtio_detach_controller", 00:10:07.350 "bdev_virtio_blk_set_hotplug", 00:10:07.350 "bdev_iscsi_delete", 00:10:07.350 "bdev_iscsi_create", 00:10:07.350 "bdev_iscsi_set_options", 00:10:07.350 "accel_error_inject_error", 00:10:07.350 "ioat_scan_accel_module", 00:10:07.350 "dsa_scan_accel_module", 00:10:07.350 "iaa_scan_accel_module", 00:10:07.350 "dpdk_cryptodev_get_driver", 00:10:07.350 "dpdk_cryptodev_set_driver", 00:10:07.350 "dpdk_cryptodev_scan_accel_module", 00:10:07.350 "compressdev_scan_accel_module", 00:10:07.350 "keyring_file_remove_key", 00:10:07.350 "keyring_file_add_key", 00:10:07.350 "keyring_linux_set_options", 00:10:07.350 "iscsi_get_histogram", 00:10:07.350 "iscsi_enable_histogram", 00:10:07.350 "iscsi_set_options", 00:10:07.350 "iscsi_get_auth_groups", 00:10:07.350 
"iscsi_auth_group_remove_secret", 00:10:07.350 "iscsi_auth_group_add_secret", 00:10:07.350 "iscsi_delete_auth_group", 00:10:07.350 "iscsi_create_auth_group", 00:10:07.350 "iscsi_set_discovery_auth", 00:10:07.350 "iscsi_get_options", 00:10:07.350 "iscsi_target_node_request_logout", 00:10:07.350 "iscsi_target_node_set_redirect", 00:10:07.350 "iscsi_target_node_set_auth", 00:10:07.350 "iscsi_target_node_add_lun", 00:10:07.350 "iscsi_get_stats", 00:10:07.350 "iscsi_get_connections", 00:10:07.350 "iscsi_portal_group_set_auth", 00:10:07.350 "iscsi_start_portal_group", 00:10:07.350 "iscsi_delete_portal_group", 00:10:07.350 "iscsi_create_portal_group", 00:10:07.350 "iscsi_get_portal_groups", 00:10:07.350 "iscsi_delete_target_node", 00:10:07.350 "iscsi_target_node_remove_pg_ig_maps", 00:10:07.350 "iscsi_target_node_add_pg_ig_maps", 00:10:07.350 "iscsi_create_target_node", 00:10:07.350 "iscsi_get_target_nodes", 00:10:07.350 "iscsi_delete_initiator_group", 00:10:07.350 "iscsi_initiator_group_remove_initiators", 00:10:07.350 "iscsi_initiator_group_add_initiators", 00:10:07.350 "iscsi_create_initiator_group", 00:10:07.350 "iscsi_get_initiator_groups", 00:10:07.350 "nvmf_set_crdt", 00:10:07.350 "nvmf_set_config", 00:10:07.350 "nvmf_set_max_subsystems", 00:10:07.350 "nvmf_stop_mdns_prr", 00:10:07.350 "nvmf_publish_mdns_prr", 00:10:07.350 "nvmf_subsystem_get_listeners", 00:10:07.350 "nvmf_subsystem_get_qpairs", 00:10:07.350 "nvmf_subsystem_get_controllers", 00:10:07.350 "nvmf_get_stats", 00:10:07.350 "nvmf_get_transports", 00:10:07.350 "nvmf_create_transport", 00:10:07.350 "nvmf_get_targets", 00:10:07.350 "nvmf_delete_target", 00:10:07.350 "nvmf_create_target", 00:10:07.350 "nvmf_subsystem_allow_any_host", 00:10:07.350 "nvmf_subsystem_remove_host", 00:10:07.350 "nvmf_subsystem_add_host", 00:10:07.350 "nvmf_ns_remove_host", 00:10:07.350 "nvmf_ns_add_host", 00:10:07.350 "nvmf_subsystem_remove_ns", 00:10:07.350 "nvmf_subsystem_add_ns", 00:10:07.350 "nvmf_subsystem_listener_set_ana_state", 00:10:07.350 "nvmf_discovery_get_referrals", 00:10:07.350 "nvmf_discovery_remove_referral", 00:10:07.350 "nvmf_discovery_add_referral", 00:10:07.350 "nvmf_subsystem_remove_listener", 00:10:07.350 "nvmf_subsystem_add_listener", 00:10:07.350 "nvmf_delete_subsystem", 00:10:07.350 "nvmf_create_subsystem", 00:10:07.350 "nvmf_get_subsystems", 00:10:07.350 "env_dpdk_get_mem_stats", 00:10:07.350 "nbd_get_disks", 00:10:07.350 "nbd_stop_disk", 00:10:07.350 "nbd_start_disk", 00:10:07.350 "ublk_recover_disk", 00:10:07.350 "ublk_get_disks", 00:10:07.350 "ublk_stop_disk", 00:10:07.350 "ublk_start_disk", 00:10:07.350 "ublk_destroy_target", 00:10:07.350 "ublk_create_target", 00:10:07.350 "virtio_blk_create_transport", 00:10:07.350 "virtio_blk_get_transports", 00:10:07.350 "vhost_controller_set_coalescing", 00:10:07.350 "vhost_get_controllers", 00:10:07.350 "vhost_delete_controller", 00:10:07.350 "vhost_create_blk_controller", 00:10:07.350 "vhost_scsi_controller_remove_target", 00:10:07.350 "vhost_scsi_controller_add_target", 00:10:07.350 "vhost_start_scsi_controller", 00:10:07.350 "vhost_create_scsi_controller", 00:10:07.350 "thread_set_cpumask", 00:10:07.350 "framework_get_governor", 00:10:07.350 "framework_get_scheduler", 00:10:07.350 "framework_set_scheduler", 00:10:07.350 "framework_get_reactors", 00:10:07.350 "thread_get_io_channels", 00:10:07.350 "thread_get_pollers", 00:10:07.350 "thread_get_stats", 00:10:07.350 "framework_monitor_context_switch", 00:10:07.350 "spdk_kill_instance", 00:10:07.350 "log_enable_timestamps", 00:10:07.350 
"log_get_flags", 00:10:07.350 "log_clear_flag", 00:10:07.350 "log_set_flag", 00:10:07.350 "log_get_level", 00:10:07.350 "log_set_level", 00:10:07.350 "log_get_print_level", 00:10:07.350 "log_set_print_level", 00:10:07.350 "framework_enable_cpumask_locks", 00:10:07.350 "framework_disable_cpumask_locks", 00:10:07.350 "framework_wait_init", 00:10:07.350 "framework_start_init", 00:10:07.350 "scsi_get_devices", 00:10:07.350 "bdev_get_histogram", 00:10:07.350 "bdev_enable_histogram", 00:10:07.350 "bdev_set_qos_limit", 00:10:07.350 "bdev_set_qd_sampling_period", 00:10:07.350 "bdev_get_bdevs", 00:10:07.350 "bdev_reset_iostat", 00:10:07.350 "bdev_get_iostat", 00:10:07.350 "bdev_examine", 00:10:07.350 "bdev_wait_for_examine", 00:10:07.350 "bdev_set_options", 00:10:07.350 "notify_get_notifications", 00:10:07.350 "notify_get_types", 00:10:07.350 "accel_get_stats", 00:10:07.350 "accel_set_options", 00:10:07.350 "accel_set_driver", 00:10:07.350 "accel_crypto_key_destroy", 00:10:07.350 "accel_crypto_keys_get", 00:10:07.350 "accel_crypto_key_create", 00:10:07.350 "accel_assign_opc", 00:10:07.350 "accel_get_module_info", 00:10:07.350 "accel_get_opc_assignments", 00:10:07.350 "vmd_rescan", 00:10:07.350 "vmd_remove_device", 00:10:07.350 "vmd_enable", 00:10:07.350 "sock_get_default_impl", 00:10:07.350 "sock_set_default_impl", 00:10:07.350 "sock_impl_set_options", 00:10:07.350 "sock_impl_get_options", 00:10:07.350 "iobuf_get_stats", 00:10:07.350 "iobuf_set_options", 00:10:07.350 "framework_get_pci_devices", 00:10:07.350 "framework_get_config", 00:10:07.350 "framework_get_subsystems", 00:10:07.350 "trace_get_info", 00:10:07.350 "trace_get_tpoint_group_mask", 00:10:07.350 "trace_disable_tpoint_group", 00:10:07.350 "trace_enable_tpoint_group", 00:10:07.350 "trace_clear_tpoint_mask", 00:10:07.350 "trace_set_tpoint_mask", 00:10:07.350 "keyring_get_keys", 00:10:07.350 "spdk_get_version", 00:10:07.350 "rpc_get_methods" 00:10:07.350 ] 00:10:07.350 06:26:20 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:10:07.350 06:26:20 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:10:07.350 06:26:20 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1051935 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 1051935 ']' 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 1051935 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:07.350 06:26:20 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1051935 00:10:07.609 06:26:20 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:07.609 06:26:20 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:07.609 06:26:20 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1051935' 00:10:07.609 killing process with pid 1051935 00:10:07.609 06:26:20 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 1051935 00:10:07.609 06:26:20 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 1051935 00:10:07.868 00:10:07.868 real 0m1.733s 00:10:07.868 user 0m3.183s 00:10:07.868 sys 0m0.592s 00:10:07.868 06:26:21 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:07.868 06:26:21 spdkcli_tcp -- 
common/autotest_common.sh@10 -- # set +x 00:10:07.868 ************************************ 00:10:07.868 END TEST spdkcli_tcp 00:10:07.868 ************************************ 00:10:07.868 06:26:21 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:10:07.868 06:26:21 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:07.868 06:26:21 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:07.868 06:26:21 -- common/autotest_common.sh@10 -- # set +x 00:10:07.868 ************************************ 00:10:07.868 START TEST dpdk_mem_utility 00:10:07.868 ************************************ 00:10:07.868 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:10:08.127 * Looking for test storage... 00:10:08.127 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:10:08.127 06:26:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:10:08.127 06:26:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1052269 00:10:08.127 06:26:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1052269 00:10:08.127 06:26:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:08.127 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 1052269 ']' 00:10:08.127 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:08.127 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:08.127 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:08.127 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:08.127 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:08.127 06:26:21 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:10:08.127 [2024-07-25 06:26:21.507976] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
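The spdkcli_tcp test above drives SPDK's JSON-RPC interface over TCP by bridging the target's default UNIX-domain socket with socat. A minimal sketch of the same flow, assuming a locally built spdk_tgt and the stock scripts/rpc.py (the paths, port 9998, and option values are taken from the log above, not a definitive recipe):

    # Start the SPDK target on cores 0-1 (mask 0x3); it serves RPC on /var/tmp/spdk.sock by default.
    ./build/bin/spdk_tgt -m 0x3 -p 0 &
    # Bridge TCP port 9998 to the UNIX-domain RPC socket, as tcp.sh does above.
    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    # List the registered RPC methods over TCP; -r sets connection retries and -t the timeout, mirroring the test settings.
    ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods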
00:10:08.127 [2024-07-25 06:26:21.508033] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1052269 ] 00:10:08.127 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.127 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:08.128 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:08.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:08.128 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:08.128 [2024-07-25 06:26:21.645407] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.387 [2024-07-25 06:26:21.689206] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.954 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:08.955 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:10:08.955 06:26:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:10:08.955 06:26:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:10:08.955 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.955 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:10:08.955 { 00:10:08.955 "filename": "/tmp/spdk_mem_dump.txt" 00:10:08.955 } 00:10:08.955 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.955 06:26:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:10:08.955 DPDK memory size 814.000000 MiB in 1 heap(s) 00:10:08.955 1 heaps totaling size 814.000000 MiB 00:10:08.955 size: 814.000000 MiB heap id: 0 00:10:08.955 end heaps---------- 00:10:08.955 8 mempools totaling size 598.116089 MiB 00:10:08.955 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:10:08.955 size: 158.602051 MiB name: PDU_data_out_Pool 00:10:08.955 size: 84.521057 MiB name: bdev_io_1052269 00:10:08.955 size: 51.011292 MiB name: evtpool_1052269 00:10:08.955 size: 50.003479 MiB name: msgpool_1052269 00:10:08.955 size: 21.763794 MiB name: PDU_Pool 00:10:08.955 size: 19.513306 MiB name: SCSI_TASK_Pool 00:10:08.955 size: 0.026123 MiB name: Session_Pool 00:10:08.955 end mempools------- 00:10:08.955 201 memzones totaling size 4.173523 MiB 00:10:08.955 size: 1.000366 MiB name: RG_ring_0_1052269 00:10:08.955 size: 1.000366 MiB name: RG_ring_1_1052269 00:10:08.955 size: 1.000366 MiB name: RG_ring_4_1052269 00:10:08.955 size: 1.000366 MiB name: RG_ring_5_1052269 00:10:08.955 size: 0.125366 MiB name: RG_ring_2_1052269 00:10:08.955 size: 0.015991 MiB name: RG_ring_3_1052269 00:10:08.955 size: 0.001160 MiB name: 
QAT_SYM_CAPA_GEN_1 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.0_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.1_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.2_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.3_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.4_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.5_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.6_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:01.7_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.0_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.1_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.2_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.3_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.4_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.5_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.6_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1a:02.7_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.0_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.1_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.2_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.3_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.4_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.5_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.6_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:01.7_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.0_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.1_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.2_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.3_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.4_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.5_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.6_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1c:02.7_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.0_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.1_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.2_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.3_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.4_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.5_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.6_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:01.7_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.0_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.1_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.2_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.3_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.4_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.5_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.6_qat 00:10:08.955 size: 0.000244 MiB name: 0000:1e:02.7_qat 00:10:08.955 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_0 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_0 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_1 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_2 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_1 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_3 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_4 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_2 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_5 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_6 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_3 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_7 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_8 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_4 
00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_9 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_10 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_5 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_11 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_12 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_6 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_13 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_14 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_7 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_15 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_16 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_8 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_17 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_18 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_9 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_19 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_20 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_10 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_21 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_22 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_11 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_23 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_24 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_12 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_25 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_26 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_13 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_27 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_28 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_14 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_29 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_30 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_15 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_31 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_32 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_16 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_33 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_34 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_17 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_35 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_36 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_18 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_37 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_38 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_19 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_39 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_40 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_20 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_41 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_42 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_21 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_43 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_44 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_22 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_45 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_46 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_23 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_47 00:10:08.955 size: 0.000122 MiB name: 
rte_cryptodev_data_48 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_24 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_49 00:10:08.955 size: 0.000122 MiB name: rte_cryptodev_data_50 00:10:08.955 size: 0.000122 MiB name: rte_compressdev_data_25 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_51 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_52 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_26 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_53 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_54 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_27 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_55 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_56 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_28 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_57 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_58 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_29 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_59 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_60 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_30 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_61 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_62 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_31 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_63 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_64 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_32 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_65 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_66 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_33 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_67 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_68 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_34 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_69 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_70 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_35 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_71 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_72 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_36 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_73 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_74 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_37 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_75 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_76 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_38 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_77 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_78 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_39 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_79 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_80 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_40 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_81 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_82 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_41 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_83 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_84 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_42 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_85 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_86 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_43 00:10:08.956 
size: 0.000122 MiB name: rte_cryptodev_data_87 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_88 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_44 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_89 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_90 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_45 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_91 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_92 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_46 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_93 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_94 00:10:08.956 size: 0.000122 MiB name: rte_compressdev_data_47 00:10:08.956 size: 0.000122 MiB name: rte_cryptodev_data_95 00:10:08.956 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:10:08.956 end memzones------- 00:10:08.956 06:26:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:10:09.219 heap id: 0 total size: 814.000000 MiB number of busy elements: 637 number of free elements: 14 00:10:09.219 list of free elements. size: 11.784302 MiB 00:10:09.219 element at address: 0x200000400000 with size: 1.999512 MiB 00:10:09.219 element at address: 0x200018e00000 with size: 0.999878 MiB 00:10:09.219 element at address: 0x200019000000 with size: 0.999878 MiB 00:10:09.219 element at address: 0x200003e00000 with size: 0.996460 MiB 00:10:09.219 element at address: 0x200031c00000 with size: 0.994446 MiB 00:10:09.219 element at address: 0x200013800000 with size: 0.978699 MiB 00:10:09.219 element at address: 0x200007000000 with size: 0.959839 MiB 00:10:09.219 element at address: 0x200019200000 with size: 0.936584 MiB 00:10:09.219 element at address: 0x20001aa00000 with size: 0.565308 MiB 00:10:09.219 element at address: 0x200003a00000 with size: 0.496887 MiB 00:10:09.219 element at address: 0x20000b200000 with size: 0.488892 MiB 00:10:09.219 element at address: 0x200000800000 with size: 0.486511 MiB 00:10:09.219 element at address: 0x200019400000 with size: 0.485657 MiB 00:10:09.219 element at address: 0x200027e00000 with size: 0.395752 MiB 00:10:09.219 list of standard malloc elements. 
size: 199.898621 MiB 00:10:09.219 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:10:09.219 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:10:09.219 element at address: 0x200018efff80 with size: 1.000122 MiB 00:10:09.219 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:10:09.219 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:10:09.219 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:10:09.219 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:10:09.219 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:10:09.219 element at address: 0x20000032c840 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003302c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000333d40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003377c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000033b240 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000033ecc0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000342740 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003461c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000349c40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000034d6c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000351140 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000354bc0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000358640 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000035c0c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000035fb40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003635c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000367040 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000036aac0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000036e540 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000371fc0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000375a40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003794c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000037cf40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003809c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000384440 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000387ec0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000038b940 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000038f3c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x200000392e40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003968c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000039a340 with size: 0.004395 MiB 00:10:09.219 element at address: 0x20000039ddc0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003a1840 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003a52c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003a8d40 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003ac7c0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003b0240 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003b3cc0 with size: 0.004395 MiB 00:10:09.219 element at address: 0x2000003b7740 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003bb1c0 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003bec40 with size: 0.004395 MiB 
00:10:09.220 element at address: 0x2000003c26c0 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003c6140 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003c9bc0 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003cd640 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003d10c0 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003d4b40 with size: 0.004395 MiB 00:10:09.220 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:10:09.220 element at address: 0x20000032a740 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000032b7c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000032e1c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000032f240 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000331c40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000332cc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003356c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000336740 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000339140 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000033a1c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000033cbc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000033dc40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000340640 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003416c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003440c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000345140 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000347b40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000348bc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000034b5c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000034c640 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000034f040 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003500c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000352ac0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000353b40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000356540 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003575c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000359fc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000035b040 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000035da40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000035eac0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003614c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000362540 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000364f40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000365fc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003689c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000369a40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000036c440 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000036d4c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000036fec0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000370f40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000373940 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003749c0 with size: 0.004028 MiB 00:10:09.220 element at 
address: 0x2000003773c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000378440 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000037ae40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000037bec0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000037e8c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000037f940 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000382340 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003833c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000385dc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000386e40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000389840 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000038a8c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000038d2c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000038e340 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000390d40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000391dc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003947c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000395840 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000398240 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003992c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000039bcc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000039cd40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x20000039f740 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003a07c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003a31c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003a6c40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003a7cc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003aa6c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003ab740 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003ae140 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003af1c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003b1bc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003b2c40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003b5640 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003b66c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003b90c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003ba140 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003bcb40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003bdbc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003c05c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003c1640 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003c4040 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003c50c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003c7ac0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003c8b40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003cb540 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003cc5c0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003cefc0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003d0040 
with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003d2a40 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003d3ac0 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:10:09.220 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:10:09.220 element at address: 0x200000200000 with size: 0.000305 MiB 00:10:09.220 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:10:09.220 element at address: 0x200000200140 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200200 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200380 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200440 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200500 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200680 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200740 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200800 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200980 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200a40 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200b00 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200c80 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200d40 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200e00 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200ec0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000200f80 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201040 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201100 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002011c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201280 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201340 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201400 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002014c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201580 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201640 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201700 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002017c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201880 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201940 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201a00 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000201c00 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000205ec0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000226180 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000226240 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000226300 with size: 0.000183 MiB 00:10:09.220 element at address: 0x2000002263c0 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000226480 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000226540 with size: 0.000183 MiB 00:10:09.220 element at address: 0x200000226600 with size: 0.000183 MiB 
00:10:09.220 element at address: 0x2000002266c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226780 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226840 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226900 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000002269c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226a80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226b40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226c00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226cc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226d80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226e40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000226f00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227100 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000002271c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227280 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227340 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227400 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000002274c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227580 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227640 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227700 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000002277c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227880 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227940 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227a00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227ac0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227b80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227c40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000227d00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000329f00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000329fc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032a180 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032a340 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032a400 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032da40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032dc00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032ddc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000032de80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003314c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000331680 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000331840 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000331900 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000334f40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000335100 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000335380 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003389c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000338b80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000338d40 with size: 0.000183 MiB 00:10:09.221 element at 
address: 0x200000338e00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000033c440 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000033c600 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000033c7c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000033c880 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000033fec0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000340080 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000340240 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000340300 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000343940 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000343b00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000343cc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000343d80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003473c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000347580 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000347740 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000347800 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034ae40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034b000 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034b1c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034b280 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034e8c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034ea80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034ec40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000034ed00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000352340 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000352500 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003526c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000352780 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000355dc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000355f80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000356140 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000356200 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000359840 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000359a00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000359bc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000359c80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000035d2c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000035d480 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000035d640 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000035d700 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000360d40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000360f00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003610c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000361180 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003647c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000364980 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000364b40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000364c00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000368240 
with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000368400 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003685c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000368680 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036bcc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036be80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036c040 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036c100 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036f740 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036f900 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036fac0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000036fb80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003731c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000373380 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000373540 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000373600 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000376c40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000376e00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000376fc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000377080 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037a6c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037a880 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037aa40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037ab00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037e140 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037e300 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037e4c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000037e580 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000381bc0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000381d80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000381f40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000382000 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000385640 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000385800 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003859c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000385a80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003890c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000389280 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000389440 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000389500 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000038cb40 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000038cd00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000038cec0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x20000038cf80 with size: 0.000183 MiB 00:10:09.221 element at address: 0x2000003905c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000390780 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000390940 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000390a00 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000394040 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000394200 with size: 0.000183 MiB 
00:10:09.221 element at address: 0x2000003943c0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000394480 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000397ac0 with size: 0.000183 MiB 00:10:09.221 element at address: 0x200000397c80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200000397e40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200000397f00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039b540 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039b700 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039b8c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039b980 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039efc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039f180 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039f340 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000039f400 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a2a40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a2c00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a2dc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a2e80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a64c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a6680 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a6840 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a6900 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003a9f40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003aa100 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003aa2c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003aa380 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003ad9c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003adb80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003add40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003ade00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b1440 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b1600 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b17c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b1880 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b5240 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b5300 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b8940 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b8b00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b8cc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003b8d80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003bc3c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003bc580 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003bc740 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003bc800 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c0000 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c01c0 with size: 0.000183 MiB 00:10:09.222 element at 
address: 0x2000003c0280 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c38c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c3a80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c3c40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c3d00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c7340 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c7500 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c76c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003c7780 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003cadc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003caf80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003cb140 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003cb200 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003ce840 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003cec80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d22c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d2480 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d2640 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d2700 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d5e80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d6100 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d6800 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000003d68c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087c980 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f340 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f400 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f4c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f580 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f640 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f700 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f7c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f880 with size: 0.000183 MiB 00:10:09.222 element at address: 0x200003a7f940 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d280 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d580 
with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa90f40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91300 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91600 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92680 with size: 0.000183 MiB 
00:10:09.222 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92800 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92b00 with size: 0.000183 MiB 00:10:09.222 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:10:09.223 element at 
address: 0x20001aa94c00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:10:09.223 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e65500 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6dd40 
with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:10:09.223 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:10:09.223 list of memzone associated elements. 
size: 602.317078 MiB 00:10:09.223 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:10:09.223 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:10:09.223 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:10:09.223 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:10:09.223 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:10:09.223 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1052269_0 00:10:09.223 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:10:09.223 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1052269_0 00:10:09.223 element at address: 0x200003fff380 with size: 48.003052 MiB 00:10:09.223 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1052269_0 00:10:09.223 element at address: 0x2000195be940 with size: 20.255554 MiB 00:10:09.223 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:10:09.223 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:10:09.224 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:10:09.224 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:10:09.224 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1052269 00:10:09.224 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:10:09.224 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1052269 00:10:09.224 element at address: 0x200000227dc0 with size: 1.008118 MiB 00:10:09.224 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1052269 00:10:09.224 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:10:09.224 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:10:09.224 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:10:09.224 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:10:09.224 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:10:09.224 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:10:09.224 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:10:09.224 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:10:09.224 element at address: 0x200003eff180 with size: 1.000488 MiB 00:10:09.224 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1052269 00:10:09.224 element at address: 0x200003affc00 with size: 1.000488 MiB 00:10:09.224 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1052269 00:10:09.224 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:10:09.224 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1052269 00:10:09.224 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:10:09.224 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1052269 00:10:09.224 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:10:09.224 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1052269 00:10:09.224 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:10:09.224 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:10:09.224 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:10:09.224 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:10:09.224 element at address: 0x20001947c540 with size: 0.250488 MiB 00:10:09.224 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:10:09.224 element at address: 0x200000205f80 with size: 0.125488 MiB 00:10:09.224 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_1052269 00:10:09.224 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:10:09.224 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:10:09.224 element at address: 0x200027e65680 with size: 0.023743 MiB 00:10:09.224 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:10:09.224 element at address: 0x200000201cc0 with size: 0.016113 MiB 00:10:09.224 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1052269 00:10:09.224 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:10:09.224 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:10:09.224 element at address: 0x2000003d62c0 with size: 0.001282 MiB 00:10:09.224 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:10:09.224 element at address: 0x2000003d6a80 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.0_qat 00:10:09.224 element at address: 0x2000003d28c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.1_qat 00:10:09.224 element at address: 0x2000003cee40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.2_qat 00:10:09.224 element at address: 0x2000003cb3c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.3_qat 00:10:09.224 element at address: 0x2000003c7940 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.4_qat 00:10:09.224 element at address: 0x2000003c3ec0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.5_qat 00:10:09.224 element at address: 0x2000003c0440 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.6_qat 00:10:09.224 element at address: 0x2000003bc9c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:01.7_qat 00:10:09.224 element at address: 0x2000003b8f40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.0_qat 00:10:09.224 element at address: 0x2000003b54c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.1_qat 00:10:09.224 element at address: 0x2000003b1a40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.2_qat 00:10:09.224 element at address: 0x2000003adfc0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.3_qat 00:10:09.224 element at address: 0x2000003aa540 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.4_qat 00:10:09.224 element at address: 0x2000003a6ac0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.5_qat 00:10:09.224 element at address: 0x2000003a3040 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.6_qat 00:10:09.224 element at address: 0x20000039f5c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1a:02.7_qat 00:10:09.224 element at address: 0x20000039bb40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.0_qat 00:10:09.224 element at address: 0x2000003980c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 
0000:1c:01.1_qat 00:10:09.224 element at address: 0x200000394640 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.2_qat 00:10:09.224 element at address: 0x200000390bc0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.3_qat 00:10:09.224 element at address: 0x20000038d140 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.4_qat 00:10:09.224 element at address: 0x2000003896c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.5_qat 00:10:09.224 element at address: 0x200000385c40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.6_qat 00:10:09.224 element at address: 0x2000003821c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:01.7_qat 00:10:09.224 element at address: 0x20000037e740 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.0_qat 00:10:09.224 element at address: 0x20000037acc0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.1_qat 00:10:09.224 element at address: 0x200000377240 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.2_qat 00:10:09.224 element at address: 0x2000003737c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.3_qat 00:10:09.224 element at address: 0x20000036fd40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.4_qat 00:10:09.224 element at address: 0x20000036c2c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.5_qat 00:10:09.224 element at address: 0x200000368840 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.6_qat 00:10:09.224 element at address: 0x200000364dc0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1c:02.7_qat 00:10:09.224 element at address: 0x200000361340 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.0_qat 00:10:09.224 element at address: 0x20000035d8c0 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.1_qat 00:10:09.224 element at address: 0x200000359e40 with size: 0.000366 MiB 00:10:09.224 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.2_qat 00:10:09.225 element at address: 0x2000003563c0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.3_qat 00:10:09.225 element at address: 0x200000352940 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.4_qat 00:10:09.225 element at address: 0x20000034eec0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.5_qat 00:10:09.225 element at address: 0x20000034b440 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.6_qat 00:10:09.225 element at address: 0x2000003479c0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:01.7_qat 00:10:09.225 element at address: 0x200000343f40 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.0_qat 00:10:09.225 element at address: 
0x2000003404c0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.1_qat 00:10:09.225 element at address: 0x20000033ca40 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.2_qat 00:10:09.225 element at address: 0x200000338fc0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.3_qat 00:10:09.225 element at address: 0x200000335540 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.4_qat 00:10:09.225 element at address: 0x200000331ac0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.5_qat 00:10:09.225 element at address: 0x20000032e040 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.6_qat 00:10:09.225 element at address: 0x20000032a5c0 with size: 0.000366 MiB 00:10:09.225 associated memzone info: size: 0.000244 MiB name: 0000:1e:02.7_qat 00:10:09.225 element at address: 0x2000003d5d40 with size: 0.000305 MiB 00:10:09.225 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:10:09.225 element at address: 0x200000226fc0 with size: 0.000305 MiB 00:10:09.225 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1052269 00:10:09.225 element at address: 0x200000201ac0 with size: 0.000305 MiB 00:10:09.225 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1052269 00:10:09.225 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:10:09.225 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:10:09.225 element at address: 0x2000003d6980 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:10:09.225 element at address: 0x2000003d61c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:10:09.225 element at address: 0x2000003d5f40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:10:09.225 element at address: 0x2000003d27c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:10:09.225 element at address: 0x2000003d2540 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:10:09.225 element at address: 0x2000003d2380 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:10:09.225 element at address: 0x2000003ced40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:10:09.225 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:10:09.225 element at address: 0x2000003ce900 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:10:09.225 element at address: 0x2000003cb2c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:10:09.225 element at address: 0x2000003cb040 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:10:09.225 element at address: 0x2000003cae80 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:10:09.225 element at 
address: 0x2000003c7840 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:10:09.225 element at address: 0x2000003c75c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:10:09.225 element at address: 0x2000003c7400 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:10:09.225 element at address: 0x2000003c3dc0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:10:09.225 element at address: 0x2000003c3b40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:10:09.225 element at address: 0x2000003c3980 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:10:09.225 element at address: 0x2000003c0340 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:10:09.225 element at address: 0x2000003c00c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:10:09.225 element at address: 0x2000003bff00 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:10:09.225 element at address: 0x2000003bc8c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:10:09.225 element at address: 0x2000003bc640 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:10:09.225 element at address: 0x2000003bc480 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:10:09.225 element at address: 0x2000003b8e40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:10:09.225 element at address: 0x2000003b8bc0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:10:09.225 element at address: 0x2000003b8a00 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:10:09.225 element at address: 0x2000003b53c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:10:09.225 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:10:09.225 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:10:09.225 element at address: 0x2000003b1940 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:10:09.225 element at address: 0x2000003b16c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:10:09.225 element at address: 0x2000003b1500 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:10:09.225 element at address: 0x2000003adec0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:10:09.225 element at address: 0x2000003adc40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_11 00:10:09.225 element at address: 0x2000003ada80 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:10:09.225 element at address: 0x2000003aa440 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:10:09.225 element at address: 0x2000003aa1c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:10:09.225 element at address: 0x2000003aa000 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:10:09.225 element at address: 0x2000003a69c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:10:09.225 element at address: 0x2000003a6740 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:10:09.225 element at address: 0x2000003a6580 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:10:09.225 element at address: 0x2000003a2f40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:10:09.225 element at address: 0x2000003a2cc0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:10:09.225 element at address: 0x2000003a2b00 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:10:09.225 element at address: 0x20000039f4c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:10:09.225 element at address: 0x20000039f240 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:10:09.225 element at address: 0x20000039f080 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:10:09.225 element at address: 0x20000039ba40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:10:09.225 element at address: 0x20000039b7c0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:10:09.225 element at address: 0x20000039b600 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:10:09.225 element at address: 0x200000397fc0 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:10:09.225 element at address: 0x200000397d40 with size: 0.000244 MiB 00:10:09.225 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:10:09.226 element at address: 0x200000397b80 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:10:09.226 element at address: 0x200000394540 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:10:09.226 element at address: 0x2000003942c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:10:09.226 element at address: 0x200000394100 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:10:09.226 element at address: 
0x200000390ac0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:10:09.226 element at address: 0x200000390840 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:10:09.226 element at address: 0x200000390680 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:10:09.226 element at address: 0x20000038d040 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:10:09.226 element at address: 0x20000038cdc0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:10:09.226 element at address: 0x20000038cc00 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:10:09.226 element at address: 0x2000003895c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:10:09.226 element at address: 0x200000389340 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:10:09.226 element at address: 0x200000389180 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:10:09.226 element at address: 0x200000385b40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:10:09.226 element at address: 0x2000003858c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:10:09.226 element at address: 0x200000385700 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:10:09.226 element at address: 0x2000003820c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:10:09.226 element at address: 0x200000381e40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:10:09.226 element at address: 0x200000381c80 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:10:09.226 element at address: 0x20000037e640 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:10:09.226 element at address: 0x20000037e3c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:10:09.226 element at address: 0x20000037e200 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:10:09.226 element at address: 0x20000037abc0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:10:09.226 element at address: 0x20000037a940 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:10:09.226 element at address: 0x20000037a780 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:10:09.226 element at address: 0x200000377140 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:10:09.226 element at address: 0x200000376ec0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_26 00:10:09.226 element at address: 0x200000376d00 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:10:09.226 element at address: 0x2000003736c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:10:09.226 element at address: 0x200000373440 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:10:09.226 element at address: 0x200000373280 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:10:09.226 element at address: 0x20000036fc40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:10:09.226 element at address: 0x20000036f9c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:10:09.226 element at address: 0x20000036f800 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:10:09.226 element at address: 0x20000036c1c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:10:09.226 element at address: 0x20000036bf40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:10:09.226 element at address: 0x20000036bd80 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:10:09.226 element at address: 0x200000368740 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:10:09.226 element at address: 0x2000003684c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:10:09.226 element at address: 0x200000368300 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:10:09.226 element at address: 0x200000364cc0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:10:09.226 element at address: 0x200000364a40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:10:09.226 element at address: 0x200000364880 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:10:09.226 element at address: 0x200000361240 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64 00:10:09.226 element at address: 0x200000360fc0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32 00:10:09.226 element at address: 0x200000360e00 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65 00:10:09.226 element at address: 0x20000035d7c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66 00:10:09.226 element at address: 0x20000035d540 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33 00:10:09.226 element at address: 0x20000035d380 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67 00:10:09.226 element at address: 
0x200000359d40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68 00:10:09.226 element at address: 0x200000359ac0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34 00:10:09.226 element at address: 0x200000359900 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69 00:10:09.226 element at address: 0x2000003562c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70 00:10:09.226 element at address: 0x200000356040 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35 00:10:09.226 element at address: 0x200000355e80 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71 00:10:09.226 element at address: 0x200000352840 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72 00:10:09.226 element at address: 0x2000003525c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36 00:10:09.226 element at address: 0x200000352400 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73 00:10:09.226 element at address: 0x20000034edc0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74 00:10:09.226 element at address: 0x20000034eb40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37 00:10:09.226 element at address: 0x20000034e980 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75 00:10:09.226 element at address: 0x20000034b340 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76 00:10:09.226 element at address: 0x20000034b0c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38 00:10:09.226 element at address: 0x20000034af00 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77 00:10:09.226 element at address: 0x2000003478c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78 00:10:09.226 element at address: 0x200000347640 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39 00:10:09.226 element at address: 0x200000347480 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79 00:10:09.226 element at address: 0x200000343e40 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80 00:10:09.226 element at address: 0x200000343bc0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40 00:10:09.226 element at address: 0x200000343a00 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81 00:10:09.226 element at address: 0x2000003403c0 with size: 0.000244 MiB 00:10:09.226 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82 00:10:09.227 element at address: 0x200000340140 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_41 00:10:09.227 element at address: 0x20000033ff80 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_83 00:10:09.227 element at address: 0x20000033c940 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84 00:10:09.227 element at address: 0x20000033c6c0 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42 00:10:09.227 element at address: 0x20000033c500 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85 00:10:09.227 element at address: 0x200000338ec0 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86 00:10:09.227 element at address: 0x200000338c40 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43 00:10:09.227 element at address: 0x200000338a80 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87 00:10:09.227 element at address: 0x200000335440 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88 00:10:09.227 element at address: 0x2000003351c0 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44 00:10:09.227 element at address: 0x200000335000 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89 00:10:09.227 element at address: 0x2000003319c0 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90 00:10:09.227 element at address: 0x200000331740 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45 00:10:09.227 element at address: 0x200000331580 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91 00:10:09.227 element at address: 0x20000032df40 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92 00:10:09.227 element at address: 0x20000032dcc0 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46 00:10:09.227 element at address: 0x20000032db00 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93 00:10:09.227 element at address: 0x20000032a4c0 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94 00:10:09.227 element at address: 0x20000032a240 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47 00:10:09.227 element at address: 0x20000032a080 with size: 0.000244 MiB 00:10:09.227 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95 00:10:09.227 element at address: 0x2000003d6040 with size: 0.000183 MiB 00:10:09.227 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:10:09.227 06:26:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:10:09.227 06:26:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1052269 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 1052269 ']' 00:10:09.227 06:26:22 
dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 1052269 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1052269 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1052269' 00:10:09.227 killing process with pid 1052269 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 1052269 00:10:09.227 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 1052269 00:10:09.486 00:10:09.486 real 0m1.633s 00:10:09.486 user 0m1.743s 00:10:09.486 sys 0m0.549s 00:10:09.486 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:09.486 06:26:22 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:10:09.486 ************************************ 00:10:09.486 END TEST dpdk_mem_utility 00:10:09.486 ************************************ 00:10:09.486 06:26:23 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:10:09.486 06:26:23 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:09.486 06:26:23 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:09.486 06:26:23 -- common/autotest_common.sh@10 -- # set +x 00:10:09.745 ************************************ 00:10:09.745 START TEST event 00:10:09.745 ************************************ 00:10:09.745 06:26:23 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:10:09.745 * Looking for test storage... 00:10:09.745 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:10:09.745 06:26:23 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:09.745 06:26:23 event -- bdev/nbd_common.sh@6 -- # set -e 00:10:09.745 06:26:23 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:10:09.745 06:26:23 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:10:09.745 06:26:23 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:09.745 06:26:23 event -- common/autotest_common.sh@10 -- # set +x 00:10:09.745 ************************************ 00:10:09.745 START TEST event_perf 00:10:09.745 ************************************ 00:10:09.745 06:26:23 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:10:09.745 Running I/O for 1 seconds...[2024-07-25 06:26:23.208609] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:09.745 [2024-07-25 06:26:23.208665] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1052596 ] 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:09.745 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:09.745 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:10.005 [2024-07-25 06:26:23.341435] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:10.005 [2024-07-25 06:26:23.389230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:10.005 [2024-07-25 06:26:23.389325] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:10.005 [2024-07-25 06:26:23.389412] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:10:10.005 [2024-07-25 06:26:23.389415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:10.941 Running I/O for 1 seconds... 00:10:10.941 lcore 0: 187755 00:10:10.941 lcore 1: 187752 00:10:10.941 lcore 2: 187753 00:10:10.941 lcore 3: 187755 00:10:10.941 done. 00:10:10.941 00:10:10.941 real 0m1.274s 00:10:10.941 user 0m4.120s 00:10:10.941 sys 0m0.148s 00:10:10.941 06:26:24 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:10.941 06:26:24 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:10:10.941 ************************************ 00:10:10.941 END TEST event_perf 00:10:10.941 ************************************ 00:10:11.200 06:26:24 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:10:11.200 06:26:24 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:10:11.200 06:26:24 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:11.200 06:26:24 event -- common/autotest_common.sh@10 -- # set +x 00:10:11.200 ************************************ 00:10:11.200 START TEST event_reactor 00:10:11.200 ************************************ 00:10:11.200 06:26:24 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:10:11.200 [2024-07-25 06:26:24.565276] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:11.200 [2024-07-25 06:26:24.565338] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1052877 ] 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:11.200 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:11.200 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:11.200 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:11.200 [2024-07-25 06:26:24.702985] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:11.200 [2024-07-25 06:26:24.744309] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.575 test_start 00:10:12.575 oneshot 00:10:12.575 tick 100 00:10:12.575 tick 100 00:10:12.575 tick 250 00:10:12.575 tick 100 00:10:12.575 tick 100 00:10:12.575 tick 250 00:10:12.575 tick 100 00:10:12.575 tick 500 00:10:12.575 tick 100 00:10:12.575 tick 100 00:10:12.575 tick 250 00:10:12.575 tick 100 00:10:12.575 tick 100 00:10:12.575 test_end 00:10:12.575 00:10:12.575 real 0m1.270s 00:10:12.575 user 0m1.121s 00:10:12.575 sys 0m0.142s 00:10:12.575 06:26:25 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:12.575 06:26:25 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:10:12.575 ************************************ 00:10:12.575 END TEST event_reactor 00:10:12.575 ************************************ 00:10:12.575 06:26:25 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:10:12.575 06:26:25 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:10:12.575 06:26:25 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:12.575 06:26:25 event -- common/autotest_common.sh@10 -- # set +x 00:10:12.575 ************************************ 00:10:12.575 START TEST event_reactor_perf 00:10:12.575 ************************************ 00:10:12.575 06:26:25 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:10:12.575 [2024-07-25 06:26:25.926841] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:12.575 [2024-07-25 06:26:25.926983] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1053160 ] 00:10:12.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.575 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:12.575 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.575 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:12.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:12.576 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:12.834 [2024-07-25 06:26:26.138423] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.834 [2024-07-25 06:26:26.187396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:13.770 test_start 00:10:13.770 test_end 00:10:13.770 Performance: 355566 events per second 00:10:13.770 00:10:13.770 real 0m1.361s 00:10:13.770 user 0m1.144s 00:10:13.770 sys 0m0.210s 00:10:13.770 06:26:27 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:13.770 06:26:27 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:10:13.770 ************************************ 00:10:13.770 END TEST event_reactor_perf 00:10:13.770 ************************************ 00:10:13.770 06:26:27 event -- event/event.sh@49 -- # uname -s 00:10:13.770 06:26:27 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:10:13.770 06:26:27 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:10:13.770 06:26:27 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:13.770 06:26:27 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:13.770 06:26:27 event -- common/autotest_common.sh@10 -- # set +x 00:10:14.029 ************************************ 00:10:14.029 START TEST event_scheduler 00:10:14.029 ************************************ 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:10:14.029 * Looking for test storage... 
00:10:14.029 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:10:14.029 06:26:27 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:10:14.029 06:26:27 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1053470 00:10:14.029 06:26:27 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:10:14.029 06:26:27 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:10:14.029 06:26:27 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1053470 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 1053470 ']' 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:14.029 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:14.029 06:26:27 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:10:14.029 [2024-07-25 06:26:27.507377] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:14.029 [2024-07-25 06:26:27.507440] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1053470 ] 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number 
of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.029 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:14.029 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:14.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:14.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:14.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:14.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:14.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:14.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:14.030 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:14.288 [2024-07-25 06:26:27.619226] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:14.288 [2024-07-25 06:26:27.661736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.288 [2024-07-25 06:26:27.661756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:14.288 [2024-07-25 06:26:27.661819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:10:14.288 [2024-07-25 06:26:27.661821] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:15.223 06:26:28 
event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:10:15.223 06:26:28 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:10:15.223 [2024-07-25 06:26:28.420662] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:10:15.223 [2024-07-25 06:26:28.420684] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:10:15.223 [2024-07-25 06:26:28.420695] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:10:15.223 [2024-07-25 06:26:28.420703] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:10:15.223 [2024-07-25 06:26:28.420710] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.223 06:26:28 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:10:15.223 [2024-07-25 06:26:28.500324] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.223 06:26:28 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:15.223 06:26:28 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:10:15.223 ************************************ 00:10:15.223 START TEST scheduler_create_thread 00:10:15.223 ************************************ 00:10:15.223 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:10:15.223 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:10:15.223 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.223 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 2 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 3 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin 
scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 4 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 5 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 6 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 7 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 8 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 9 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 10 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.224 06:26:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:15.791 06:26:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:15.791 06:26:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:10:15.791 06:26:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:15.791 06:26:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:17.166 06:26:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:17.166 06:26:30 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:10:17.166 06:26:30 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:10:17.166 06:26:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:17.166 06:26:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:18.100 06:26:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:18.100 00:10:18.100 real 0m3.096s 00:10:18.100 user 0m0.025s 00:10:18.100 sys 0m0.006s 00:10:18.100 06:26:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:18.100 06:26:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:10:18.100 ************************************ 00:10:18.100 END TEST scheduler_create_thread 00:10:18.100 ************************************ 00:10:18.358 06:26:31 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:10:18.358 06:26:31 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1053470 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 1053470 ']' 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 1053470 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:10:18.358 06:26:31 
event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1053470 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1053470' 00:10:18.358 killing process with pid 1053470 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 1053470 00:10:18.358 06:26:31 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 1053470 00:10:18.645 [2024-07-25 06:26:32.019810] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:10:18.941 00:10:18.941 real 0m4.893s 00:10:18.941 user 0m9.655s 00:10:18.941 sys 0m0.491s 00:10:18.941 06:26:32 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:18.941 06:26:32 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:10:18.941 ************************************ 00:10:18.941 END TEST event_scheduler 00:10:18.941 ************************************ 00:10:18.941 06:26:32 event -- event/event.sh@51 -- # modprobe -n nbd 00:10:18.941 06:26:32 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:10:18.941 06:26:32 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:18.941 06:26:32 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:18.941 06:26:32 event -- common/autotest_common.sh@10 -- # set +x 00:10:18.941 ************************************ 00:10:18.941 START TEST app_repeat 00:10:18.941 ************************************ 00:10:18.941 06:26:32 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1054322 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1054322' 00:10:18.941 Process app_repeat pid: 1054322 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:10:18.941 spdk_app_start Round 0 00:10:18.941 06:26:32 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1054322 /var/tmp/spdk-nbd.sock 00:10:18.941 06:26:32 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1054322 ']' 00:10:18.941 06:26:32 event.app_repeat 
-- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:18.941 06:26:32 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:18.941 06:26:32 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:18.941 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:10:18.941 06:26:32 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:18.941 06:26:32 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:10:18.941 [2024-07-25 06:26:32.340396] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:18.941 [2024-07-25 06:26:32.340440] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1054322 ] 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.941 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:18.941 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested 
device 0000:3f:01.1 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:18.942 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:18.942 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:18.942 [2024-07-25 06:26:32.458880] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:19.199 [2024-07-25 06:26:32.505302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:19.199 [2024-07-25 06:26:32.505307] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:19.457 06:26:32 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:19.457 06:26:32 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:10:19.457 06:26:32 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:10:19.715 Malloc0 00:10:19.715 06:26:33 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:10:19.973 Malloc1 00:10:19.973 06:26:33 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:10:19.973 
06:26:33 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:19.973 06:26:33 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:10:20.230 /dev/nbd0 00:10:20.230 06:26:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:20.230 06:26:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:10:20.230 1+0 records in 00:10:20.230 1+0 records out 00:10:20.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022975 s, 17.8 MB/s 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:20.230 06:26:33 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:10:20.230 06:26:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.230 06:26:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:20.230 06:26:33 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:10:20.487 /dev/nbd1 00:10:20.487 06:26:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:10:20.487 06:26:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:10:20.487 06:26:33 
event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:20.487 06:26:33 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:10:20.487 1+0 records in 00:10:20.487 1+0 records out 00:10:20.487 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189829 s, 21.6 MB/s 00:10:20.488 06:26:33 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:20.488 06:26:33 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:10:20.488 06:26:33 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:20.488 06:26:33 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:20.488 06:26:33 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:10:20.488 06:26:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.488 06:26:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:20.488 06:26:33 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:20.488 06:26:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:20.488 06:26:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:20.746 { 00:10:20.746 "nbd_device": "/dev/nbd0", 00:10:20.746 "bdev_name": "Malloc0" 00:10:20.746 }, 00:10:20.746 { 00:10:20.746 "nbd_device": "/dev/nbd1", 00:10:20.746 "bdev_name": "Malloc1" 00:10:20.746 } 00:10:20.746 ]' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:20.746 { 00:10:20.746 "nbd_device": "/dev/nbd0", 00:10:20.746 "bdev_name": "Malloc0" 00:10:20.746 }, 00:10:20.746 { 00:10:20.746 "nbd_device": "/dev/nbd1", 00:10:20.746 "bdev_name": "Malloc1" 00:10:20.746 } 00:10:20.746 ]' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:20.746 /dev/nbd1' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:20.746 /dev/nbd1' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:10:20.746 06:26:34 
event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:10:20.746 256+0 records in 00:10:20.746 256+0 records out 00:10:20.746 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0101533 s, 103 MB/s 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:10:20.746 256+0 records in 00:10:20.746 256+0 records out 00:10:20.746 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0165942 s, 63.2 MB/s 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:20.746 256+0 records in 00:10:20.746 256+0 records out 00:10:20.746 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0182359 s, 57.5 MB/s 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@51 -- 
# local i 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:20.746 06:26:34 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:21.004 06:26:34 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:21.261 06:26:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:21.262 06:26:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:21.519 06:26:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:21.519 06:26:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:21.519 06:26:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:21.519 06:26:35 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:10:21.519 06:26:35 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:10:21.778 06:26:35 event.app_repeat -- event/event.sh@35 -- # 
sleep 3 00:10:22.037 [2024-07-25 06:26:35.476649] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:22.037 [2024-07-25 06:26:35.517016] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:22.037 [2024-07-25 06:26:35.517023] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.037 [2024-07-25 06:26:35.561107] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:10:22.037 [2024-07-25 06:26:35.561157] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:10:25.320 06:26:38 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:10:25.320 06:26:38 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:10:25.320 spdk_app_start Round 1 00:10:25.320 06:26:38 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1054322 /var/tmp/spdk-nbd.sock 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1054322 ']' 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:25.320 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:25.320 06:26:38 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:10:25.320 06:26:38 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:10:25.320 Malloc0 00:10:25.320 06:26:38 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:10:25.578 Malloc1 00:10:25.578 06:26:38 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:25.578 06:26:38 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:10:25.836 /dev/nbd0 00:10:25.836 06:26:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:25.836 06:26:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:10:25.836 1+0 records in 00:10:25.836 1+0 records out 00:10:25.836 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000200701 s, 20.4 MB/s 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:25.836 06:26:39 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:10:25.836 06:26:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:25.836 06:26:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:25.836 06:26:39 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:10:26.094 /dev/nbd1 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:26.094 
06:26:39 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:10:26.094 1+0 records in 00:10:26.094 1+0 records out 00:10:26.094 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242621 s, 16.9 MB/s 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:26.094 06:26:39 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:26.094 06:26:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:26.352 { 00:10:26.352 "nbd_device": "/dev/nbd0", 00:10:26.352 "bdev_name": "Malloc0" 00:10:26.352 }, 00:10:26.352 { 00:10:26.352 "nbd_device": "/dev/nbd1", 00:10:26.352 "bdev_name": "Malloc1" 00:10:26.352 } 00:10:26.352 ]' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:26.352 { 00:10:26.352 "nbd_device": "/dev/nbd0", 00:10:26.352 "bdev_name": "Malloc0" 00:10:26.352 }, 00:10:26.352 { 00:10:26.352 "nbd_device": "/dev/nbd1", 00:10:26.352 "bdev_name": "Malloc1" 00:10:26.352 } 00:10:26.352 ]' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:26.352 /dev/nbd1' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:26.352 /dev/nbd1' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:10:26.352 256+0 records in 00:10:26.352 256+0 records out 00:10:26.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00894761 s, 117 MB/s 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:10:26.352 256+0 records in 00:10:26.352 256+0 records out 00:10:26.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0165343 s, 63.4 MB/s 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:26.352 256+0 records in 00:10:26.352 256+0 records out 00:10:26.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0181706 s, 57.7 MB/s 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:26.352 06:26:39 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:26.610 06:26:40 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:26.610 06:26:40 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:26.867 06:26:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:27.124 06:26:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:27.125 06:26:40 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:10:27.125 06:26:40 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:10:27.383 06:26:40 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:10:27.641 [2024-07-25 06:26:41.047778] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:27.641 [2024-07-25 06:26:41.088502] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:27.641 [2024-07-25 06:26:41.088507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:27.641 [2024-07-25 06:26:41.134226] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 
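The nbd_common.sh trace above reduces to a short sequence of RPC and coreutils calls. What follows is a minimal stand-alone sketch of that write/verify cycle, not the helper itself: the rpc.py subcommands, socket path, dd/cmp flags and the /proc/partitions probe are taken directly from the trace, while the scratch-file path and the retry sleep are assumptions.

  #!/usr/bin/env bash
  # Sketch of the NBD write/verify cycle exercised by nbd_common.sh above.
  # Assumes an SPDK app is already listening on /var/tmp/spdk-nbd.sock and
  # that the Malloc0/Malloc1 bdevs exist. TMP is a hypothetical scratch file.
  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  SOCK=/var/tmp/spdk-nbd.sock
  TMP=/tmp/nbdrandtest   # the test itself uses spdk/test/event/nbdrandtest

  # Export each bdev as an NBD device and wait until it appears in /proc/partitions.
  for pair in "Malloc0 /dev/nbd0" "Malloc1 /dev/nbd1"; do
      set -- $pair
      "$RPC" -s "$SOCK" nbd_start_disk "$1" "$2"
      for try in $(seq 1 20); do
          grep -q -w "$(basename "$2")" /proc/partitions && break
          sleep 0.1   # retry interval is an assumption; the trace breaks on the first probe
      done
  done

  # Fill a 1 MiB scratch file with random data, write it to both devices, then read back and compare.
  dd if=/dev/urandom of="$TMP" bs=4096 count=256
  for nbd in /dev/nbd0 /dev/nbd1; do
      dd if="$TMP" of="$nbd" bs=4096 count=256 oflag=direct
      cmp -b -n 1M "$TMP" "$nbd"
  done

  # Detach the devices and confirm nothing is left exported.
  for nbd in /dev/nbd0 /dev/nbd1; do
      "$RPC" -s "$SOCK" nbd_stop_disk "$nbd"
  done
  "$RPC" -s "$SOCK" nbd_get_disks | jq -r '.[] | .nbd_device'   # expect empty output
  rm -f "$TMP"

The final nbd_get_disks call mirrors the count check in the trace, where the JSON is piped through jq and grep -c /dev/nbd to assert that two devices exist before the verify and zero remain after the stop.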
00:10:27.641 [2024-07-25 06:26:41.134274] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:10:30.921 06:26:43 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:10:30.921 06:26:43 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:10:30.921 spdk_app_start Round 2 00:10:30.921 06:26:43 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1054322 /var/tmp/spdk-nbd.sock 00:10:30.921 06:26:43 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1054322 ']' 00:10:30.921 06:26:43 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:30.921 06:26:43 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:30.921 06:26:43 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:30.921 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:10:30.921 06:26:43 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:30.921 06:26:43 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:10:30.921 06:26:44 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:30.921 06:26:44 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:10:30.921 06:26:44 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:10:30.921 Malloc0 00:10:30.921 06:26:44 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:10:31.487 Malloc1 00:10:31.487 06:26:44 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:31.487 06:26:44 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:10:31.744 /dev/nbd0 00:10:31.744 06:26:45 
event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:31.744 06:26:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:10:31.744 1+0 records in 00:10:31.744 1+0 records out 00:10:31.744 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021959 s, 18.7 MB/s 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:31.744 06:26:45 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:10:31.744 06:26:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:31.744 06:26:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:31.744 06:26:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:10:32.002 /dev/nbd1 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:10:32.002 1+0 records in 00:10:32.002 1+0 records out 00:10:32.002 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245149 s, 16.7 MB/s 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 
00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:10:32.002 06:26:45 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:32.002 06:26:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:32.259 06:26:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:32.259 { 00:10:32.259 "nbd_device": "/dev/nbd0", 00:10:32.259 "bdev_name": "Malloc0" 00:10:32.259 }, 00:10:32.259 { 00:10:32.259 "nbd_device": "/dev/nbd1", 00:10:32.259 "bdev_name": "Malloc1" 00:10:32.259 } 00:10:32.259 ]' 00:10:32.259 06:26:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:32.259 { 00:10:32.259 "nbd_device": "/dev/nbd0", 00:10:32.259 "bdev_name": "Malloc0" 00:10:32.259 }, 00:10:32.259 { 00:10:32.259 "nbd_device": "/dev/nbd1", 00:10:32.260 "bdev_name": "Malloc1" 00:10:32.260 } 00:10:32.260 ]' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:32.260 /dev/nbd1' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:32.260 /dev/nbd1' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:10:32.260 256+0 records in 00:10:32.260 256+0 records out 00:10:32.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011134 s, 94.2 MB/s 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 
oflag=direct 00:10:32.260 256+0 records in 00:10:32.260 256+0 records out 00:10:32.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0165431 s, 63.4 MB/s 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:32.260 256+0 records in 00:10:32.260 256+0 records out 00:10:32.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0183274 s, 57.2 MB/s 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:32.260 06:26:45 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:10:32.518 06:26:45 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:32.518 06:26:45 
event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:32.776 06:26:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:32.776 06:26:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:32.777 06:26:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:33.035 06:26:46 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:10:33.035 06:26:46 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:10:33.294 06:26:46 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:10:33.552 [2024-07-25 06:26:46.963471] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:33.552 [2024-07-25 06:26:47.003786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:33.552 [2024-07-25 06:26:47.003791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:33.552 [2024-07-25 06:26:47.048544] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:10:33.552 [2024-07-25 06:26:47.048590] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
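Each app_repeat round in this log follows the same driver pattern: wait for the app's RPC socket, create two malloc bdevs (bdev_malloc_create 64 4096), run the NBD verify sketched earlier, then ask the app to terminate and pause before the next round. Below is a condensed sketch of that loop, not the literal event.sh, using only calls visible in the trace; waitforlisten is the autotest_common.sh helper and APP_PID is a placeholder for the PID of the app under test.

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  SOCK=/var/tmp/spdk-nbd.sock
  for round in {0..2}; do
      echo "spdk_app_start Round $round"
      waitforlisten "$APP_PID" "$SOCK"               # block until the RPC socket answers
      "$RPC" -s "$SOCK" bdev_malloc_create 64 4096   # Malloc0
      "$RPC" -s "$SOCK" bdev_malloc_create 64 4096   # Malloc1
      # ... NBD start / write / verify / stop as sketched above ...
      "$RPC" -s "$SOCK" spdk_kill_instance SIGTERM   # end this round
      sleep 3                                        # give the app time to shut down before the next round
  done

The two bdev_malloc_create calls correspond to the Malloc0/Malloc1 creations traced at the start of each round, and spdk_kill_instance SIGTERM followed by sleep 3 matches the shutdown sequence seen at the end of Rounds 0 through 2.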
00:10:36.868 06:26:49 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1054322 /var/tmp/spdk-nbd.sock 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1054322 ']' 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:36.868 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:10:36.868 06:26:49 event.app_repeat -- event/event.sh@39 -- # killprocess 1054322 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 1054322 ']' 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 1054322 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:36.868 06:26:49 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1054322 00:10:36.868 06:26:50 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:36.868 06:26:50 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:36.868 06:26:50 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1054322' 00:10:36.868 killing process with pid 1054322 00:10:36.868 06:26:50 event.app_repeat -- common/autotest_common.sh@969 -- # kill 1054322 00:10:36.868 06:26:50 event.app_repeat -- common/autotest_common.sh@974 -- # wait 1054322 00:10:36.868 spdk_app_start is called in Round 0. 00:10:36.868 Shutdown signal received, stop current app iteration 00:10:36.868 Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 reinitialization... 00:10:36.868 spdk_app_start is called in Round 1. 00:10:36.868 Shutdown signal received, stop current app iteration 00:10:36.868 Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 reinitialization... 00:10:36.868 spdk_app_start is called in Round 2. 00:10:36.868 Shutdown signal received, stop current app iteration 00:10:36.868 Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 reinitialization... 00:10:36.869 spdk_app_start is called in Round 3. 
00:10:36.869 Shutdown signal received, stop current app iteration 00:10:36.869 06:26:50 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:10:36.869 06:26:50 event.app_repeat -- event/event.sh@42 -- # return 0 00:10:36.869 00:10:36.869 real 0m17.881s 00:10:36.869 user 0m39.197s 00:10:36.869 sys 0m3.659s 00:10:36.869 06:26:50 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:36.869 06:26:50 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:10:36.869 ************************************ 00:10:36.869 END TEST app_repeat 00:10:36.869 ************************************ 00:10:36.869 06:26:50 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:10:36.869 00:10:36.869 real 0m27.203s 00:10:36.869 user 0m55.410s 00:10:36.869 sys 0m5.044s 00:10:36.869 06:26:50 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:36.869 06:26:50 event -- common/autotest_common.sh@10 -- # set +x 00:10:36.869 ************************************ 00:10:36.869 END TEST event 00:10:36.869 ************************************ 00:10:36.869 06:26:50 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:10:36.869 06:26:50 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:36.869 06:26:50 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:36.869 06:26:50 -- common/autotest_common.sh@10 -- # set +x 00:10:36.869 ************************************ 00:10:36.869 START TEST thread 00:10:36.869 ************************************ 00:10:36.869 06:26:50 thread -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:10:37.128 * Looking for test storage... 00:10:37.128 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:10:37.128 06:26:50 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:10:37.128 06:26:50 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:10:37.128 06:26:50 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:37.128 06:26:50 thread -- common/autotest_common.sh@10 -- # set +x 00:10:37.128 ************************************ 00:10:37.128 START TEST thread_poller_perf 00:10:37.128 ************************************ 00:10:37.128 06:26:50 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:10:37.128 [2024-07-25 06:26:50.507783] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:37.128 [2024-07-25 06:26:50.507850] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1057629 ] 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:37.128 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:37.128 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:37.128 [2024-07-25 06:26:50.643406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:37.387 [2024-07-25 06:26:50.687947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.387 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:10:38.322 ====================================== 00:10:38.322 busy:2514153218 (cyc) 00:10:38.322 total_run_count: 290000 00:10:38.322 tsc_hz: 2500000000 (cyc) 00:10:38.322 ====================================== 00:10:38.322 poller_cost: 8669 (cyc), 3467 (nsec) 00:10:38.322 00:10:38.322 real 0m1.280s 00:10:38.322 user 0m1.137s 00:10:38.322 sys 0m0.138s 00:10:38.322 06:26:51 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:38.322 06:26:51 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:10:38.322 ************************************ 00:10:38.322 END TEST thread_poller_perf 00:10:38.322 ************************************ 00:10:38.322 06:26:51 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:10:38.322 06:26:51 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:10:38.322 06:26:51 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:38.322 06:26:51 thread -- common/autotest_common.sh@10 -- # set +x 00:10:38.322 ************************************ 00:10:38.322 START TEST thread_poller_perf 00:10:38.322 ************************************ 00:10:38.322 06:26:51 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:10:38.322 [2024-07-25 06:26:51.866181] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
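The poller_perf summary embedded above is consistent with a simple ratio: poller_cost is the busy cycle count divided by total_run_count, converted to nanoseconds through tsc_hz. Re-deriving the first run's numbers as a sanity check (this is arithmetic on the printed figures, not part of the test):

  # Re-derive poller_cost from the figures printed above.
  busy=2514153218; runs=290000; tsc_hz=2500000000
  awk -v b="$busy" -v r="$runs" -v hz="$tsc_hz" 'BEGIN {
      cyc = b / r                                                        # ~8669 cycles per poller invocation
      printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, cyc / hz * 1e9   # ~3467 ns
  }'

With a batch of 1000 pollers and a 1 microsecond period, the one-second run completes about 290000 poller invocations, i.e. roughly 8669 of the 2.5 GHz TSC cycles per invocation, which matches the printed poller_cost line.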
00:10:38.322 [2024-07-25 06:26:51.866239] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1057821 ] 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:38.581 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:38.581 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:38.581 [2024-07-25 06:26:51.998047] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:38.581 [2024-07-25 06:26:52.041348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:38.581 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:10:39.959 ====================================== 00:10:39.959 busy:2502753826 (cyc) 00:10:39.959 total_run_count: 3800000 00:10:39.959 tsc_hz: 2500000000 (cyc) 00:10:39.959 ====================================== 00:10:39.959 poller_cost: 658 (cyc), 263 (nsec) 00:10:39.959 00:10:39.959 real 0m1.267s 00:10:39.959 user 0m1.117s 00:10:39.959 sys 0m0.143s 00:10:39.959 06:26:53 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:39.959 06:26:53 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:10:39.959 ************************************ 00:10:39.959 END TEST thread_poller_perf 00:10:39.959 ************************************ 00:10:39.959 06:26:53 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:10:39.959 00:10:39.959 real 0m2.818s 00:10:39.959 user 0m2.365s 00:10:39.959 sys 0m0.464s 00:10:39.959 06:26:53 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:39.959 06:26:53 thread -- common/autotest_common.sh@10 -- # set +x 00:10:39.959 ************************************ 00:10:39.959 END TEST thread 00:10:39.959 ************************************ 00:10:39.959 06:26:53 -- spdk/autotest.sh@184 -- # [[ 1 -eq 1 ]] 00:10:39.959 06:26:53 -- spdk/autotest.sh@185 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:10:39.959 06:26:53 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:39.959 06:26:53 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:39.959 06:26:53 -- common/autotest_common.sh@10 -- # set +x 00:10:39.959 ************************************ 00:10:39.959 START TEST accel 00:10:39.959 ************************************ 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:10:39.959 * Looking for test storage... 
00:10:39.959 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:10:39.959 06:26:53 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:10:39.959 06:26:53 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:10:39.959 06:26:53 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:10:39.959 06:26:53 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1058097 00:10:39.959 06:26:53 accel -- accel/accel.sh@63 -- # waitforlisten 1058097 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@831 -- # '[' -z 1058097 ']' 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:39.959 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:39.959 06:26:53 accel -- common/autotest_common.sh@10 -- # set +x 00:10:39.959 06:26:53 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:10:39.959 06:26:53 accel -- accel/accel.sh@61 -- # build_accel_config 00:10:39.959 06:26:53 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:39.959 06:26:53 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:39.959 06:26:53 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:39.959 06:26:53 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:39.959 06:26:53 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:39.959 06:26:53 accel -- accel/accel.sh@40 -- # local IFS=, 00:10:39.959 06:26:53 accel -- accel/accel.sh@41 -- # jq -r . 00:10:39.959 [2024-07-25 06:26:53.394315] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
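The waitforlisten step above simply blocks until the spdk_tgt process that was just launched starts answering RPCs on /var/tmp/spdk.sock. One way such a wait can be implemented is to poll a cheap RPC until it succeeds while checking that the target is still alive; this is an illustrative sketch only, not the actual helper from autotest_common.sh:

    wait_for_rpc() {                      # usage: wait_for_rpc <pid> [socket]
        local pid=$1 sock=${2:-/var/tmp/spdk.sock}
        for _ in $(seq 1 100); do
            kill -0 "$pid" 2>/dev/null || return 1          # target exited early
            if scripts/rpc.py -s "$sock" rpc_get_methods &>/dev/null; then
                return 0                                    # RPC server is listening
            fi
            sleep 0.1
        done
        return 1                                            # gave up after ~10 seconds
    }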
00:10:39.959 [2024-07-25 06:26:53.394377] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1058097 ] 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices; EAL: Requested devices 0000:3d:01.0 through 0000:3f:01.6 cannot be used (one message pair per device)
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:39.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:39.959 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:40.218 [2024-07-25 06:26:53.531584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.218 [2024-07-25 06:26:53.575594] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.154 06:26:54 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:41.154 06:26:54 accel -- common/autotest_common.sh@864 -- # return 0 00:10:41.154 06:26:54 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:10:41.154 06:26:54 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:10:41.154 06:26:54 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:10:41.154 06:26:54 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:10:41.154 06:26:54 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:10:41.154 06:26:54 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:10:41.154 06:26:54 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:10:41.154 06:26:54 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.154 06:26:54 accel -- common/autotest_common.sh@10 -- # set +x 00:10:41.154 06:26:54 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.154 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.154 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.154 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.154 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.154 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.154 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.154 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.154 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 
06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # IFS== 00:10:41.155 06:26:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:41.155 06:26:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:41.155 06:26:54 accel -- accel/accel.sh@75 -- # killprocess 1058097 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@950 -- # '[' -z 1058097 ']' 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@954 -- # kill -0 1058097 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@955 -- # uname 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1058097 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1058097' 00:10:41.155 killing process with pid 1058097 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@969 -- # kill 1058097 00:10:41.155 06:26:54 accel -- common/autotest_common.sh@974 -- # wait 1058097 00:10:41.723 06:26:54 accel -- accel/accel.sh@76 -- # trap - ERR 00:10:41.723 06:26:54 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:10:41.723 06:26:54 accel -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:41.723 06:26:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:41.723 06:26:54 accel -- common/autotest_common.sh@10 -- # set +x 00:10:41.723 06:26:55 accel.accel_help -- common/autotest_common.sh@1125 -- # accel_perf -h 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:10:41.723 06:26:55 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
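The get_expected_opcs fragment traced above (accel.sh lines 70-73) turns the accel_get_opc_assignments RPC reply into a bash associative array: jq flattens the JSON object into key=value words, and each word is split on '=' so the array records which module, here always software, is expected to service each opcode. A condensed sketch of that pattern, using the same RPC and jq filter as the trace (array and variable names shortened here):

    declare -A expected_opcs
    mapfile -t pairs < <(scripts/rpc.py -s /var/tmp/spdk.sock accel_get_opc_assignments |
        jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]')
    for pair in "${pairs[@]}"; do
        IFS== read -r opc module <<< "$pair"    # e.g. "crc32c=software"
        expected_opcs["$opc"]=$module
    done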
00:10:41.723 06:26:55 accel.accel_help -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:41.723 06:26:55 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:10:41.723 06:26:55 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:10:41.723 06:26:55 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:10:41.723 06:26:55 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:41.723 06:26:55 accel -- common/autotest_common.sh@10 -- # set +x 00:10:41.723 ************************************ 00:10:41.723 START TEST accel_missing_filename 00:10:41.723 ************************************ 00:10:41.723 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # local es=0 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # type -t accel_perf 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:41.724 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:10:41.724 06:26:55 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:10:41.724 [2024-07-25 06:26:55.177489] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
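run_test accel_missing_filename wraps accel_perf in NOT: a compress workload launched without -l has no input file, so the run is expected to fail, and the wrapper inverts the exit status so that the expected failure counts as a pass. A minimal sketch of such an inverter (illustrative only; the real NOT in autotest_common.sh does more bookkeeping, visible later in this log as the es=234 / es=106 / es=1 steps):

    NOT() {                            # succeed only if the wrapped command fails
        local es=0
        "$@" || es=$?
        (( es > 128 )) && es=$(( es - 128 ))     # fold signal exits back to a small code
        (( es != 0 ))                  # exit 0 (pass) only when the command failed
    }
    # usage: NOT accel_perf -t 1 -w compress    # passes, because -l is missing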
00:10:41.724 [2024-07-25 06:26:55.177544] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1058517 ] 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices; EAL: Requested devices 0000:3d:01.0 through 0000:3f:01.6 cannot be used (one message pair per device)
00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:41.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:41.724 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:41.983 [2024-07-25 06:26:55.311842] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.983 [2024-07-25 06:26:55.355608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.983 [2024-07-25 06:26:55.416830] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:41.983 [2024-07-25 06:26:55.482775] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:10:42.242 A filename is required. 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # es=234 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # es=106 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@663 -- # case "$es" in 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@670 -- # es=1 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:42.242 00:10:42.242 real 0m0.406s 00:10:42.242 user 0m0.234s 00:10:42.242 sys 0m0.189s 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:42.242 06:26:55 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:10:42.242 ************************************ 00:10:42.242 END TEST accel_missing_filename 00:10:42.242 ************************************ 00:10:42.242 06:26:55 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:42.242 06:26:55 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:10:42.242 06:26:55 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:42.242 06:26:55 accel -- common/autotest_common.sh@10 -- # set +x 00:10:42.242 ************************************ 00:10:42.242 START TEST accel_compress_verify 00:10:42.242 ************************************ 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:42.242 06:26:55 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # local es=0 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # type -t accel_perf 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.242 06:26:55 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:10:42.242 06:26:55 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:10:42.242 [2024-07-25 06:26:55.655743] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:42.242 [2024-07-25 06:26:55.655798] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1058654 ] 00:10:42.242 qat_pci_device_allocate(): Reached maximum number of QAT devices; EAL: Requested devices 0000:3d:01.0 through 0000:3f:01.6 cannot be used (one message pair per device)
00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:42.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:42.243 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:42.243 [2024-07-25 06:26:55.788457] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:42.502 [2024-07-25 06:26:55.831732] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:42.502 [2024-07-25 06:26:55.893799] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:42.502 [2024-07-25 06:26:55.956875] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:10:42.502 00:10:42.502 Compression does not support the verify option, aborting. 00:10:42.502 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # es=161 00:10:42.502 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:42.502 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # es=33 00:10:42.502 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@663 -- # case "$es" in 00:10:42.503 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@670 -- # es=1 00:10:42.503 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:42.503 00:10:42.503 real 0m0.401s 00:10:42.503 user 0m0.245s 00:10:42.503 sys 0m0.181s 00:10:42.503 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:42.503 06:26:56 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:10:42.503 ************************************ 00:10:42.503 END TEST accel_compress_verify 00:10:42.503 ************************************ 00:10:42.762 06:26:56 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:10:42.762 06:26:56 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:10:42.762 06:26:56 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:42.762 06:26:56 accel -- common/autotest_common.sh@10 -- # set +x 00:10:42.762 ************************************ 00:10:42.762 START TEST accel_wrong_workload 00:10:42.762 ************************************ 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w foobar 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # local es=0 00:10:42.763 06:26:56 
accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # type -t accel_perf 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:10:42.763 06:26:56 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:10:42.763 Unsupported workload type: foobar 00:10:42.763 [2024-07-25 06:26:56.140847] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:10:42.763 accel_perf options: 00:10:42.763 [-h help message] 00:10:42.763 [-q queue depth per core] 00:10:42.763 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:10:42.763 [-T number of threads per core 00:10:42.763 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:10:42.763 [-t time in seconds] 00:10:42.763 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:10:42.763 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:10:42.763 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:10:42.763 [-l for compress/decompress workloads, name of uncompressed input file 00:10:42.763 [-S for crc32c workload, use this seed value (default 0) 00:10:42.763 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:10:42.763 [-f for fill workload, use this BYTE value (default 255) 00:10:42.763 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:10:42.763 [-y verify result if this switch is on] 00:10:42.763 [-a tasks to allocate per core (default: same value as -q)] 00:10:42.763 Can be used to spread operations across a wider range of memory. 
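For reference against the option listing above, these are the accel_perf invocations this suite actually exercises in the surrounding tests; $SPDK_DIR stands in for /var/jenkins/workspace/crypto-phy-autotest/spdk. The first two are deliberately invalid and are expected to fail, the third is a valid run:

    # expected to fail: compress without -l has no input file
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress
    # expected to fail: the compress workload does not support result verification (-y)
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress -l "$SPDK_DIR/test/accel/bib" -y
    # valid: CRC-32C for 1 second, seed 32, with result verification
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w crc32c -S 32 -y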
00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # es=1 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:42.763 00:10:42.763 real 0m0.042s 00:10:42.763 user 0m0.023s 00:10:42.763 sys 0m0.019s 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:42.763 06:26:56 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:10:42.763 ************************************ 00:10:42.763 END TEST accel_wrong_workload 00:10:42.763 ************************************ 00:10:42.763 Error: writing output failed: Broken pipe 00:10:42.763 06:26:56 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:10:42.763 06:26:56 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:10:42.763 06:26:56 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:42.763 06:26:56 accel -- common/autotest_common.sh@10 -- # set +x 00:10:42.763 ************************************ 00:10:42.763 START TEST accel_negative_buffers 00:10:42.763 ************************************ 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # local es=0 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # type -t accel_perf 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:10:42.763 06:26:56 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:10:42.763 -x option must be non-negative. 
00:10:42.763 [2024-07-25 06:26:56.268352] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:10:42.763 accel_perf options: 00:10:42.763 [-h help message] 00:10:42.763 [-q queue depth per core] 00:10:42.763 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:10:42.763 [-T number of threads per core 00:10:42.763 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:10:42.763 [-t time in seconds] 00:10:42.763 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:10:42.763 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:10:42.763 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:10:42.763 [-l for compress/decompress workloads, name of uncompressed input file 00:10:42.763 [-S for crc32c workload, use this seed value (default 0) 00:10:42.763 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:10:42.763 [-f for fill workload, use this BYTE value (default 255) 00:10:42.763 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:10:42.763 [-y verify result if this switch is on] 00:10:42.763 [-a tasks to allocate per core (default: same value as -q)] 00:10:42.763 Can be used to spread operations across a wider range of memory. 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # es=1 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:42.763 00:10:42.763 real 0m0.044s 00:10:42.763 user 0m0.025s 00:10:42.763 sys 0m0.018s 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:42.763 06:26:56 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:10:42.763 ************************************ 00:10:42.763 END TEST accel_negative_buffers 00:10:42.763 ************************************ 00:10:42.763 Error: writing output failed: Broken pipe 00:10:42.763 06:26:56 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:10:42.763 06:26:56 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:10:42.763 06:26:56 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:42.763 06:26:56 accel -- common/autotest_common.sh@10 -- # set +x 00:10:43.023 ************************************ 00:10:43.023 START TEST accel_crc32c 00:10:43.023 ************************************ 00:10:43.023 06:26:56 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -S 32 -y 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
crc32c -S 32 -y 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:10:43.023 06:26:56 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:10:43.023 [2024-07-25 06:26:56.394547] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:43.023 [2024-07-25 06:26:56.394614] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1058723 ] 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:43.023 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.023 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:10:43.024 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:43.024 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:43.024 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:43.024 [2024-07-25 06:26:56.527729] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.024 [2024-07-25 06:26:56.572584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val= 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.283 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:43.284 06:26:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:10:44.221 06:26:57 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:44.221 00:10:44.221 real 0m1.410s 00:10:44.221 user 0m0.010s 00:10:44.221 sys 0m0.001s 00:10:44.221 06:26:57 accel.accel_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:44.221 06:26:57 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:10:44.221 ************************************ 00:10:44.221 END TEST accel_crc32c 00:10:44.221 ************************************ 00:10:44.481 06:26:57 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:10:44.481 06:26:57 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:10:44.481 06:26:57 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:44.481 06:26:57 accel -- common/autotest_common.sh@10 -- # set +x 
00:10:44.481 ************************************ 00:10:44.481 START TEST accel_crc32c_C2 00:10:44.481 ************************************ 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -y -C 2 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:10:44.481 06:26:57 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:10:44.481 [2024-07-25 06:26:57.880231] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:44.481 [2024-07-25 06:26:57.880286] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1059000 ] 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:44.481 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:44.481 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:44.481 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:44.481 [2024-07-25 06:26:58.013379] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:44.740 [2024-07-25 06:26:58.057247] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 
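The repeated qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device ... cannot be used pairs above are printed once per QAT function that EAL is asked to use at startup; they appear to indicate that the DPDK QAT driver has hit its device limit for the 0000:3d:xx.x / 0000:3f:xx.x functions, so these runs do not get a hardware QAT path. That is consistent with the rest of the trace, where every test settles on accel_module=software and the closing [[ software == software ]] check passes. A quick way to confirm the same thing across the whole console output, assuming it has been saved to a file (build.log here is an illustrative name), would be:

  # count how often each accel module was selected, and how often QAT allocation was refused
  grep -oE 'accel_module=[a-z]+' build.log | sort | uniq -c
  grep -c 'Reached maximum number of QAT devices' build.log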
00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.740 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:44.741 06:26:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:46.118 
06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:46.118 00:10:46.118 real 0m1.415s 00:10:46.118 user 0m0.008s 00:10:46.118 sys 0m0.000s 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:46.118 06:26:59 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:10:46.118 ************************************ 00:10:46.118 END TEST accel_crc32c_C2 00:10:46.118 ************************************ 00:10:46.118 06:26:59 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:10:46.118 06:26:59 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:10:46.118 06:26:59 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:46.118 06:26:59 accel -- common/autotest_common.sh@10 -- # set +x 00:10:46.119 ************************************ 00:10:46.119 START TEST accel_copy 00:10:46.119 ************************************ 00:10:46.119 06:26:59 accel.accel_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy -y 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:10:46.119 [2024-07-25 06:26:59.369660] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:46.119 [2024-07-25 06:26:59.369719] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1059288 ] 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:46.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:46.119 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:46.119 [2024-07-25 06:26:59.504732] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:46.119 [2024-07-25 06:26:59.546791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 
06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:46.119 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:46.120 06:26:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 
-- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:10:47.497 06:27:00 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:47.497 00:10:47.497 real 0m1.412s 00:10:47.497 user 0m0.008s 00:10:47.497 sys 0m0.002s 00:10:47.497 06:27:00 accel.accel_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:47.497 06:27:00 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:10:47.497 ************************************ 00:10:47.497 END TEST accel_copy 00:10:47.497 ************************************ 00:10:47.497 06:27:00 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:47.497 06:27:00 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:47.497 06:27:00 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:47.497 06:27:00 accel -- common/autotest_common.sh@10 -- # set +x 00:10:47.497 ************************************ 00:10:47.497 START TEST accel_fill 00:10:47.497 ************************************ 00:10:47.497 06:27:00 accel.accel_fill -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:47.497 06:27:00 
accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:10:47.497 06:27:00 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:10:47.497 [2024-07-25 06:27:00.859049] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:47.497 [2024-07-25 06:27:00.859116] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1059621 ] 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested 
device 0000:3f:01.4 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:47.497 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:47.497 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:47.497 [2024-07-25 06:27:00.992892] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.497 [2024-07-25 06:27:01.036924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 
00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:47.757 06:27:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:47.757 06:27:01 
accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:10:48.694 06:27:02 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:48.694 00:10:48.694 real 0m1.415s 00:10:48.694 user 0m0.009s 00:10:48.694 sys 0m0.004s 00:10:48.694 06:27:02 accel.accel_fill -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:48.694 06:27:02 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:10:48.694 ************************************ 00:10:48.694 END TEST accel_fill 00:10:48.694 ************************************ 00:10:48.953 06:27:02 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:10:48.953 06:27:02 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:10:48.953 06:27:02 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:48.953 06:27:02 accel -- common/autotest_common.sh@10 -- # set +x 00:10:48.953 ************************************ 00:10:48.953 START TEST accel_copy_crc32c 00:10:48.953 ************************************ 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- 
accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:10:48.953 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:10:48.953 [2024-07-25 06:27:02.354419] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:48.953 [2024-07-25 06:27:02.354476] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1059975 ] 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:48.953 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:48.953 
EAL: Requested device 0000:3d:02.7 cannot be used
00:10:48.953-00:10:48.954 [qat_pci_device_allocate(): "Reached maximum number of QAT devices" followed by "EAL: Requested device ... cannot be used" repeats for every remaining QAT device from 0000:3f:01.0 through 0000:3f:02.7]
00:10:48.954 [2024-07-25 06:27:02.489322] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:49.212 [2024-07-25 06:27:02.532966] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:49.212 06:27:02 accel.accel_copy_crc32c -- accel/accel.sh -- # settings read loop (IFS=:; read -r var val; case "$var" in ...); non-empty values in order: 0x1, copy_crc32c (accel_opc=copy_crc32c), 0, '4096 bytes', '4096 bytes', software (accel_module=software), 32, 32, 1, '1 seconds', Yes
00:10:50.213 06:27:03 accel.accel_copy_crc32c -- accel/accel.sh -- # remaining empty 'val=' reads after the timed run finishes
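The settings read loop traced above follows a simple pattern: accel_perf reports its configuration as "name: value" pairs, and the test reads them back with IFS=: into $var / $val, keeping the opcode and the module that ended up servicing it. The snippet below is a minimal sketch of that pattern only, not the actual accel.sh source; the parse_accel_settings helper and the matched field names are invented for illustration.

#!/usr/bin/env bash
# Minimal sketch (not the real accel.sh) of the 'val=' read loop seen in the
# trace above: read "name: value" pairs, remember the opcode and the module.
parse_accel_settings() {                       # hypothetical helper name
    local accel_opc="" accel_module=""
    local var val
    while IFS=: read -r var val; do
        case "$var" in
            *opcode*) accel_opc=${val//[[:space:]]/} ;;    # e.g. copy_crc32c
            *module*) accel_module=${val//[[:space:]]/} ;; # e.g. software
        esac
    done
    # The later [[ -n ... ]] checks in the log assert both were actually seen.
    echo "opcode=${accel_opc:-unset} module=${accel_module:-unset}"
}

# Usage sketch with fake input instead of a real accel_perf run:
printf 'opcode: copy_crc32c\nmodule: software\n' | parse_accel_settings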
00:10:50.213 06:27:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]]
00:10:50.213 06:27:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]]
00:10:50.213 06:27:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:10:50.213 real 0m1.420s
00:10:50.213 user 0m1.240s
00:10:50.213 sys 0m0.182s
00:10:50.213 06:27:03 accel.accel_copy_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable
00:10:50.213 06:27:03 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x
00:10:50.213 ************************************
00:10:50.213 END TEST accel_copy_crc32c
00:10:50.213 ************************************
00:10:50.471 06:27:03 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2
00:10:50.471 06:27:03 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']'
00:10:50.471 06:27:03 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:10:50.471 06:27:03 accel -- common/autotest_common.sh@10 -- # set +x
00:10:50.471 ************************************
00:10:50.471 START TEST accel_copy_crc32c_C2
00:10:50.471 ************************************
00:10:50.471 06:27:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y -C 2
00:10:50.471 06:27:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh -- # [accel_test setup trace: local accel_opc / local accel_module, IFS=:, read -r var val, accel_perf -t 1 -w copy_crc32c -y -C 2]
00:10:50.471 06:27:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2
00:10:50.472 06:27:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh -- # [build_accel_config trace: accel_json_cfg=(), [[ 0 -gt 0 ]] checks, [[ -n '' ]], local IFS=,, jq -r .]
00:10:50.472 [2024-07-25 06:27:03.858921] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
00:10:50.472 [2024-07-25 06:27:03.858978] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1060258 ]
00:10:50.472 [qat_pci_device_allocate(): "Reached maximum number of QAT devices" / "EAL: Requested device ... cannot be used" repeats for every QAT device from 0000:3d:01.0 through 0000:3f:02.7]
00:10:50.472 [2024-07-25 06:27:03.992773] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:50.731 [2024-07-25 06:27:04.037664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:50.731 06:27:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh -- # settings read loop; non-empty values in order: 0x1, copy_crc32c (accel_opc=copy_crc32c), 0, '4096 bytes', '8192 bytes', software (accel_module=software), 32, 32, 1, '1 seconds', Yes
00:10:52.109 06:27:05 accel.accel_copy_crc32c_C2 -- accel/accel.sh -- # remaining empty 'val=' reads after the timed run finishes
00:10:52.109 06:27:05 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]]
00:10:52.109 06:27:05 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]]
00:10:52.109 06:27:05 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:10:52.109 real 0m1.418s
00:10:52.109 user 0m1.221s
00:10:52.109 sys 0m0.199s
00:10:52.109 06:27:05 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable
00:10:52.109 06:27:05 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x
00:10:52.109 ************************************
00:10:52.109 END TEST accel_copy_crc32c_C2
00:10:52.109 ************************************
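Each END TEST / START TEST banner pair and the real/user/sys triple come from the run_test wrapper in the common/autotest_common.sh referenced above, which times the test body between banners. What follows is only a rough sketch of that wrapper's visible behaviour; the real helper also manages xtrace and failure bookkeeping, and its body is usually a shell function such as accel_test rather than a standalone command.

#!/usr/bin/env bash
# Rough sketch of a run_test-style wrapper: banner, timed body, banner.
# Reproduces only the output shape seen above (START/END TEST banners plus the
# real/user/sys lines printed by the `time` keyword); not the real helper.
run_test_sketch() {
    local name=$1
    shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return "$rc"
}

# Usage sketch (in the real harness the body would be the accel_test function):
run_test_sketch demo_sleep sleep 0.1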
00:10:52.109 06:27:05 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y
00:10:52.109 06:27:05 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']'
00:10:52.109 06:27:05 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:10:52.109 06:27:05 accel -- common/autotest_common.sh@10 -- # set +x
00:10:52.109 ************************************
00:10:52.109 START TEST accel_dualcast
00:10:52.109 ************************************
00:10:52.109 06:27:05 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dualcast -y
00:10:52.109 06:27:05 accel.accel_dualcast -- accel/accel.sh -- # [accel_test setup trace: local accel_opc / local accel_module, IFS=:, read -r var val, accel_perf -t 1 -w dualcast -y]
00:10:52.109 06:27:05 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y
00:10:52.109 06:27:05 accel.accel_dualcast -- accel/accel.sh -- # [build_accel_config trace: accel_json_cfg=(), [[ 0 -gt 0 ]] checks, [[ -n '' ]], local IFS=,, jq -r .]
00:10:52.109 [2024-07-25 06:27:05.352544] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
00:10:52.109 [2024-07-25 06:27:05.352607] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1060872 ]
00:10:52.109 [qat_pci_device_allocate(): "Reached maximum number of QAT devices" / "EAL: Requested device ... cannot be used" repeats for every QAT device from 0000:3d:01.0 through 0000:3f:02.7]
00:10:52.109 [2024-07-25 06:27:05.484213] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:52.109 [2024-07-25 06:27:05.527313] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:52.110 06:27:05 accel.accel_dualcast -- accel/accel.sh -- # settings read loop; non-empty values in order: 0x1, dualcast (accel_opc=dualcast), '4096 bytes', software (accel_module=software), 32, 32, 1, '1 seconds', Yes
00:10:53.484 06:27:06 accel.accel_dualcast -- accel/accel.sh -- # remaining empty 'val=' reads after the timed run finishes
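Every accel_perf invocation in this log receives its configuration through "-c /dev/fd/62": build_accel_config assembles a JSON document (the accel_json_cfg=() array, local IFS=, and jq -r . entries in the traces) and hands it to the binary over a process-substitution file descriptor rather than a temp file. The sketch below illustrates that mechanism only; the JSON layout and the build_cfg helper name are illustrative assumptions, not the exact accel.sh contents.

#!/usr/bin/env bash
# Sketch of feeding a generated JSON config to accel_perf without a temp file.
# With nothing enabled the config stays empty, as in this software-only run.
accel_json_cfg=()            # collected JSON fragments, empty by default

build_cfg() {                # hypothetical helper name
    local IFS=,
    # Join the collected fragments into one document and pretty-print it with jq,
    # mirroring the final 'jq -r .' step visible in the traces above.
    jq -r . <<< "{\"subsystems\":[{\"subsystem\":\"accel\",\"config\":[${accel_json_cfg[*]}]}]}"
}

# accel_perf reads the config from a pipe exposed as /dev/fd/<n> by process
# substitution, which is why every invocation above shows '-c /dev/fd/62':
#   ./build/examples/accel_perf -c <(build_cfg) -t 1 -w dualcast -y
build_cfg                    # print the generated JSON for inspection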
00:10:53.485 06:27:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]]
00:10:53.485 06:27:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]]
00:10:53.485 06:27:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:10:53.485 real 0m1.404s
00:10:53.485 user 0m1.226s
00:10:53.485 sys 0m0.183s
00:10:53.485 06:27:06 accel.accel_dualcast -- common/autotest_common.sh@1126 -- # xtrace_disable
00:10:53.485 06:27:06 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x
00:10:53.485 ************************************
00:10:53.485 END TEST accel_dualcast
00:10:53.485 ************************************
00:10:53.485 06:27:06 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y
00:10:53.485 06:27:06 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']'
00:10:53.485 06:27:06 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:10:53.485 06:27:06 accel -- common/autotest_common.sh@10 -- # set +x
00:10:53.485 ************************************
00:10:53.485 START TEST accel_compare
00:10:53.485 ************************************
00:10:53.485 06:27:06 accel.accel_compare -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compare -y
00:10:53.485 06:27:06 accel.accel_compare -- accel/accel.sh -- # [accel_test setup trace: local accel_opc / local accel_module, IFS=:, read -r var val, accel_perf -t 1 -w compare -y]
00:10:53.485 06:27:06 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y
00:10:53.485 06:27:06 accel.accel_compare -- accel/accel.sh -- # [build_accel_config trace: accel_json_cfg=(), [[ 0 -gt 0 ]] checks, [[ -n '' ]], local IFS=,, jq -r .]
00:10:53.485 [2024-07-25 06:27:06.849200] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
00:10:53.485 [2024-07-25 06:27:06.849270] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1061238 ]
00:10:53.485 [qat_pci_device_allocate(): "Reached maximum number of QAT devices" / "EAL: Requested device ... cannot be used" repeats for every QAT device from 0000:3d:01.0 through 0000:3f:02.7]
00:10:53.485 [2024-07-25 06:27:06.983280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:53.744 [2024-07-25 06:27:07.026613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:53.744 06:27:07 accel.accel_compare -- accel/accel.sh -- # settings read loop; non-empty values in order: 0x1, compare (accel_opc=compare), '4096 bytes', software (accel_module=software), 32, 32, 1, '1 seconds', Yes
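The run_test calls seen so far (accel/accel.sh@106 through @108, with accel_xor following below at @109) step through the software-path workloads one after another. A small driver loop in the same spirit is sketched here, using only the flags visible in this log (-t, -w, -y); it is a simplification, not the accel.sh test list, and the -C 2 argument of the copy_crc32c_C2 case is deliberately left out.

#!/usr/bin/env bash
# Hedged sketch of driving several accel_perf workloads back to back, mirroring
# the run_test accel_copy_crc32c / accel_dualcast / accel_compare / accel_xor
# sequence in this log. Binary path is the one the traces show.
set -e
PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf

for wl in copy_crc32c dualcast compare xor; do
    echo "=== accel_perf workload: $wl ==="
    "$PERF" -t 1 -w "$wl" -y
done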
00:10:54.680 06:27:08 accel.accel_compare -- accel/accel.sh -- # remaining empty 'val=' reads after the timed run finishes
00:10:54.680 06:27:08 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]]
00:10:54.680 06:27:08 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]]
00:10:54.680 06:27:08 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:10:54.680 real 0m1.417s
00:10:54.680 user 0m1.225s
00:10:54.680 sys 0m0.190s
00:10:54.680 06:27:08 accel.accel_compare -- common/autotest_common.sh@1126 -- # xtrace_disable
00:10:54.680 06:27:08 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x
00:10:54.680 ************************************
00:10:54.680 END TEST accel_compare
00:10:54.680 ************************************
00:10:54.939 06:27:08 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y
00:10:54.940 06:27:08 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']'
00:10:54.940 06:27:08 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:10:54.940 06:27:08 accel -- common/autotest_common.sh@10 -- # set +x
00:10:54.940 ************************************
00:10:54.940 START TEST accel_xor
00:10:54.940 ************************************
00:10:54.940 06:27:08 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y
00:10:54.940 06:27:08 accel.accel_xor -- accel/accel.sh -- # [accel_test setup trace: local accel_opc / local accel_module, IFS=:, read -r var val, accel_perf -t 1 -w xor -y]
00:10:54.940 06:27:08 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y
00:10:54.940 06:27:08 accel.accel_xor -- accel/accel.sh -- # [build_accel_config trace: accel_json_cfg=(), [[ 0 -gt 0 ]] checks, [[ -n '' ]], local IFS=,, jq -r .]
00:10:54.940 [2024-07-25 06:27:08.351527] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
00:10:54.940 [2024-07-25 06:27:08.351579] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1061527 ]
00:10:54.940 [qat_pci_device_allocate(): "Reached maximum number of QAT devices" / "EAL: Requested device ... cannot be used" repeats for every QAT device from 0000:3d:01.0 through 0000:3f:02.7]
00:10:54.940 [2024-07-25 06:27:08.484182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:55.199 [2024-07-25 06:27:08.527487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:55.199 06:27:08 accel.accel_xor -- accel/accel.sh -- # settings read loop; non-empty values so far: 0x1, xor (accel_opc=xor), 2, '4096 bytes', software (accel_module=software), 32, 32, 1, '1 seconds'
var val 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:55.200 06:27:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:56.576 00:10:56.576 real 0m1.411s 00:10:56.576 user 0m1.226s 00:10:56.576 sys 0m0.192s 00:10:56.576 06:27:09 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:56.576 06:27:09 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:10:56.576 ************************************ 00:10:56.576 END TEST accel_xor 00:10:56.576 ************************************ 00:10:56.576 06:27:09 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:10:56.576 06:27:09 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:10:56.576 
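For readers reproducing this step by hand: the xor pass above is simply the accel_perf example driven by accel.sh, and the next run repeats it with three source buffers (-x 3). A minimal standalone sketch, assuming the SPDK tree at the workspace path shown in the log; the -c /dev/fd/62 accel config is left out because build_accel_config keeps it empty in this software-only run (accel_json_cfg=() above):

    # hypothetical manual invocation; flags mirror the logged accel_perf command
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK"/build/examples/accel_perf -t 1 -w xor -y        # xor for 1 second, verify the result (-y)
    "$SPDK"/build/examples/accel_perf -t 1 -w xor -y -x 3   # same workload with 3 xor source buffers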
06:27:09 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:56.576 06:27:09 accel -- common/autotest_common.sh@10 -- # set +x 00:10:56.576 ************************************ 00:10:56.576 START TEST accel_xor 00:10:56.576 ************************************ 00:10:56.576 06:27:09 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y -x 3 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:10:56.576 06:27:09 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:10:56.576 [2024-07-25 06:27:09.840366] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:10:56.576 [2024-07-25 06:27:09.840418] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1061804 ] 00:10:56.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.576 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:56.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.576 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:56.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.576 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:56.576 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.576 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.2 cannot be 
used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.6 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:56.577 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:56.577 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:56.577 [2024-07-25 06:27:09.973672] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:56.577 [2024-07-25 06:27:10.017376] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 
00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:56.577 06:27:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:10:57.951 06:27:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:57.951 00:10:57.951 real 0m1.409s 00:10:57.951 user 0m1.225s 00:10:57.951 sys 0m0.182s 00:10:57.951 06:27:11 accel.accel_xor -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:10:57.951 06:27:11 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:10:57.951 ************************************ 00:10:57.951 END TEST accel_xor 00:10:57.951 ************************************ 00:10:57.951 06:27:11 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:10:57.951 06:27:11 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:10:57.951 06:27:11 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:57.951 06:27:11 accel -- common/autotest_common.sh@10 -- # set +x 00:10:57.951 ************************************ 00:10:57.951 START TEST accel_dif_verify 00:10:57.951 ************************************ 00:10:57.951 06:27:11 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_verify 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:10:57.951 06:27:11 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:10:57.951 [2024-07-25 06:27:11.336637] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:57.951 [2024-07-25 06:27:11.336691] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1062093 ] 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:57.951 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:57.951 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:57.951 [2024-07-25 06:27:11.466903] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.209 [2024-07-25 06:27:11.510171] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 
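The xtrace lines above show accel_test parsing accel_perf's parameter dump: with IFS set to ':' it reads each key/value pair and records the opcode and module that actually executed. A hedged, illustrative reconstruction of that loop follows; it is not the literal accel.sh source, and the key names matched in the case arms are assumptions:

    # illustrative only -- the shape of the read/case loop visible in the trace
    accel_test_sketch() {
        local accel_opc accel_module var val
        while IFS=: read -r var val; do
            case "$var" in
                *opcode*) accel_opc=${val# } ;;    # e.g. "dif_verify" (assumed key name)
                *module*) accel_module=${val# } ;; # e.g. "software"   (assumed key name)
            esac
        done < <("$SPDK"/build/examples/accel_perf -c /dev/fd/62 "$@")
        [[ -n $accel_opc && -n $accel_module ]]    # fail if either was never reported
    }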
06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- 
# case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:58.209 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:58.210 06:27:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.141 06:27:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:59.141 06:27:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:59.141 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:59.141 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.141 06:27:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:10:59.399 06:27:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:59.399 00:10:59.399 real 0m1.402s 00:10:59.399 user 0m1.232s 00:10:59.399 sys 0m0.179s 00:10:59.399 06:27:12 accel.accel_dif_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:59.399 06:27:12 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:10:59.399 ************************************ 00:10:59.399 END TEST accel_dif_verify 00:10:59.399 ************************************ 00:10:59.399 06:27:12 accel -- accel/accel.sh@112 -- # 
run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:10:59.399 06:27:12 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:10:59.399 06:27:12 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:59.399 06:27:12 accel -- common/autotest_common.sh@10 -- # set +x 00:10:59.400 ************************************ 00:10:59.400 START TEST accel_dif_generate 00:10:59.400 ************************************ 00:10:59.400 06:27:12 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:10:59.400 06:27:12 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:10:59.400 [2024-07-25 06:27:12.820071] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:10:59.400 [2024-07-25 06:27:12.820126] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1062370 ] 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.0 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.1 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.2 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.3 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.4 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.5 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.6 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:01.7 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.0 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.1 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.2 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.3 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.4 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.5 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.6 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3d:02.7 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.0 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.1 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.2 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.3 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.4 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.5 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.6 cannot be used 
00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:01.7 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.0 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.1 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.2 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.3 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.4 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.5 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.6 cannot be used 00:10:59.400 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:10:59.400 EAL: Requested device 0000:3f:02.7 cannot be used 00:10:59.400 [2024-07-25 06:27:12.954539] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:59.658 [2024-07-25 06:27:12.997076] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:10:59.658 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.658 06:27:13 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 
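As with xor, the DIF passes differ only in the -w workload handed to accel_perf; the 4096-, 512- and 8-byte values echoed in the trace come from accel_perf's own parameter dump. A minimal sketch of running the remaining DIF workloads standalone, under the same workspace assumption as above:

    # hypothetical standalone runs of the remaining DIF workloads (software module)
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK"/build/examples/accel_perf -t 1 -w dif_generate
    "$SPDK"/build/examples/accel_perf -t 1 -w dif_generate_copy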
06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:59.659 06:27:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:11:01.029 06:27:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:01.029 00:11:01.029 real 0m1.408s 00:11:01.029 user 0m1.230s 00:11:01.029 sys 0m0.187s 00:11:01.029 06:27:14 accel.accel_dif_generate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:01.029 06:27:14 accel.accel_dif_generate -- 
common/autotest_common.sh@10 -- # set +x 00:11:01.029 ************************************ 00:11:01.029 END TEST accel_dif_generate 00:11:01.029 ************************************ 00:11:01.030 06:27:14 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:11:01.030 06:27:14 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:11:01.030 06:27:14 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:01.030 06:27:14 accel -- common/autotest_common.sh@10 -- # set +x 00:11:01.030 ************************************ 00:11:01.030 START TEST accel_dif_generate_copy 00:11:01.030 ************************************ 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate_copy 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:11:01.030 [2024-07-25 06:27:14.315599] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:01.030 [2024-07-25 06:27:14.315660] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1062655 ] 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:01.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:01.030 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:01.030 [2024-07-25 06:27:14.448596] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:01.030 [2024-07-25 06:27:14.492327] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 
00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.030 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.031 06:27:14 
accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:01.031 06:27:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:02.402 00:11:02.402 real 0m1.411s 00:11:02.402 user 0m1.230s 00:11:02.402 sys 0m0.189s 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:02.402 06:27:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:11:02.402 ************************************ 00:11:02.402 END TEST accel_dif_generate_copy 00:11:02.402 
************************************ 00:11:02.403 06:27:15 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:11:02.403 06:27:15 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:02.403 06:27:15 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:11:02.403 06:27:15 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:02.403 06:27:15 accel -- common/autotest_common.sh@10 -- # set +x 00:11:02.403 ************************************ 00:11:02.403 START TEST accel_comp 00:11:02.403 ************************************ 00:11:02.403 06:27:15 accel.accel_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:11:02.403 06:27:15 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:11:02.403 [2024-07-25 06:27:15.805462] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:02.403 [2024-07-25 06:27:15.805518] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1062936 ] 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:02.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:02.403 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:02.403 [2024-07-25 06:27:15.939096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:02.661 [2024-07-25 06:27:15.982234] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.661 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:11:02.662 
06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:02.662 06:27:16 accel.accel_comp -- 
accel/accel.sh@19 -- # IFS=: 00:11:02.662 06:27:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:11:04.035 06:27:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:04.035 00:11:04.035 real 0m1.411s 00:11:04.035 user 0m1.234s 00:11:04.035 sys 0m0.187s 00:11:04.035 06:27:17 accel.accel_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:04.035 06:27:17 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:11:04.035 ************************************ 00:11:04.035 END TEST accel_comp 00:11:04.035 ************************************ 00:11:04.035 06:27:17 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:04.035 06:27:17 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:11:04.035 06:27:17 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:04.035 06:27:17 accel -- common/autotest_common.sh@10 -- # set +x 00:11:04.035 ************************************ 00:11:04.035 START TEST accel_decomp 00:11:04.035 ************************************ 00:11:04.035 06:27:17 accel.accel_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:04.035 06:27:17 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:11:04.035 06:27:17 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:11:04.035 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # 
IFS=: 00:11:04.035 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:11:04.036 [2024-07-25 06:27:17.304476] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:04.036 [2024-07-25 06:27:17.304537] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1063215 ] 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:04.036 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:04.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:04.036 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:04.036 [2024-07-25 06:27:17.438947] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:04.036 [2024-07-25 06:27:17.483598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # 
val=1 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:04.036 06:27:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:04.037 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:04.037 06:27:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:05.448 06:27:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:05.448 06:27:18 
accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:05.448 00:11:05.448 real 0m1.416s 00:11:05.448 user 0m1.239s 00:11:05.448 sys 0m0.186s 00:11:05.448 06:27:18 accel.accel_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:05.448 06:27:18 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:11:05.448 ************************************ 00:11:05.448 END TEST accel_decomp 00:11:05.448 ************************************ 00:11:05.448 06:27:18 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:05.448 06:27:18 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:11:05.448 06:27:18 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:05.448 06:27:18 accel -- common/autotest_common.sh@10 -- # set +x 00:11:05.448 ************************************ 00:11:05.448 START TEST accel_decomp_full 00:11:05.448 ************************************ 00:11:05.448 06:27:18 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:11:05.448 06:27:18 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:11:05.448 [2024-07-25 06:27:18.801755] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:05.448 [2024-07-25 06:27:18.801814] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1063502 ] 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.448 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:05.448 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:05.449 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:05.449 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:05.449 [2024-07-25 06:27:18.935921] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:05.449 [2024-07-25 06:27:18.979075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 
00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 
accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:05.707 06:27:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:06.642 06:27:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:06.901 06:27:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:06.901 00:11:06.901 real 0m1.431s 00:11:06.901 user 0m1.253s 00:11:06.902 sys 0m0.185s 00:11:06.902 06:27:20 accel.accel_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:06.902 06:27:20 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:11:06.902 ************************************ 00:11:06.902 END TEST accel_decomp_full 00:11:06.902 ************************************ 00:11:06.902 06:27:20 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:06.902 06:27:20 accel -- 
common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:11:06.902 06:27:20 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:06.902 06:27:20 accel -- common/autotest_common.sh@10 -- # set +x 00:11:06.902 ************************************ 00:11:06.902 START TEST accel_decomp_mcore 00:11:06.902 ************************************ 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:11:06.902 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:11:06.902 [2024-07-25 06:27:20.317346] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:06.902 [2024-07-25 06:27:20.317416] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1063781 ] 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:06.902 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:06.902 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:06.902 [2024-07-25 06:27:20.450465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:07.162 [2024-07-25 06:27:20.497856] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:07.162 [2024-07-25 06:27:20.497952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:07.162 [2024-07-25 06:27:20.498061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:11:07.162 [2024-07-25 06:27:20.498065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 
00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var 
val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:07.162 06:27:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:08.544 06:27:21 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:08.544 00:11:08.544 real 0m1.432s 00:11:08.544 user 0m4.643s 00:11:08.544 sys 0m0.197s 00:11:08.545 06:27:21 accel.accel_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:08.545 06:27:21 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:11:08.545 ************************************ 00:11:08.545 END TEST accel_decomp_mcore 00:11:08.545 ************************************ 00:11:08.545 06:27:21 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:08.545 06:27:21 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:08.545 06:27:21 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:08.545 06:27:21 accel -- common/autotest_common.sh@10 -- # set +x 00:11:08.545 ************************************ 00:11:08.545 START TEST accel_decomp_full_mcore 00:11:08.545 ************************************ 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:11:08.545 06:27:21 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 
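Note: the accel_decomp_full_mcore run launched above differs from accel_decomp_mcore only by the -o 0 flag, and the buffer-size value in its xtrace changes from '4096 bytes' to '111250 bytes' accordingly. The echoed accel_perf command can also be run by hand; a minimal sketch, assuming an already-built SPDK tree at the workspace path shown and omitting the JSON accel config that the harness pipes in over /dev/fd/62:

    $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    $ ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -m 0xf

With -m 0xf the EAL is started with -c 0xf and one reactor comes up on each of cores 0-3, matching the "Total cores available: 4" notice in the startup output.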
00:11:08.545 [2024-07-25 06:27:21.829495] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:08.545 [2024-07-25 06:27:21.829560] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064068 ] 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:08.545 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:08.545 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:08.545 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:08.545 [2024-07-25 06:27:21.963194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:08.545 [2024-07-25 06:27:22.010430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:08.545 [2024-07-25 06:27:22.010526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:08.545 [2024-07-25 06:27:22.010631] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:11:08.545 [2024-07-25 06:27:22.010635] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.545 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:08.546 06:27:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:09.921 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:09.922 00:11:09.922 real 0m1.440s 00:11:09.922 user 0m4.677s 00:11:09.922 sys 0m0.201s 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:09.922 06:27:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:11:09.922 ************************************ 00:11:09.922 END TEST accel_decomp_full_mcore 00:11:09.922 ************************************ 00:11:09.922 06:27:23 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:09.922 06:27:23 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:11:09.922 06:27:23 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:09.922 06:27:23 accel -- common/autotest_common.sh@10 -- # set +x 00:11:09.922 ************************************ 00:11:09.922 START TEST accel_decomp_mthread 00:11:09.922 ************************************ 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 
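Note: the timing summaries make the core-mask difference easy to read. The single-core decompress runs finish with user time close to wall time (real 0m1.431s / user 0m1.253s for accel_decomp_full), while the 0xf runs burn several CPU-seconds inside the same ~1.4 s window. Taking accel_decomp_full_mcore above as a worked check: 4.677 s user / 1.440 s real ≈ 3.2 cores busy on average, consistent with the four reactors reported at startup sharing the decompress work in parallel (the average sits below 4 presumably because setup and teardown run on a single core).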
00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:11:09.922 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:11:09.922 [2024-07-25 06:27:23.355959] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:09.922 [2024-07-25 06:27:23.356026] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064353 ] 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 
0000:3f:01.1 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:09.922 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:09.922 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:10.181 [2024-07-25 06:27:23.491312] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:10.181 [2024-07-25 06:27:23.534633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:10.181 06:27:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:11.557 00:11:11.557 real 0m1.428s 00:11:11.557 user 0m1.242s 00:11:11.557 sys 0m0.191s 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:11.557 06:27:24 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:11:11.557 ************************************ 00:11:11.557 END TEST accel_decomp_mthread 00:11:11.557 ************************************ 00:11:11.557 06:27:24 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:11.557 06:27:24 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:11.557 06:27:24 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:11.557 06:27:24 accel -- common/autotest_common.sh@10 -- # set +x 00:11:11.557 ************************************ 00:11:11.557 START TEST accel_decomp_full_mthread 00:11:11.557 ************************************ 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:11:11.557 06:27:24 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
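Note: the mthread variants (accel_decomp_mthread above and accel_decomp_full_mthread starting here) keep a single core — the EAL is started with -c 0x1 and the xtrace sets val=0x1 — and instead pass -T 2 to accel_perf. As with the mcore sketch earlier, the echoed command can be reproduced by hand; a minimal sketch under the same assumptions (built tree at the workspace path, harness JSON config over /dev/fd/62 omitted):

    $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    $ ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2

accel_decomp_mthread just above completed with real 0m1.428s / user 0m1.242s, i.e. still roughly one core's worth of CPU time, as expected under a 0x1 core mask.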
00:11:11.557 [2024-07-25 06:27:24.868852] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:11.557 [2024-07-25 06:27:24.868910] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064636 ] 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:11.557 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.557 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:11.557 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.558 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:11.558 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.558 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:11.558 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.558 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:11.558 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:11.558 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:11.558 [2024-07-25 06:27:25.005967] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:11.558 [2024-07-25 06:27:25.050082] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 
00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:11.816 06:27:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 
06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:11:12.752 00:11:12.752 real 0m1.453s 00:11:12.752 user 0m1.276s 00:11:12.752 sys 0m0.185s 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:12.752 06:27:26 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:11:12.752 ************************************ 00:11:12.752 END TEST accel_decomp_full_mthread 00:11:12.752 ************************************ 00:11:13.011 06:27:26 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:11:13.011 06:27:26 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:11:13.011 06:27:26 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:11:13.011 06:27:26 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:11:13.011 06:27:26 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1064915 00:11:13.011 06:27:26 accel -- accel/accel.sh@63 -- # waitforlisten 1064915 00:11:13.011 06:27:26 accel -- common/autotest_common.sh@831 -- # '[' -z 1064915 ']' 00:11:13.011 06:27:26 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:13.011 06:27:26 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:11:13.011 06:27:26 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:13.011 06:27:26 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:13.011 06:27:26 accel -- accel/accel.sh@61 -- # build_accel_config 00:11:13.011 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:13.011 06:27:26 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:13.011 06:27:26 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:13.011 06:27:26 accel -- common/autotest_common.sh@10 -- # set +x 00:11:13.011 06:27:26 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:13.011 06:27:26 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:13.011 06:27:26 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:13.011 06:27:26 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:13.011 06:27:26 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:13.011 06:27:26 accel -- accel/accel.sh@40 -- # local IFS=, 00:11:13.011 06:27:26 accel -- accel/accel.sh@41 -- # jq -r . 00:11:13.011 [2024-07-25 06:27:26.397827] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
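The interesting part of this block is the switch from the pure-software accel path to the DPDK compressdev (QAT) path: because COMPRESSDEV=1, build_accel_config appends the compressdev_scan_accel_module entry, and that config is what the freshly started spdk_tgt (pid 1064915) is consuming via -c /dev/fd/63. A rough sketch of the check that follows, using scripts/rpc.py directly instead of the suite's rpc_cmd helper (that substitution is an assumption; the jq filters are the ones visible in the trace):

spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
rpc="$spdk/scripts/rpc.py"

# The accel config entry handed to spdk_tgt; pmd 0 appears to let the module pick a
# compressdev PMD on its own (QAT here, per the "PMD being used: compress_qat" notices later):
#   {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}

# 1) The saved config must still contain that entry ...
"$rpc" save_config \
  | jq -r '.subsystems[] | select(.subsystem=="accel").config[]' \
  | grep compressdev_scan_accel_module

# 2) ... and the compress/decompress opcodes should now be assigned to dpdk_compressdev,
#    with every other opcode left on the software module.
"$rpc" accel_get_opc_assignments | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'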
00:11:13.011 [2024-07-25 06:27:26.397889] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064915 ] 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:13.011 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:13.011 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:13.011 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:13.011 [2024-07-25 06:27:26.534756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:13.270 [2024-07-25 06:27:26.577736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:13.836 [2024-07-25 06:27:27.181478] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:13.836 06:27:27 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:13.836 06:27:27 accel -- common/autotest_common.sh@864 -- # return 0 00:11:13.836 06:27:27 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:11:13.836 06:27:27 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:11:13.836 06:27:27 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:11:13.836 06:27:27 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:11:13.836 06:27:27 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:11:13.836 06:27:27 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:11:13.836 06:27:27 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:13.836 06:27:27 accel -- common/autotest_common.sh@10 -- # set +x 00:11:13.836 06:27:27 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:11:13.836 06:27:27 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:14.095 "method": "compressdev_scan_accel_module", 00:11:14.095 06:27:27 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:11:14.095 06:27:27 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:11:14.095 06:27:27 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@10 -- # set +x 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 
00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # IFS== 00:11:14.095 06:27:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:11:14.095 06:27:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:11:14.095 06:27:27 accel -- accel/accel.sh@75 -- # killprocess 1064915 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@950 -- # '[' -z 1064915 ']' 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@954 -- # kill -0 1064915 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@955 -- # uname 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1064915 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1064915' 00:11:14.095 killing process with pid 1064915 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@969 -- # kill 1064915 00:11:14.095 06:27:27 accel -- common/autotest_common.sh@974 -- # wait 1064915 00:11:14.663 06:27:27 accel -- accel/accel.sh@76 -- # trap - ERR 00:11:14.663 06:27:27 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:14.663 06:27:27 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:11:14.663 06:27:27 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:14.663 06:27:27 accel -- common/autotest_common.sh@10 -- # set +x 00:11:14.663 ************************************ 00:11:14.663 START TEST accel_cdev_comp 00:11:14.663 ************************************ 00:11:14.663 06:27:27 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:14.663 06:27:27 accel.accel_cdev_comp -- 
accel/accel.sh@19 -- # read -r var val 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:11:14.663 06:27:27 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:11:14.663 [2024-07-25 06:27:28.004341] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:14.663 [2024-07-25 06:27:28.004393] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065200 ] 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.663 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:14.663 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:14.664 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:14.664 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:14.664 [2024-07-25 06:27:28.135725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:14.664 [2024-07-25 06:27:28.179036] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:15.231 [2024-07-25 06:27:28.785453] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:15.503 [2024-07-25 06:27:28.787803] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xdfab30 PMD being used: compress_qat 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 [2024-07-25 06:27:28.791520] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xdfdae0 PMD being used: compress_qat 00:11:15.503 06:27:28 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:15.503 06:27:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.438 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" 
in 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:11:16.439 06:27:29 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:16.439 00:11:16.439 real 0m1.951s 00:11:16.439 user 0m1.431s 00:11:16.439 sys 0m0.523s 00:11:16.439 06:27:29 accel.accel_cdev_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:16.439 06:27:29 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:11:16.439 ************************************ 00:11:16.439 END TEST accel_cdev_comp 00:11:16.439 ************************************ 00:11:16.439 06:27:29 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:16.439 06:27:29 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:11:16.439 06:27:29 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:16.439 06:27:29 accel -- common/autotest_common.sh@10 -- # set +x 00:11:16.698 ************************************ 00:11:16.698 START TEST accel_cdev_decomp 00:11:16.698 ************************************ 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:11:16.698 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@41 
-- # jq -r . 00:11:16.698 [2024-07-25 06:27:30.043243] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:16.698 [2024-07-25 06:27:30.043298] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065492 ] 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:16.698 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:16.698 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:16.698 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:16.698 [2024-07-25 06:27:30.174870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:16.698 [2024-07-25 06:27:30.218219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:17.634 [2024-07-25 06:27:30.830340] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:17.634 [2024-07-25 06:27:30.832724] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1bfbb30 PMD being used: compress_qat 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.634 [2024-07-25 06:27:30.836539] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1bfeae0 PMD being used: compress_qat 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.634 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case 
"$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:17.635 06:27:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:18.571 00:11:18.571 real 0m1.967s 00:11:18.571 user 0m1.441s 00:11:18.571 sys 
0m0.531s 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:18.571 06:27:31 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:11:18.571 ************************************ 00:11:18.571 END TEST accel_cdev_decomp 00:11:18.571 ************************************ 00:11:18.571 06:27:32 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:18.571 06:27:32 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:11:18.571 06:27:32 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:18.571 06:27:32 accel -- common/autotest_common.sh@10 -- # set +x 00:11:18.571 ************************************ 00:11:18.571 START TEST accel_cdev_decomp_full 00:11:18.571 ************************************ 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:18.571 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:11:18.572 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:11:18.572 [2024-07-25 06:27:32.099123] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
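The build_accel_config trace just above (accel/accel.sh@31 through @41) shows the harness collecting a single compressdev_scan_accel_module entry into accel_json_cfg, joining the entries with a comma IFS, pretty-printing the result with jq -r ., and handing it to accel_perf on /dev/fd/62, a path consistent with bash process substitution. The sketch below is a hedged reconstruction of that flow, not the literal accel.sh source; the outer JSON envelope and the helper wiring are assumptions made for illustration.

# Hedged sketch of the configuration path suggested by the trace above; the
# "subsystems" envelope is an assumption, while the method entry is copied
# verbatim from the accel/accel.sh@37 trace in this log.
accel_json_cfg=()
accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')

build_accel_config() {
    local IFS=,
    echo "{\"subsystems\": [{\"subsystem\": \"accel\", \"config\": [${accel_json_cfg[*]}]}]}" | jq -r .
}

# accel_perf then receives the generated config through process substitution,
# which is what the -c /dev/fd/62 argument in the traced command line reflects.
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
    -c <(build_accel_config) -t 1 -w decompress \
    -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0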
00:11:18.572 [2024-07-25 06:27:32.099192] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065804 ] 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:18.831 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:18.831 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:18.831 [2024-07-25 06:27:32.234410] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:18.831 [2024-07-25 06:27:32.279233] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:19.400 [2024-07-25 06:27:32.875280] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:19.400 [2024-07-25 06:27:32.877635] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb2fb30 PMD being used: compress_qat 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 [2024-07-25 06:27:32.880570] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb328f0 PMD being used: compress_qat 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:19.400 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:19.401 06:27:32 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.779 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:20.779 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:20.779 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:20.779 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 
accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:20.780 00:11:20.780 real 0m1.960s 00:11:20.780 user 0m1.454s 00:11:20.780 sys 0m0.511s 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:20.780 06:27:34 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:11:20.780 ************************************ 00:11:20.780 END TEST accel_cdev_decomp_full 00:11:20.780 ************************************ 00:11:20.780 06:27:34 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:20.780 06:27:34 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:11:20.780 06:27:34 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:20.780 06:27:34 accel -- common/autotest_common.sh@10 -- # set +x 00:11:20.780 ************************************ 00:11:20.780 START TEST accel_cdev_decomp_mcore 00:11:20.780 ************************************ 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:11:20.780 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:11:20.780 [2024-07-25 06:27:34.141268] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
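The -m 0xf passed to accel_perf for this mcore case selects CPU cores 0 through 3: the EAL parameters that follow carry the matching -c 0xf, the app reports "Total cores available: 4", and one reactor starts on each of cores 0, 1, 2 and 3. The snippet below is purely illustrative (it is not part of accel.sh) and simply expands such a hex mask into the core list it denotes.

# Illustrative only: expand a hex core mask such as the 0xf used above into the
# CPU cores it selects. Prints: mask 0xf selects cores: 0 1 2 3
mask=0xf
cores=()
for ((i = 0; i < 64; i++)); do
    if (( (mask >> i) & 1 )); then
        cores+=("$i")
    fi
done
echo "mask $mask selects cores: ${cores[*]}"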
00:11:20.780 [2024-07-25 06:27:34.141321] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1066315 ] 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:20.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:20.780 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:20.780 [2024-07-25 06:27:34.260752] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:20.780 [2024-07-25 06:27:34.307950] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:20.780 [2024-07-25 06:27:34.308043] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:20.780 [2024-07-25 06:27:34.308158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:11:20.780 [2024-07-25 06:27:34.308173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:21.346 [2024-07-25 06:27:34.900508] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:21.604 [2024-07-25 06:27:34.902878] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x229d0e0 PMD being used: compress_qat 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.604 [2024-07-25 06:27:34.907900] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fbb1419b8b0 PMD being used: compress_qat 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:11:21.604 [2024-07-25 06:27:34.908787] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fbb0c19b8b0 PMD being used: compress_qat 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore 
-- accel/accel.sh@21 -- # case "$var" in 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.604 [2024-07-25 06:27:34.909446] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22a01c0 PMD being used: compress_qat 00:11:21.604 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.605 [2024-07-25 06:27:34.909644] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fbb0419b8b0 PMD being used: compress_qat 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:21.605 06:27:34 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.539 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.539 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.539 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # 
IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:22.540 00:11:22.540 real 0m1.959s 00:11:22.540 user 0m6.459s 00:11:22.540 sys 0m0.536s 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:22.540 06:27:36 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:11:22.540 ************************************ 00:11:22.540 END TEST accel_cdev_decomp_mcore 00:11:22.540 ************************************ 00:11:22.799 06:27:36 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:22.799 06:27:36 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:22.799 06:27:36 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:22.799 06:27:36 accel -- common/autotest_common.sh@10 -- # set +x 00:11:22.799 ************************************ 00:11:22.799 START TEST accel_cdev_decomp_full_mcore 00:11:22.799 ************************************ 00:11:22.799 06:27:36 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 
00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:11:22.800 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:11:22.800 [2024-07-25 06:27:36.182448] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
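The long runs of IFS=:, read -r var val and case "$var" in records that fill this section are the xtrace of a small parsing loop in accel.sh (lines @19 through @23): accel_perf reports its settings as key:value pairs, the loop records the requested opcode and the module that serviced it, and the accel.sh@27 checks then assert that the decompress work really ran on dpdk_compressdev. The condensed sketch below is a hedged reconstruction; the case labels and the sample input are stand-ins, and only the variable names and the final checks appear in the trace itself.

#!/usr/bin/env bash
# Hedged reconstruction of the loop traced at accel/accel.sh@19-23 and the
# assertions at accel/accel.sh@27. The here-document stands in for the real
# accel_perf output stream, which is not reproduced in this log.
accel_opc=""
accel_module=""
while IFS=: read -r var val; do
    case "$var" in
        opc) accel_opc=$val ;;          # e.g. decompress
        module) accel_module=$val ;;    # e.g. dpdk_compressdev
    esac
done <<'EOF'
opc:decompress
module:dpdk_compressdev
EOF

[[ -n $accel_module ]]
[[ -n $accel_opc ]]
[[ $accel_module == dpdk_compressdev ]] && echo "decompress was serviced by dpdk_compressdev"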
00:11:22.800 [2024-07-25 06:27:36.182511] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1066607 ] 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:22.800 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:22.800 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:22.800 [2024-07-25 06:27:36.318412] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:23.059 [2024-07-25 06:27:36.366807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:23.059 [2024-07-25 06:27:36.366902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:23.059 [2024-07-25 06:27:36.366996] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:11:23.059 [2024-07-25 06:27:36.366999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:23.628 [2024-07-25 06:27:36.966232] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:23.628 [2024-07-25 06:27:36.968626] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16580e0 PMD being used: compress_qat 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.628 [2024-07-25 06:27:36.972819] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f586419b8b0 PMD being used: compress_qat 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:11:23.628 [2024-07-25 06:27:36.973683] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f585c19b8b0 PMD being used: 
compress_qat 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.628 [2024-07-25 06:27:36.974387] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1658770 PMD being used: compress_qat 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.628 [2024-07-25 06:27:36.974571] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f585419b8b0 PMD being used: compress_qat 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.628 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:23.629 06:27:36 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.566 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.566 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.566 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.566 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.566 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:24.826 00:11:24.826 real 0m1.983s 00:11:24.826 user 0m6.510s 00:11:24.826 sys 0m0.526s 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:24.826 06:27:38 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:11:24.826 ************************************ 00:11:24.826 END TEST accel_cdev_decomp_full_mcore 00:11:24.826 ************************************ 00:11:24.826 06:27:38 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:24.826 06:27:38 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:11:24.826 06:27:38 accel -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:11:24.826 06:27:38 accel -- common/autotest_common.sh@10 -- # set +x 00:11:24.826 ************************************ 00:11:24.826 START TEST accel_cdev_decomp_mthread 00:11:24.826 ************************************ 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:11:24.826 06:27:38 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:11:24.826 [2024-07-25 06:27:38.247605] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:24.826 [2024-07-25 06:27:38.247664] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1066898 ] 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:24.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.826 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:24.827 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.827 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:24.827 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:24.827 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:25.085 [2024-07-25 06:27:38.382215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:25.085 [2024-07-25 06:27:38.426216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.653 [2024-07-25 06:27:39.029241] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:25.653 [2024-07-25 06:27:39.031622] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x157bb30 PMD being used: compress_qat 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 [2024-07-25 06:27:39.036088] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x157eae0 PMD being used: compress_qat 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 [2024-07-25 06:27:39.038433] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1583d90 PMD being used: 
compress_qat 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread 
-- accel/accel.sh@20 -- # val=2 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:25.653 06:27:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:27.029 00:11:27.029 real 0m1.967s 00:11:27.029 user 0m1.435s 00:11:27.029 sys 0m0.538s 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:27.029 06:27:40 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:11:27.029 ************************************ 00:11:27.029 END TEST accel_cdev_decomp_mthread 00:11:27.029 ************************************ 00:11:27.029 06:27:40 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:27.029 06:27:40 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:11:27.029 06:27:40 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:27.029 06:27:40 accel -- common/autotest_common.sh@10 -- # set +x 00:11:27.029 ************************************ 00:11:27.029 START TEST accel_cdev_decomp_full_mthread 00:11:27.029 ************************************ 00:11:27.029 06:27:40 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:27.029 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:11:27.029 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:11:27.029 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.029 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:27.030 06:27:40 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:11:27.030 06:27:40 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:11:27.030 [2024-07-25 06:27:40.293167] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:27.030 [2024-07-25 06:27:40.293222] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1067391 ] 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested 
device 0000:3f:01.1 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:27.030 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:27.030 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:27.030 [2024-07-25 06:27:40.426588] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:27.030 [2024-07-25 06:27:40.470013] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:27.598 [2024-07-25 06:27:41.072928] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:11:27.598 [2024-07-25 06:27:41.075336] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ed5b30 PMD being used: compress_qat 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 [2024-07-25 06:27:41.078889] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ed6240 PMD being used: compress_qat 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.598 [2024-07-25 06:27:41.081328] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1edaad0 PMD being used: compress_qat 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.598 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case 
"$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:27.599 06:27:41 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.976 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:11:28.977 00:11:28.977 real 0m1.964s 00:11:28.977 user 0m1.433s 00:11:28.977 sys 0m0.538s 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:28.977 06:27:42 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:11:28.977 ************************************ 00:11:28.977 END TEST accel_cdev_decomp_full_mthread 00:11:28.977 ************************************ 00:11:28.977 06:27:42 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:11:28.977 06:27:42 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:11:28.977 06:27:42 accel -- accel/accel.sh@137 -- # build_accel_config 00:11:28.977 06:27:42 accel -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:11:28.977 06:27:42 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:11:28.977 06:27:42 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:28.977 06:27:42 accel -- common/autotest_common.sh@10 -- # set +x 00:11:28.977 06:27:42 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:11:28.977 
06:27:42 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:11:28.977 06:27:42 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:11:28.977 06:27:42 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:11:28.977 06:27:42 accel -- accel/accel.sh@40 -- # local IFS=, 00:11:28.977 06:27:42 accel -- accel/accel.sh@41 -- # jq -r . 00:11:28.977 ************************************ 00:11:28.977 START TEST accel_dif_functional_tests 00:11:28.977 ************************************ 00:11:28.977 06:27:42 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:11:28.977 [2024-07-25 06:27:42.370114] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:28.977 [2024-07-25 06:27:42.370184] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1067718 ] 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: 
Requested device 0000:3f:01.1 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:28.977 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:28.977 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:28.977 [2024-07-25 06:27:42.507322] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:11:29.236 [2024-07-25 06:27:42.553065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:29.236 [2024-07-25 06:27:42.553171] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:29.236 [2024-07-25 06:27:42.553190] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.236 00:11:29.236 00:11:29.236 CUnit - A unit testing framework for C - Version 2.1-3 00:11:29.236 http://cunit.sourceforge.net/ 00:11:29.236 00:11:29.236 00:11:29.236 Suite: accel_dif 00:11:29.236 Test: verify: DIF generated, GUARD check ...passed 00:11:29.236 Test: verify: DIF generated, APPTAG check ...passed 00:11:29.236 Test: verify: DIF generated, REFTAG check ...passed 00:11:29.236 Test: verify: DIF not generated, GUARD check ...[2024-07-25 06:27:42.630731] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:11:29.236 passed 00:11:29.236 Test: verify: DIF not generated, APPTAG check ...[2024-07-25 06:27:42.630795] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:11:29.236 passed 00:11:29.236 Test: verify: DIF not generated, REFTAG check ...[2024-07-25 06:27:42.630831] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:11:29.236 passed 00:11:29.236 Test: verify: APPTAG correct, APPTAG check ...passed 00:11:29.236 Test: verify: APPTAG 
incorrect, APPTAG check ...[2024-07-25 06:27:42.630894] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:11:29.236 passed 00:11:29.236 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:11:29.236 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:11:29.236 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:11:29.236 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-25 06:27:42.631035] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:11:29.236 passed 00:11:29.236 Test: verify copy: DIF generated, GUARD check ...passed 00:11:29.236 Test: verify copy: DIF generated, APPTAG check ...passed 00:11:29.236 Test: verify copy: DIF generated, REFTAG check ...passed 00:11:29.236 Test: verify copy: DIF not generated, GUARD check ...[2024-07-25 06:27:42.631189] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:11:29.236 passed 00:11:29.236 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-25 06:27:42.631221] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:11:29.236 passed 00:11:29.236 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-25 06:27:42.631252] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:11:29.236 passed 00:11:29.236 Test: generate copy: DIF generated, GUARD check ...passed 00:11:29.236 Test: generate copy: DIF generated, APTTAG check ...passed 00:11:29.236 Test: generate copy: DIF generated, REFTAG check ...passed 00:11:29.236 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:11:29.236 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:11:29.236 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:11:29.236 Test: generate copy: iovecs-len validate ...[2024-07-25 06:27:42.631480] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:11:29.236 passed 00:11:29.236 Test: generate copy: buffer alignment validate ...passed 00:11:29.236 00:11:29.236 Run Summary: Type Total Ran Passed Failed Inactive 00:11:29.236 suites 1 1 n/a 0 0 00:11:29.236 tests 26 26 26 0 0 00:11:29.236 asserts 115 115 115 0 n/a 00:11:29.236 00:11:29.236 Elapsed time = 0.002 seconds 00:11:29.495 00:11:29.495 real 0m0.490s 00:11:29.495 user 0m0.600s 00:11:29.495 sys 0m0.217s 00:11:29.495 06:27:42 accel.accel_dif_functional_tests -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:29.495 06:27:42 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:11:29.495 ************************************ 00:11:29.495 END TEST accel_dif_functional_tests 00:11:29.495 ************************************ 00:11:29.495 00:11:29.495 real 0m49.614s 00:11:29.495 user 0m57.375s 00:11:29.495 sys 0m11.418s 00:11:29.495 06:27:42 accel -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:29.495 06:27:42 accel -- common/autotest_common.sh@10 -- # set +x 00:11:29.495 ************************************ 00:11:29.495 END TEST accel 00:11:29.495 ************************************ 00:11:29.495 06:27:42 -- spdk/autotest.sh@186 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:11:29.495 06:27:42 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:11:29.495 06:27:42 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:29.495 06:27:42 -- common/autotest_common.sh@10 -- # set +x 00:11:29.495 ************************************ 00:11:29.495 START TEST accel_rpc 00:11:29.495 ************************************ 00:11:29.495 06:27:42 accel_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:11:29.495 * Looking for test storage... 00:11:29.495 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:11:29.495 06:27:43 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:11:29.495 06:27:43 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1067824 00:11:29.495 06:27:43 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1067824 00:11:29.495 06:27:43 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:11:29.495 06:27:43 accel_rpc -- common/autotest_common.sh@831 -- # '[' -z 1067824 ']' 00:11:29.495 06:27:43 accel_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:29.495 06:27:43 accel_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:29.495 06:27:43 accel_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:29.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:29.495 06:27:43 accel_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:29.495 06:27:43 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:29.754 [2024-07-25 06:27:43.109346] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:29.754 [2024-07-25 06:27:43.109411] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1067824 ] 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:29.754 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.754 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:29.755 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:29.755 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:29.755 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:29.755 [2024-07-25 06:27:43.246015] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:29.755 [2024-07-25 06:27:43.291127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:30.689 06:27:43 accel_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:30.689 06:27:43 accel_rpc -- common/autotest_common.sh@864 -- # return 0 00:11:30.689 06:27:43 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:11:30.689 06:27:43 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:11:30.689 06:27:43 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:11:30.689 06:27:43 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:11:30.689 06:27:43 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:11:30.689 06:27:43 accel_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:11:30.689 06:27:43 accel_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:30.689 06:27:43 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:30.689 ************************************ 00:11:30.689 START TEST accel_assign_opcode 00:11:30.689 ************************************ 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # accel_assign_opcode_test_suite 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:11:30.689 [2024-07-25 06:27:44.037596] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:11:30.689 [2024-07-25 06:27:44.045611] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module software 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:11:30.689 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:30.947 software 00:11:30.947 00:11:30.947 real 0m0.253s 00:11:30.947 user 0m0.046s 00:11:30.947 sys 0m0.016s 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:30.947 06:27:44 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:11:30.947 ************************************ 00:11:30.947 END TEST accel_assign_opcode 00:11:30.947 ************************************ 00:11:30.947 06:27:44 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1067824 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@950 -- # '[' -z 1067824 ']' 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@954 -- # kill -0 1067824 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@955 -- # uname 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1067824 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1067824' 00:11:30.947 killing process with pid 1067824 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@969 -- # kill 1067824 00:11:30.947 06:27:44 accel_rpc -- common/autotest_common.sh@974 -- # wait 1067824 00:11:31.206 00:11:31.206 real 0m1.754s 00:11:31.206 user 0m1.800s 00:11:31.206 sys 0m0.570s 00:11:31.206 06:27:44 accel_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:31.206 06:27:44 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:31.206 ************************************ 00:11:31.206 END TEST accel_rpc 00:11:31.206 ************************************ 00:11:31.206 06:27:44 -- spdk/autotest.sh@189 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:11:31.206 06:27:44 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:11:31.206 06:27:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:31.206 06:27:44 -- common/autotest_common.sh@10 -- # set +x 00:11:31.465 ************************************ 00:11:31.465 START TEST app_cmdline 00:11:31.465 
************************************ 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:11:31.465 * Looking for test storage... 00:11:31.465 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:11:31.465 06:27:44 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:11:31.465 06:27:44 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1068284 00:11:31.465 06:27:44 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1068284 00:11:31.465 06:27:44 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 1068284 ']' 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:31.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:31.465 06:27:44 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:11:31.465 [2024-07-25 06:27:44.949364] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:31.465 [2024-07-25 06:27:44.949434] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1068284 ] 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 
0000:3d:02.3 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:31.724 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:31.724 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:31.724 [2024-07-25 06:27:45.088255] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:31.724 [2024-07-25 06:27:45.133623] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:32.292 06:27:45 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:32.292 06:27:45 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:11:32.292 06:27:45 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:11:32.551 { 00:11:32.551 "version": "SPDK v24.09-pre git sha1 d005e023b", 00:11:32.551 "fields": { 00:11:32.551 "major": 24, 00:11:32.551 "minor": 9, 00:11:32.551 "patch": 0, 00:11:32.551 
"suffix": "-pre", 00:11:32.551 "commit": "d005e023b" 00:11:32.551 } 00:11:32.551 } 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@26 -- # sort 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:11:32.551 06:27:46 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:32.551 06:27:46 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:11:32.810 request: 00:11:32.810 { 00:11:32.810 "method": "env_dpdk_get_mem_stats", 00:11:32.810 "req_id": 1 00:11:32.810 } 00:11:32.810 Got JSON-RPC error response 00:11:32.810 response: 00:11:32.810 { 00:11:32.810 "code": -32601, 00:11:32.810 "message": "Method not found" 00:11:32.810 } 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:11:32.810 06:27:46 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1068284 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 
1068284 ']' 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 1068284 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:32.810 06:27:46 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1068284 00:11:33.069 06:27:46 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:33.069 06:27:46 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:33.069 06:27:46 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1068284' 00:11:33.069 killing process with pid 1068284 00:11:33.069 06:27:46 app_cmdline -- common/autotest_common.sh@969 -- # kill 1068284 00:11:33.069 06:27:46 app_cmdline -- common/autotest_common.sh@974 -- # wait 1068284 00:11:33.328 00:11:33.328 real 0m1.937s 00:11:33.328 user 0m2.321s 00:11:33.328 sys 0m0.598s 00:11:33.328 06:27:46 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:33.328 06:27:46 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:11:33.328 ************************************ 00:11:33.328 END TEST app_cmdline 00:11:33.328 ************************************ 00:11:33.328 06:27:46 -- spdk/autotest.sh@190 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:11:33.328 06:27:46 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:11:33.328 06:27:46 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:33.328 06:27:46 -- common/autotest_common.sh@10 -- # set +x 00:11:33.328 ************************************ 00:11:33.328 START TEST version 00:11:33.328 ************************************ 00:11:33.328 06:27:46 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:11:33.589 * Looking for test storage... 
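For reference, the RPC whitelist behaviour exercised by the app_cmdline test above (spdk_tgt launched with --rpcs-allowed spdk_get_version,rpc_get_methods) can be reproduced by hand. The commands below are an illustrative sketch only, not part of the captured run; they assume a target already listening on /var/tmp/spdk.sock and the same workspace layout:

    # list the exposed methods; with the whitelist above this should print exactly
    # rpc_get_methods and spdk_get_version
    ./scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort
    # allowed method: returns the version JSON seen in the log
    ./scripts/rpc.py spdk_get_version
    # non-whitelisted method: fails with JSON-RPC error -32601 "Method not found"
    ./scripts/rpc.py env_dpdk_get_mem_stats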
00:11:33.589 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:11:33.589 06:27:46 version -- app/version.sh@17 -- # get_header_version major 00:11:33.589 06:27:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # cut -f2 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # tr -d '"' 00:11:33.589 06:27:46 version -- app/version.sh@17 -- # major=24 00:11:33.589 06:27:46 version -- app/version.sh@18 -- # get_header_version minor 00:11:33.589 06:27:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # cut -f2 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # tr -d '"' 00:11:33.589 06:27:46 version -- app/version.sh@18 -- # minor=9 00:11:33.589 06:27:46 version -- app/version.sh@19 -- # get_header_version patch 00:11:33.589 06:27:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # cut -f2 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # tr -d '"' 00:11:33.589 06:27:46 version -- app/version.sh@19 -- # patch=0 00:11:33.589 06:27:46 version -- app/version.sh@20 -- # get_header_version suffix 00:11:33.589 06:27:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # cut -f2 00:11:33.589 06:27:46 version -- app/version.sh@14 -- # tr -d '"' 00:11:33.589 06:27:46 version -- app/version.sh@20 -- # suffix=-pre 00:11:33.589 06:27:46 version -- app/version.sh@22 -- # version=24.9 00:11:33.589 06:27:46 version -- app/version.sh@25 -- # (( patch != 0 )) 00:11:33.589 06:27:46 version -- app/version.sh@28 -- # version=24.9rc0 00:11:33.589 06:27:46 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:11:33.589 06:27:46 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:11:33.589 06:27:46 version -- app/version.sh@30 -- # py_version=24.9rc0 00:11:33.589 06:27:46 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:11:33.589 00:11:33.589 real 0m0.193s 00:11:33.589 user 0m0.095s 00:11:33.589 sys 0m0.149s 00:11:33.589 06:27:46 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:33.589 06:27:46 version -- common/autotest_common.sh@10 -- # set +x 00:11:33.589 ************************************ 00:11:33.589 END TEST version 00:11:33.589 ************************************ 00:11:33.589 06:27:47 -- spdk/autotest.sh@192 -- # '[' 1 -eq 1 ']' 00:11:33.589 06:27:47 -- spdk/autotest.sh@193 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:11:33.589 06:27:47 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:11:33.589 06:27:47 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:33.589 06:27:47 -- 
common/autotest_common.sh@10 -- # set +x 00:11:33.589 ************************************ 00:11:33.589 START TEST blockdev_general 00:11:33.589 ************************************ 00:11:33.589 06:27:47 blockdev_general -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:11:33.897 * Looking for test storage... 00:11:33.897 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:33.897 06:27:47 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1068775 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:11:33.897 06:27:47 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1068775 00:11:33.897 06:27:47 blockdev_general -- common/autotest_common.sh@831 -- # '[' -z 1068775 ']' 00:11:33.897 06:27:47 blockdev_general -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:33.897 06:27:47 blockdev_general -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:33.897 06:27:47 blockdev_general -- 
common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:33.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:33.897 06:27:47 blockdev_general -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:33.897 06:27:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:33.897 [2024-07-25 06:27:47.251207] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:33.897 [2024-07-25 06:27:47.251266] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1068775 ] 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached 
maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.897 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:33.897 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.898 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:33.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.898 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:33.898 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:33.898 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:33.898 [2024-07-25 06:27:47.388544] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:33.898 [2024-07-25 06:27:47.431981] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.835 06:27:48 blockdev_general -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:34.835 06:27:48 blockdev_general -- common/autotest_common.sh@864 -- # return 0 00:11:34.835 06:27:48 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:11:34.835 06:27:48 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:11:34.835 06:27:48 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:11:34.835 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:34.835 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:34.835 [2024-07-25 06:27:48.359957] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:34.835 [2024-07-25 06:27:48.360011] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:34.835 00:11:34.835 [2024-07-25 06:27:48.367936] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:34.835 [2024-07-25 06:27:48.367961] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:34.835 00:11:34.835 Malloc0 00:11:35.094 Malloc1 00:11:35.094 Malloc2 00:11:35.094 Malloc3 00:11:35.094 Malloc4 00:11:35.094 Malloc5 00:11:35.094 Malloc6 00:11:35.094 Malloc7 00:11:35.094 Malloc8 00:11:35.094 Malloc9 00:11:35.094 [2024-07-25 06:27:48.502844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:35.094 [2024-07-25 06:27:48.502890] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:11:35.094 [2024-07-25 06:27:48.502908] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9d4a10 00:11:35.094 [2024-07-25 06:27:48.502920] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:35.094 [2024-07-25 06:27:48.504147] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:35.094 [2024-07-25 06:27:48.504173] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:11:35.094 TestPT 00:11:35.094 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.094 06:27:48 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:11:35.094 5000+0 records in 00:11:35.094 5000+0 records out 00:11:35.094 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0243192 s, 421 MB/s 00:11:35.094 06:27:48 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:35.095 AIO0 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.095 06:27:48 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.095 06:27:48 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:11:35.095 06:27:48 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.095 06:27:48 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.095 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.353 06:27:48 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.353 06:27:48 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:11:35.353 06:27:48 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:11:35.353 06:27:48 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:35.353 06:27:48 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.614 06:27:48 blockdev_general -- bdev/blockdev.sh@748 -- # 
mapfile -t bdevs_name 00:11:35.614 06:27:48 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r .name 00:11:35.615 06:27:48 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "338b91d5-bbb4-4bfd-9c0b-9e616b856977"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "338b91d5-bbb4-4bfd-9c0b-9e616b856977",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ba827301-7bb7-50cd-a869-e535bc7f38ac"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ba827301-7bb7-50cd-a869-e535bc7f38ac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "015171f9-9c93-50bc-8d41-b8f5f4d3f310"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "015171f9-9c93-50bc-8d41-b8f5f4d3f310",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "0b91056f-0745-523b-ad1e-c775be7cdfc4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0b91056f-0745-523b-ad1e-c775be7cdfc4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "7859a930-eb52-5db1-8e5d-3a7bf7412197"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7859a930-eb52-5db1-8e5d-3a7bf7412197",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "681189be-89a4-5933-91d0-800d58ed2e4c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "681189be-89a4-5933-91d0-800d58ed2e4c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "405082da-e865-5482-a281-de2c65fdf73b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "405082da-e865-5482-a281-de2c65fdf73b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' 
}' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "9f6c1041-2e25-552d-be66-0e27f5e03785"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9f6c1041-2e25-552d-be66-0e27f5e03785",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "3c8a065c-0683-589e-8cbc-6b884c6bc769"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3c8a065c-0683-589e-8cbc-6b884c6bc769",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "67e03149-8029-5c88-b6e6-1da2182d2d96"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67e03149-8029-5c88-b6e6-1da2182d2d96",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f63574e3-b425-5ad5-98b0-bea75a97252d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f63574e3-b425-5ad5-98b0-bea75a97252d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "cddf302a-7a8a-56a7-b023-6509cdf80886"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cddf302a-7a8a-56a7-b023-6509cdf80886",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "89ce9af9-561c-49e7-aca9-9fbbb191be80"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "89ce9af9-561c-49e7-aca9-9fbbb191be80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "89ce9af9-561c-49e7-aca9-9fbbb191be80",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "43257330-dde0-4dc5-85b6-35d151195ad7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "9d463d02-a610-4aec-b938-f7f1abd74b2b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "59443b36-1612-4d13-a201-d1fc3f25d882"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "59443b36-1612-4d13-a201-d1fc3f25d882",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "59443b36-1612-4d13-a201-d1fc3f25d882",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "900b82ef-32b6-4277-85ae-4f5ba50121d8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "120be681-7867-4763-ae92-ad071d53a99d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "a81b42bc-e16a-4331-aa98-a1823ca77013"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "a81b42bc-e16a-4331-aa98-a1823ca77013",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a81b42bc-e16a-4331-aa98-a1823ca77013",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "1e039826-4eae-4772-9258-8b49162586bd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "f90e272f-2ac8-456d-9a9a-1c910abcd216",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "012db33f-d156-4d62-9701-2c720581426b"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' 
' "num_blocks": 5000,' ' "uuid": "012db33f-d156-4d62-9701-2c720581426b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:11:35.615 06:27:48 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:11:35.615 06:27:48 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:11:35.615 06:27:48 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:11:35.615 06:27:48 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 1068775 00:11:35.615 06:27:48 blockdev_general -- common/autotest_common.sh@950 -- # '[' -z 1068775 ']' 00:11:35.615 06:27:48 blockdev_general -- common/autotest_common.sh@954 -- # kill -0 1068775 00:11:35.615 06:27:48 blockdev_general -- common/autotest_common.sh@955 -- # uname 00:11:35.615 06:27:48 blockdev_general -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:35.615 06:27:48 blockdev_general -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1068775 00:11:35.615 06:27:49 blockdev_general -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:35.615 06:27:49 blockdev_general -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:35.615 06:27:49 blockdev_general -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1068775' 00:11:35.615 killing process with pid 1068775 00:11:35.615 06:27:49 blockdev_general -- common/autotest_common.sh@969 -- # kill 1068775 00:11:35.615 06:27:49 blockdev_general -- common/autotest_common.sh@974 -- # wait 1068775 00:11:35.874 06:27:49 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:11:35.875 06:27:49 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:11:35.875 06:27:49 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:11:35.875 06:27:49 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:35.875 06:27:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:36.134 ************************************ 00:11:36.134 START TEST bdev_hello_world 00:11:36.134 ************************************ 00:11:36.134 06:27:49 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:11:36.134 [2024-07-25 06:27:49.514588] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:11:36.134 [2024-07-25 06:27:49.514648] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1069169 ] 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:36.134 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:36.134 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:36.134 [2024-07-25 06:27:49.649428] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:36.393 [2024-07-25 06:27:49.693179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:36.393 [2024-07-25 06:27:49.841288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:36.393 [2024-07-25 06:27:49.841335] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:11:36.393 [2024-07-25 06:27:49.841349] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:36.393 [2024-07-25 06:27:49.849293] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:36.393 [2024-07-25 06:27:49.849318] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:36.393 [2024-07-25 06:27:49.857306] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:36.393 [2024-07-25 06:27:49.857328] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:36.393 [2024-07-25 06:27:49.928357] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:36.393 [2024-07-25 06:27:49.928405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.393 [2024-07-25 06:27:49.928420] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd294b0 00:11:36.393 [2024-07-25 06:27:49.928432] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.393 [2024-07-25 06:27:49.929684] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.393 [2024-07-25 06:27:49.929712] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:11:36.652 [2024-07-25 06:27:50.077503] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:11:36.652 [2024-07-25 06:27:50.077562] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:11:36.652 [2024-07-25 06:27:50.077612] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:11:36.652 [2024-07-25 06:27:50.077677] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:11:36.652 
[2024-07-25 06:27:50.077743] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:11:36.652 [2024-07-25 06:27:50.077773] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:11:36.652 [2024-07-25 06:27:50.077834] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:11:36.652 00:11:36.652 [2024-07-25 06:27:50.077873] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:11:36.912 00:11:36.912 real 0m0.883s 00:11:36.912 user 0m0.536s 00:11:36.912 sys 0m0.297s 00:11:36.912 06:27:50 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:36.912 06:27:50 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:11:36.912 ************************************ 00:11:36.912 END TEST bdev_hello_world 00:11:36.912 ************************************ 00:11:36.912 06:27:50 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:11:36.912 06:27:50 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:36.912 06:27:50 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:36.912 06:27:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:36.912 ************************************ 00:11:36.912 START TEST bdev_bounds 00:11:36.912 ************************************ 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1069349 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1069349' 00:11:36.912 Process bdevio pid: 1069349 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1069349 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1069349 ']' 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:36.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:36.912 06:27:50 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:11:37.171 [2024-07-25 06:27:50.487601] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
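Note: bdevio is started above in wait mode (-w), so it only loads the bdevs from bdev.json and then blocks until an RPC tells it to run; the tests.py perform_tests call further below is what actually kicks off the CUnit suites. A rough sketch of the same two-step flow, with paths shortened for readability (assumed, not the exact autotest paths):

  # step 1: start bdevio as an RPC server and have it wait for a trigger
  ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &

  # step 2: once it is listening, fire the test suites over RPC
  ./test/bdev/bdevio/tests.py perform_tests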
00:11:37.171 [2024-07-25 06:27:50.487662] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1069349 ] 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.4 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.6 cannot be used 
00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:37.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:37.171 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:37.171 [2024-07-25 06:27:50.625299] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:11:37.171 [2024-07-25 06:27:50.671514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:37.171 [2024-07-25 06:27:50.671608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:37.171 [2024-07-25 06:27:50.671613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:37.430 [2024-07-25 06:27:50.807429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:37.430 [2024-07-25 06:27:50.807483] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:11:37.430 [2024-07-25 06:27:50.807498] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:37.430 [2024-07-25 06:27:50.815443] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:37.430 [2024-07-25 06:27:50.815469] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:37.430 [2024-07-25 06:27:50.823456] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:37.430 [2024-07-25 06:27:50.823479] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:37.430 [2024-07-25 06:27:50.894941] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:37.430 [2024-07-25 06:27:50.894990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:37.430 [2024-07-25 06:27:50.895005] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c63fb0 00:11:37.430 [2024-07-25 06:27:50.895017] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:37.430 [2024-07-25 06:27:50.896302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:37.430 [2024-07-25 06:27:50.896331] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:11:37.998 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:37.998 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:11:37.998 06:27:51 
blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:11:37.998 I/O targets: 00:11:37.998 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:11:37.998 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:11:37.998 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:11:37.998 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:11:37.998 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:11:37.999 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:11:37.999 raid0: 131072 blocks of 512 bytes (64 MiB) 00:11:37.999 concat0: 131072 blocks of 512 bytes (64 MiB) 00:11:37.999 raid1: 65536 blocks of 512 bytes (32 MiB) 00:11:37.999 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:11:37.999 00:11:37.999 00:11:37.999 CUnit - A unit testing framework for C - Version 2.1-3 00:11:37.999 http://cunit.sourceforge.net/ 00:11:37.999 00:11:37.999 00:11:37.999 Suite: bdevio tests on: AIO0 00:11:37.999 Test: blockdev write read block ...passed 00:11:37.999 Test: blockdev write zeroes read block ...passed 00:11:37.999 Test: blockdev write zeroes read no split ...passed 00:11:37.999 Test: blockdev write zeroes read split ...passed 00:11:37.999 Test: blockdev write zeroes read split partial ...passed 00:11:37.999 Test: blockdev reset ...passed 00:11:37.999 Test: blockdev write read 8 blocks ...passed 00:11:37.999 Test: blockdev write read size > 128k ...passed 00:11:37.999 Test: blockdev write read invalid size ...passed 00:11:37.999 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:37.999 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:37.999 Test: blockdev write read max offset ...passed 00:11:37.999 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:37.999 Test: blockdev writev readv 8 blocks ...passed 00:11:37.999 Test: blockdev writev readv 30 x 1block ...passed 00:11:37.999 Test: blockdev writev readv block ...passed 00:11:37.999 Test: blockdev writev readv size > 128k ...passed 00:11:37.999 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:37.999 Test: blockdev comparev and writev ...passed 00:11:37.999 Test: blockdev nvme passthru rw ...passed 00:11:37.999 Test: blockdev nvme passthru vendor specific ...passed 00:11:37.999 Test: blockdev nvme admin passthru ...passed 00:11:37.999 Test: blockdev copy ...passed 00:11:37.999 Suite: bdevio tests on: raid1 00:11:37.999 Test: blockdev write read block ...passed 00:11:37.999 Test: blockdev write zeroes read block ...passed 00:11:37.999 Test: blockdev write zeroes read no split ...passed 00:11:37.999 Test: blockdev write zeroes read split ...passed 00:11:37.999 Test: blockdev write zeroes read split partial ...passed 00:11:37.999 Test: blockdev reset ...passed 00:11:37.999 Test: blockdev write read 8 blocks ...passed 00:11:37.999 Test: blockdev write read size > 128k ...passed 00:11:37.999 Test: blockdev write read invalid size ...passed 00:11:37.999 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:37.999 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:37.999 Test: blockdev write read max offset ...passed 
00:11:37.999 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:37.999 Test: blockdev writev readv 8 blocks ...passed 00:11:37.999 Test: blockdev writev readv 30 x 1block ...passed 00:11:37.999 Test: blockdev writev readv block ...passed 00:11:37.999 Test: blockdev writev readv size > 128k ...passed 00:11:37.999 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:37.999 Test: blockdev comparev and writev ...passed 00:11:37.999 Test: blockdev nvme passthru rw ...passed 00:11:37.999 Test: blockdev nvme passthru vendor specific ...passed 00:11:37.999 Test: blockdev nvme admin passthru ...passed 00:11:37.999 Test: blockdev copy ...passed 00:11:37.999 Suite: bdevio tests on: concat0 00:11:37.999 Test: blockdev write read block ...passed 00:11:37.999 Test: blockdev write zeroes read block ...passed 00:11:37.999 Test: blockdev write zeroes read no split ...passed 00:11:37.999 Test: blockdev write zeroes read split ...passed 00:11:37.999 Test: blockdev write zeroes read split partial ...passed 00:11:37.999 Test: blockdev reset ...passed 00:11:37.999 Test: blockdev write read 8 blocks ...passed 00:11:37.999 Test: blockdev write read size > 128k ...passed 00:11:37.999 Test: blockdev write read invalid size ...passed 00:11:37.999 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:37.999 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:37.999 Test: blockdev write read max offset ...passed 00:11:37.999 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:37.999 Test: blockdev writev readv 8 blocks ...passed 00:11:37.999 Test: blockdev writev readv 30 x 1block ...passed 00:11:37.999 Test: blockdev writev readv block ...passed 00:11:37.999 Test: blockdev writev readv size > 128k ...passed 00:11:37.999 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:37.999 Test: blockdev comparev and writev ...passed 00:11:37.999 Test: blockdev nvme passthru rw ...passed 00:11:37.999 Test: blockdev nvme passthru vendor specific ...passed 00:11:37.999 Test: blockdev nvme admin passthru ...passed 00:11:37.999 Test: blockdev copy ...passed 00:11:37.999 Suite: bdevio tests on: raid0 00:11:37.999 Test: blockdev write read block ...passed 00:11:37.999 Test: blockdev write zeroes read block ...passed 00:11:37.999 Test: blockdev write zeroes read no split ...passed 00:11:37.999 Test: blockdev write zeroes read split ...passed 00:11:37.999 Test: blockdev write zeroes read split partial ...passed 00:11:37.999 Test: blockdev reset ...passed 00:11:37.999 Test: blockdev write read 8 blocks ...passed 00:11:37.999 Test: blockdev write read size > 128k ...passed 00:11:37.999 Test: blockdev write read invalid size ...passed 00:11:37.999 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:37.999 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:37.999 Test: blockdev write read max offset ...passed 00:11:37.999 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:37.999 Test: blockdev writev readv 8 blocks ...passed 00:11:37.999 Test: blockdev writev readv 30 x 1block ...passed 00:11:37.999 Test: blockdev writev readv block ...passed 00:11:37.999 Test: blockdev writev readv size > 128k ...passed 00:11:37.999 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:37.999 Test: blockdev comparev and writev ...passed 00:11:37.999 Test: blockdev nvme passthru rw ...passed 00:11:37.999 Test: blockdev nvme 
passthru vendor specific ...passed 00:11:37.999 Test: blockdev nvme admin passthru ...passed 00:11:37.999 Test: blockdev copy ...passed 00:11:37.999 Suite: bdevio tests on: TestPT 00:11:37.999 Test: blockdev write read block ...passed 00:11:37.999 Test: blockdev write zeroes read block ...passed 00:11:37.999 Test: blockdev write zeroes read no split ...passed 00:11:37.999 Test: blockdev write zeroes read split ...passed 00:11:37.999 Test: blockdev write zeroes read split partial ...passed 00:11:37.999 Test: blockdev reset ...passed 00:11:37.999 Test: blockdev write read 8 blocks ...passed 00:11:37.999 Test: blockdev write read size > 128k ...passed 00:11:37.999 Test: blockdev write read invalid size ...passed 00:11:37.999 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:37.999 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:37.999 Test: blockdev write read max offset ...passed 00:11:37.999 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:37.999 Test: blockdev writev readv 8 blocks ...passed 00:11:37.999 Test: blockdev writev readv 30 x 1block ...passed 00:11:37.999 Test: blockdev writev readv block ...passed 00:11:37.999 Test: blockdev writev readv size > 128k ...passed 00:11:37.999 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:37.999 Test: blockdev comparev and writev ...passed 00:11:37.999 Test: blockdev nvme passthru rw ...passed 00:11:37.999 Test: blockdev nvme passthru vendor specific ...passed 00:11:37.999 Test: blockdev nvme admin passthru ...passed 00:11:37.999 Test: blockdev copy ...passed 00:11:37.999 Suite: bdevio tests on: Malloc2p7 00:11:37.999 Test: blockdev write read block ...passed 00:11:37.999 Test: blockdev write zeroes read block ...passed 00:11:38.259 Test: blockdev write zeroes read no split ...passed 00:11:38.259 Test: blockdev write zeroes read split ...passed 00:11:38.259 Test: blockdev write zeroes read split partial ...passed 00:11:38.259 Test: blockdev reset ...passed 00:11:38.259 Test: blockdev write read 8 blocks ...passed 00:11:38.259 Test: blockdev write read size > 128k ...passed 00:11:38.259 Test: blockdev write read invalid size ...passed 00:11:38.259 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.259 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.259 Test: blockdev write read max offset ...passed 00:11:38.259 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.259 Test: blockdev writev readv 8 blocks ...passed 00:11:38.259 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.259 Test: blockdev writev readv block ...passed 00:11:38.259 Test: blockdev writev readv size > 128k ...passed 00:11:38.259 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.259 Test: blockdev comparev and writev ...passed 00:11:38.259 Test: blockdev nvme passthru rw ...passed 00:11:38.259 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.259 Test: blockdev nvme admin passthru ...passed 00:11:38.259 Test: blockdev copy ...passed 00:11:38.259 Suite: bdevio tests on: Malloc2p6 00:11:38.259 Test: blockdev write read block ...passed 00:11:38.259 Test: blockdev write zeroes read block ...passed 00:11:38.259 Test: blockdev write zeroes read no split ...passed 00:11:38.259 Test: blockdev write zeroes read split ...passed 00:11:38.259 Test: blockdev write zeroes read split partial ...passed 00:11:38.259 Test: blockdev reset ...passed 00:11:38.259 
Test: blockdev write read 8 blocks ...passed 00:11:38.259 Test: blockdev write read size > 128k ...passed 00:11:38.259 Test: blockdev write read invalid size ...passed 00:11:38.259 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.259 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.259 Test: blockdev write read max offset ...passed 00:11:38.259 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.259 Test: blockdev writev readv 8 blocks ...passed 00:11:38.259 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.259 Test: blockdev writev readv block ...passed 00:11:38.259 Test: blockdev writev readv size > 128k ...passed 00:11:38.259 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.259 Test: blockdev comparev and writev ...passed 00:11:38.259 Test: blockdev nvme passthru rw ...passed 00:11:38.259 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.259 Test: blockdev nvme admin passthru ...passed 00:11:38.259 Test: blockdev copy ...passed 00:11:38.259 Suite: bdevio tests on: Malloc2p5 00:11:38.259 Test: blockdev write read block ...passed 00:11:38.259 Test: blockdev write zeroes read block ...passed 00:11:38.259 Test: blockdev write zeroes read no split ...passed 00:11:38.259 Test: blockdev write zeroes read split ...passed 00:11:38.259 Test: blockdev write zeroes read split partial ...passed 00:11:38.259 Test: blockdev reset ...passed 00:11:38.259 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 00:11:38.260 Suite: bdevio tests on: Malloc2p4 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 
00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 00:11:38.260 Suite: bdevio tests on: Malloc2p3 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 00:11:38.260 Suite: bdevio tests on: Malloc2p2 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 
00:11:38.260 Suite: bdevio tests on: Malloc2p1 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 00:11:38.260 Suite: bdevio tests on: Malloc2p0 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 00:11:38.260 Suite: bdevio tests on: Malloc1p1 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: 
blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.260 Test: blockdev writev readv 8 blocks ...passed 00:11:38.260 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.260 Test: blockdev writev readv block ...passed 00:11:38.260 Test: blockdev writev readv size > 128k ...passed 00:11:38.260 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.260 Test: blockdev comparev and writev ...passed 00:11:38.260 Test: blockdev nvme passthru rw ...passed 00:11:38.260 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.260 Test: blockdev nvme admin passthru ...passed 00:11:38.260 Test: blockdev copy ...passed 00:11:38.260 Suite: bdevio tests on: Malloc1p0 00:11:38.260 Test: blockdev write read block ...passed 00:11:38.260 Test: blockdev write zeroes read block ...passed 00:11:38.260 Test: blockdev write zeroes read no split ...passed 00:11:38.260 Test: blockdev write zeroes read split ...passed 00:11:38.260 Test: blockdev write zeroes read split partial ...passed 00:11:38.260 Test: blockdev reset ...passed 00:11:38.260 Test: blockdev write read 8 blocks ...passed 00:11:38.260 Test: blockdev write read size > 128k ...passed 00:11:38.260 Test: blockdev write read invalid size ...passed 00:11:38.260 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.260 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.260 Test: blockdev write read max offset ...passed 00:11:38.260 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.261 Test: blockdev writev readv 8 blocks ...passed 00:11:38.261 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.261 Test: blockdev writev readv block ...passed 00:11:38.261 Test: blockdev writev readv size > 128k ...passed 00:11:38.261 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.261 Test: blockdev comparev and writev ...passed 00:11:38.261 Test: blockdev nvme passthru rw ...passed 00:11:38.261 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.261 Test: blockdev nvme admin passthru ...passed 00:11:38.261 Test: blockdev copy ...passed 00:11:38.261 Suite: bdevio tests on: Malloc0 00:11:38.261 Test: blockdev write read block ...passed 00:11:38.261 Test: blockdev write zeroes read block ...passed 00:11:38.261 Test: blockdev write zeroes read no split ...passed 00:11:38.261 Test: blockdev write zeroes read split ...passed 00:11:38.261 Test: blockdev write zeroes read split partial ...passed 00:11:38.261 Test: blockdev reset ...passed 00:11:38.261 Test: blockdev write read 8 blocks ...passed 00:11:38.261 Test: blockdev write read size > 128k ...passed 00:11:38.261 Test: blockdev write read invalid size ...passed 00:11:38.261 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:11:38.261 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:11:38.261 Test: blockdev write read max offset ...passed 00:11:38.261 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:11:38.261 Test: blockdev writev readv 8 blocks ...passed 00:11:38.261 Test: blockdev writev readv 30 x 1block ...passed 00:11:38.261 Test: blockdev writev readv block ...passed 00:11:38.261 
Test: blockdev writev readv size > 128k ...passed 00:11:38.261 Test: blockdev writev readv size > 128k in two iovs ...passed 00:11:38.261 Test: blockdev comparev and writev ...passed 00:11:38.261 Test: blockdev nvme passthru rw ...passed 00:11:38.261 Test: blockdev nvme passthru vendor specific ...passed 00:11:38.261 Test: blockdev nvme admin passthru ...passed 00:11:38.261 Test: blockdev copy ...passed 00:11:38.261 00:11:38.261 Run Summary: Type Total Ran Passed Failed Inactive 00:11:38.261 suites 16 16 n/a 0 0 00:11:38.261 tests 368 368 368 0 0 00:11:38.261 asserts 2224 2224 2224 0 n/a 00:11:38.261 00:11:38.261 Elapsed time = 0.474 seconds 00:11:38.261 0 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1069349 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1069349 ']' 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1069349 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1069349 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1069349' 00:11:38.261 killing process with pid 1069349 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1069349 00:11:38.261 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1069349 00:11:38.520 06:27:51 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:11:38.520 00:11:38.520 real 0m1.532s 00:11:38.520 user 0m3.820s 00:11:38.520 sys 0m0.455s 00:11:38.520 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:38.520 06:27:51 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:11:38.520 ************************************ 00:11:38.520 END TEST bdev_bounds 00:11:38.520 ************************************ 00:11:38.520 06:27:52 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:11:38.520 06:27:52 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:38.520 06:27:52 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:38.520 06:27:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:38.520 ************************************ 00:11:38.520 START TEST bdev_nbd 00:11:38.520 ************************************ 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 
-- # uname -s 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1069641 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1069641 /var/tmp/spdk-nbd.sock 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1069641 ']' 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:38.520 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:11:38.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
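Note: for the nbd test the bdevs are served by bdev_svc over the /var/tmp/spdk-nbd.sock RPC socket, and each bdev is then exported as a kernel block device by the nbd_start_disk calls that follow. A minimal sketch of that export for one bdev (the /dev/nbd0 path is chosen here for illustration; the RPC can also pick a free device itself):

  # start the bdev service with the test config, listening on the nbd socket
  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json ./test/bdev/bdev.json &

  # export Malloc0 as /dev/nbd0, then tear it down again when finished
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0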
00:11:38.521 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:38.521 06:27:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:11:38.780 [2024-07-25 06:27:52.116753] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:11:38.780 [2024-07-25 06:27:52.116811] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.0 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.1 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.2 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.3 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.4 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.5 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.6 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:01.7 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.0 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.1 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.2 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.3 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.4 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.5 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.6 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3d:02.7 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.0 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.1 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.2 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.3 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.4 cannot be used 
00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.5 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.6 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:01.7 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.0 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.1 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.2 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.3 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.4 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.5 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.6 cannot be used 00:11:38.780 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:11:38.780 EAL: Requested device 0000:3f:02.7 cannot be used 00:11:38.780 [2024-07-25 06:27:52.247289] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:38.780 [2024-07-25 06:27:52.291713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:39.039 [2024-07-25 06:27:52.437072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:39.039 [2024-07-25 06:27:52.437130] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:11:39.039 [2024-07-25 06:27:52.437154] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:39.039 [2024-07-25 06:27:52.445079] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:39.039 [2024-07-25 06:27:52.445104] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:11:39.039 [2024-07-25 06:27:52.453092] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:39.039 [2024-07-25 06:27:52.453114] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:11:39.039 [2024-07-25 06:27:52.524490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:39.039 [2024-07-25 06:27:52.524538] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:39.039 [2024-07-25 06:27:52.524554] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac9940 00:11:39.039 [2024-07-25 06:27:52.524565] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:39.039 [2024-07-25 06:27:52.525822] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:39.039 [2024-07-25 06:27:52.525849] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:11:39.606 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@864 -- # return 0 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:39.607 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:39.865 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:39.866 1+0 records in 00:11:39.866 1+0 records out 00:11:39.866 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024576 s, 16.7 MB/s 00:11:39.866 06:27:53 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:39.866 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:40.123 1+0 records in 00:11:40.123 1+0 records out 00:11:40.123 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000205058 s, 20.0 MB/s 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:40.123 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:40.380 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:40.381 1+0 records in 00:11:40.381 1+0 records out 00:11:40.381 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196985 s, 20.8 MB/s 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:40.381 06:27:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:11:40.639 1+0 records in 00:11:40.639 1+0 records out 00:11:40.639 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328592 s, 12.5 MB/s 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:40.639 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:40.898 1+0 records in 00:11:40.898 1+0 records out 00:11:40.898 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00038698 s, 10.6 MB/s 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:40.898 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 
00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:41.157 1+0 records in 00:11:41.157 1+0 records out 00:11:41.157 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000393592 s, 10.4 MB/s 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:41.157 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:41.416 06:27:54 
blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:41.416 1+0 records in 00:11:41.416 1+0 records out 00:11:41.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297135 s, 13.8 MB/s 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:41.416 06:27:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:41.675 1+0 records in 00:11:41.675 1+0 records out 00:11:41.675 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000388103 s, 10.6 MB/s 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:41.675 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:41.675 06:27:55 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:41.934 1+0 records in 00:11:41.934 1+0 records out 00:11:41.934 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000418913 s, 9.8 MB/s 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:41.934 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:42.193 1+0 records in 00:11:42.193 1+0 records out 00:11:42.193 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000491891 s, 8.3 MB/s 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:42.193 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:42.452 1+0 records in 00:11:42.452 1+0 records out 00:11:42.452 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00040014 s, 10.2 MB/s 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:42.452 06:27:55 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:42.452 06:27:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:42.711 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:42.711 1+0 records in 00:11:42.711 1+0 records out 00:11:42.711 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000549439 s, 7.5 MB/s 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:42.712 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w 
nbd12 /proc/partitions 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:42.971 1+0 records in 00:11:42.971 1+0 records out 00:11:42.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000705017 s, 5.8 MB/s 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:42.971 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:43.230 1+0 records in 00:11:43.230 1+0 records out 00:11:43.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000693931 s, 5.9 MB/s 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:43.230 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:43.488 1+0 records in 00:11:43.488 1+0 records out 00:11:43.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000695214 s, 5.9 MB/s 00:11:43.488 06:27:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:43.488 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:43.747 06:27:57 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:43.747 1+0 records in 00:11:43.747 1+0 records out 00:11:43.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000790891 s, 5.2 MB/s 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:43.747 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:44.006 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd0", 00:11:44.006 "bdev_name": "Malloc0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd1", 00:11:44.006 "bdev_name": "Malloc1p0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd2", 00:11:44.006 "bdev_name": "Malloc1p1" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd3", 00:11:44.006 "bdev_name": "Malloc2p0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd4", 00:11:44.006 "bdev_name": "Malloc2p1" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd5", 00:11:44.006 "bdev_name": "Malloc2p2" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd6", 00:11:44.006 "bdev_name": "Malloc2p3" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd7", 00:11:44.006 "bdev_name": "Malloc2p4" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd8", 00:11:44.006 "bdev_name": "Malloc2p5" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd9", 00:11:44.006 "bdev_name": "Malloc2p6" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd10", 00:11:44.006 "bdev_name": "Malloc2p7" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd11", 00:11:44.006 "bdev_name": "TestPT" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd12", 00:11:44.006 "bdev_name": "raid0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd13", 00:11:44.006 "bdev_name": "concat0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd14", 00:11:44.006 "bdev_name": "raid1" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 
"nbd_device": "/dev/nbd15", 00:11:44.006 "bdev_name": "AIO0" 00:11:44.006 } 00:11:44.006 ]' 00:11:44.006 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:11:44.006 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd0", 00:11:44.006 "bdev_name": "Malloc0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd1", 00:11:44.006 "bdev_name": "Malloc1p0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd2", 00:11:44.006 "bdev_name": "Malloc1p1" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd3", 00:11:44.006 "bdev_name": "Malloc2p0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd4", 00:11:44.006 "bdev_name": "Malloc2p1" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd5", 00:11:44.006 "bdev_name": "Malloc2p2" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd6", 00:11:44.006 "bdev_name": "Malloc2p3" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd7", 00:11:44.006 "bdev_name": "Malloc2p4" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd8", 00:11:44.006 "bdev_name": "Malloc2p5" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd9", 00:11:44.006 "bdev_name": "Malloc2p6" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd10", 00:11:44.006 "bdev_name": "Malloc2p7" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd11", 00:11:44.006 "bdev_name": "TestPT" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd12", 00:11:44.006 "bdev_name": "raid0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd13", 00:11:44.006 "bdev_name": "concat0" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd14", 00:11:44.006 "bdev_name": "raid1" 00:11:44.006 }, 00:11:44.006 { 00:11:44.006 "nbd_device": "/dev/nbd15", 00:11:44.006 "bdev_name": "AIO0" 00:11:44.006 } 00:11:44.006 ]' 00:11:44.006 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:44.007 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:44.265 
06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:44.265 06:27:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:44.523 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:44.784 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:11:45.043 
06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:45.043 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:45.302 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:11:45.560 06:27:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:45.560 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:45.819 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:46.080 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:46.378 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:46.652 06:27:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd10 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:46.910 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:47.168 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:47.426 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@41 -- # break 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:47.683 06:28:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:47.683 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:11:47.940 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:11:47.940 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:11:47.940 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:11:47.940 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:47.941 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:47.941 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:11:47.941 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:47.941 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:47.941 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:47.941 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:48.197 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:48.198 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:48.198 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:48.198 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:48.198 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:48.198 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:48.198 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # 
count=0 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:48.455 06:28:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:11:48.714 /dev/nbd0 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:11:48.714 
06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:48.714 1+0 records in 00:11:48.714 1+0 records out 00:11:48.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259699 s, 15.8 MB/s 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:48.714 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:11:48.972 /dev/nbd1 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:48.972 1+0 records in 00:11:48.972 1+0 records out 00:11:48.972 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254947 s, 16.1 MB/s 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:48.972 
06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:48.972 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:48.973 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:11:49.232 /dev/nbd10 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:49.232 1+0 records in 00:11:49.232 1+0 records out 00:11:49.232 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282815 s, 14.5 MB/s 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:49.232 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:11:49.490 /dev/nbd11 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:11:49.490 06:28:02 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # local i 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:49.490 1+0 records in 00:11:49.490 1+0 records out 00:11:49.490 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339542 s, 12.1 MB/s 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:49.490 06:28:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:11:49.749 /dev/nbd12 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:49.749 1+0 records in 00:11:49.749 1+0 records out 00:11:49.749 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000348458 s, 11.8 MB/s 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.749 06:28:03 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:49.749 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:11:50.016 /dev/nbd13 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:50.016 1+0 records in 00:11:50.016 1+0 records out 00:11:50.016 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337215 s, 12.1 MB/s 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:50.016 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:11:50.273 /dev/nbd14 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:11:50.273 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:50.274 1+0 records in 00:11:50.274 1+0 records out 00:11:50.274 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000366756 s, 11.2 MB/s 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:50.274 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:11:50.531 /dev/nbd15 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:50.531 1+0 records in 00:11:50.531 1+0 records out 00:11:50.531 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00033247 s, 12.3 MB/s 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.531 06:28:03 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@886 -- # size=4096 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:50.531 06:28:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:11:50.788 /dev/nbd2 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:50.788 1+0 records in 00:11:50.788 1+0 records out 00:11:50.788 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000388235 s, 10.6 MB/s 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:50.788 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:11:51.045 /dev/nbd3 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 
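For readers following the trace: the loop being exercised above comes from nbd_start_disks in bdev/nbd_common.sh, which walks the bdev list and the NBD device list in lockstep and attaches each bdev over the JSON-RPC socket. A condensed sketch of that loop, paraphrased from the trace rather than copied from the SPDK sources:

    # Sketch of the start loop traced above (assumption: simplified paraphrase of
    # nbd_start_disks from bdev/nbd_common.sh, not the verbatim helper).
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    rpc_sock=/var/tmp/spdk-nbd.sock
    bdev_list=(Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3
               Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0)
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13
              /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5
              /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9)

    # Attach each bdev to its NBD node via the RPC socket, then wait for the
    # kernel device to become usable before moving on (waitfornbd is sketched
    # further below).
    for ((i = 0; i < ${#bdev_list[@]}; i++)); do
        "$rpc_py" -s "$rpc_sock" nbd_start_disk "${bdev_list[$i]}" "${nbd_list[$i]}"
        waitfornbd "$(basename "${nbd_list[$i]}")"
    done
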
00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:51.045 1+0 records in 00:11:51.045 1+0 records out 00:11:51.045 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00057282 s, 7.2 MB/s 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:51.045 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:11:51.303 /dev/nbd4 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:51.303 1+0 records in 00:11:51.303 1+0 records out 00:11:51.303 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000586726 s, 7.0 MB/s 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 
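The per-device readiness check that repeats throughout this trace is waitfornbd from test/common/autotest_common.sh: it waits for the device to appear in /proc/partitions and then proves it is readable by pulling one 4 KiB block with O_DIRECT. A hedged reconstruction follows; the retry sleeps are assumptions (the trace only shows each loop succeeding on its first pass) and the scratch-file path is shortened here:

    waitfornbd() {
        local nbd_name=$1 i size
        local scratch=/tmp/nbdtest           # stand-in for .../spdk/test/bdev/nbdtest
        # Wait for the kernel to list the device.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1                        # assumed back-off, not visible in the trace
        done
        # Read one 4 KiB block with O_DIRECT and confirm data actually arrived.
        for ((i = 1; i <= 20; i++)); do
            dd if="/dev/$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct
            size=$(stat -c %s "$scratch")
            rm -f "$scratch"
            if [ "$size" != 0 ]; then
                return 0
            fi
            sleep 0.1                        # assumed back-off
        done
        return 1
    }
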
00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:51.303 06:28:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:11:51.560 /dev/nbd5 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:51.560 1+0 records in 00:11:51.560 1+0 records out 00:11:51.560 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00062256 s, 6.6 MB/s 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:51.560 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:11:51.818 /dev/nbd6 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:51.818 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:51.818 1+0 records in 00:11:51.818 1+0 records out 00:11:51.819 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000642854 s, 6.4 MB/s 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:51.819 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:11:52.077 /dev/nbd7 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:52.077 1+0 records in 00:11:52.077 1+0 records out 00:11:52.077 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000505869 s, 8.1 MB/s 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:52.077 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:11:52.335 /dev/nbd8 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:52.335 1+0 records in 00:11:52.335 1+0 records out 00:11:52.335 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000752307 s, 5.4 MB/s 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:52.335 06:28:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:11:52.593 /dev/nbd9 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 
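Once all sixteen devices pass the readiness check, the remainder of the trace below queries nbd_get_disks for the device-to-bdev mapping, writes a 1 MiB random pattern through every NBD node, compares it back, and finally detaches each device. A condensed sketch of that verify-and-teardown flow (reusing rpc_py, rpc_sock and nbd_list from the start-loop sketch above; error handling and retries trimmed):

    pattern=/tmp/nbdrandtest                              # stand-in for the workspace scratch file
    dd if=/dev/urandom of="$pattern" bs=4096 count=256    # 1 MiB random payload

    for nbd in "${nbd_list[@]}"; do
        dd if="$pattern" of="$nbd" bs=4096 count=256 oflag=direct   # write through the NBD device
    done
    for nbd in "${nbd_list[@]}"; do
        cmp -b -n 1M "$pattern" "$nbd"                    # read back and byte-compare
    done
    rm "$pattern"

    # Detach each device and wait until it drops out of /proc/partitions.
    for nbd in "${nbd_list[@]}"; do
        "$rpc_py" -s "$rpc_sock" nbd_stop_disk "$nbd"
        name=$(basename "$nbd")
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions || break
            sleep 0.1                                     # assumed back-off
        done
    done
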
00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:52.594 1+0 records in 00:11:52.594 1+0 records out 00:11:52.594 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000559416 s, 7.3 MB/s 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:11:52.594 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd0", 00:11:52.851 "bdev_name": "Malloc0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd1", 00:11:52.851 "bdev_name": "Malloc1p0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd10", 00:11:52.851 "bdev_name": "Malloc1p1" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd11", 00:11:52.851 "bdev_name": "Malloc2p0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd12", 00:11:52.851 "bdev_name": "Malloc2p1" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd13", 00:11:52.851 "bdev_name": "Malloc2p2" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd14", 00:11:52.851 "bdev_name": "Malloc2p3" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd15", 00:11:52.851 "bdev_name": "Malloc2p4" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd2", 00:11:52.851 "bdev_name": "Malloc2p5" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd3", 00:11:52.851 "bdev_name": "Malloc2p6" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd4", 00:11:52.851 "bdev_name": "Malloc2p7" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd5", 00:11:52.851 "bdev_name": "TestPT" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd6", 00:11:52.851 "bdev_name": "raid0" 
00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd7", 00:11:52.851 "bdev_name": "concat0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd8", 00:11:52.851 "bdev_name": "raid1" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd9", 00:11:52.851 "bdev_name": "AIO0" 00:11:52.851 } 00:11:52.851 ]' 00:11:52.851 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd0", 00:11:52.851 "bdev_name": "Malloc0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd1", 00:11:52.851 "bdev_name": "Malloc1p0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd10", 00:11:52.851 "bdev_name": "Malloc1p1" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd11", 00:11:52.851 "bdev_name": "Malloc2p0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd12", 00:11:52.851 "bdev_name": "Malloc2p1" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd13", 00:11:52.851 "bdev_name": "Malloc2p2" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd14", 00:11:52.851 "bdev_name": "Malloc2p3" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd15", 00:11:52.851 "bdev_name": "Malloc2p4" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd2", 00:11:52.851 "bdev_name": "Malloc2p5" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd3", 00:11:52.851 "bdev_name": "Malloc2p6" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd4", 00:11:52.851 "bdev_name": "Malloc2p7" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd5", 00:11:52.851 "bdev_name": "TestPT" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd6", 00:11:52.851 "bdev_name": "raid0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd7", 00:11:52.851 "bdev_name": "concat0" 00:11:52.851 }, 00:11:52.851 { 00:11:52.851 "nbd_device": "/dev/nbd8", 00:11:52.852 "bdev_name": "raid1" 00:11:52.852 }, 00:11:52.852 { 00:11:52.852 "nbd_device": "/dev/nbd9", 00:11:52.852 "bdev_name": "AIO0" 00:11:52.852 } 00:11:52.852 ]' 00:11:52.852 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:53.108 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:11:53.108 /dev/nbd1 00:11:53.108 /dev/nbd10 00:11:53.108 /dev/nbd11 00:11:53.108 /dev/nbd12 00:11:53.108 /dev/nbd13 00:11:53.108 /dev/nbd14 00:11:53.108 /dev/nbd15 00:11:53.108 /dev/nbd2 00:11:53.108 /dev/nbd3 00:11:53.108 /dev/nbd4 00:11:53.108 /dev/nbd5 00:11:53.108 /dev/nbd6 00:11:53.108 /dev/nbd7 00:11:53.108 /dev/nbd8 00:11:53.108 /dev/nbd9' 00:11:53.108 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:11:53.108 /dev/nbd1 00:11:53.108 /dev/nbd10 00:11:53.108 /dev/nbd11 00:11:53.108 /dev/nbd12 00:11:53.108 /dev/nbd13 00:11:53.108 /dev/nbd14 00:11:53.108 /dev/nbd15 00:11:53.108 /dev/nbd2 00:11:53.108 /dev/nbd3 00:11:53.108 /dev/nbd4 00:11:53.108 /dev/nbd5 00:11:53.108 /dev/nbd6 00:11:53.108 /dev/nbd7 00:11:53.108 /dev/nbd8 00:11:53.108 /dev/nbd9' 00:11:53.108 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:53.108 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:11:53.108 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:11:53.108 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:11:53.108 
06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:11:53.109 256+0 records in 00:11:53.109 256+0 records out 00:11:53.109 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104456 s, 100 MB/s 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:11:53.109 256+0 records in 00:11:53.109 256+0 records out 00:11:53.109 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.166869 s, 6.3 MB/s 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:53.109 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:11:53.366 256+0 records in 00:11:53.366 256+0 records out 00:11:53.366 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168305 s, 6.2 MB/s 00:11:53.366 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:53.366 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:11:53.623 256+0 records in 00:11:53.623 256+0 records out 00:11:53.623 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168236 s, 6.2 MB/s 00:11:53.623 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:53.623 06:28:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:11:53.623 256+0 records in 00:11:53.623 256+0 records out 00:11:53.623 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167778 s, 6.2 MB/s 00:11:53.623 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:53.623 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:11:53.880 256+0 records in 00:11:53.880 256+0 records out 00:11:53.880 
1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167423 s, 6.3 MB/s 00:11:53.880 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:53.880 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:11:54.138 256+0 records in 00:11:54.138 256+0 records out 00:11:54.138 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168224 s, 6.2 MB/s 00:11:54.138 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.138 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:11:54.138 256+0 records in 00:11:54.138 256+0 records out 00:11:54.138 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168368 s, 6.2 MB/s 00:11:54.138 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.139 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:11:54.397 256+0 records in 00:11:54.397 256+0 records out 00:11:54.397 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.16822 s, 6.2 MB/s 00:11:54.397 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.397 06:28:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:11:54.654 256+0 records in 00:11:54.654 256+0 records out 00:11:54.654 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168368 s, 6.2 MB/s 00:11:54.654 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.655 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:11:54.655 256+0 records in 00:11:54.655 256+0 records out 00:11:54.655 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167499 s, 6.3 MB/s 00:11:54.655 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.655 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:11:54.912 256+0 records in 00:11:54.912 256+0 records out 00:11:54.912 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.168338 s, 6.2 MB/s 00:11:54.912 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.912 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:11:55.169 256+0 records in 00:11:55.169 256+0 records out 00:11:55.169 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167822 s, 6.2 MB/s 00:11:55.169 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:55.169 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:11:55.169 256+0 records in 00:11:55.169 256+0 records out 00:11:55.169 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.169053 s, 6.2 
MB/s 00:11:55.169 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:55.169 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:11:55.427 256+0 records in 00:11:55.428 256+0 records out 00:11:55.428 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.169364 s, 6.2 MB/s 00:11:55.428 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:55.428 06:28:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:11:55.685 256+0 records in 00:11:55.685 256+0 records out 00:11:55.685 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.172203 s, 6.1 MB/s 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:11:55.685 256+0 records in 00:11:55.685 256+0 records out 00:11:55.685 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.167415 s, 6.3 MB/s 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.685 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:11:55.942 
06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:11:55.942 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:55.943 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:56.199 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:56.199 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:56.199 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:56.199 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:56.199 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:56.200 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:56.200 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:56.200 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:56.200 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:56.200 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:56.457 06:28:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( 
i = 1 )) 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:56.714 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:56.972 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:57.230 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # 
return 0 00:11:57.488 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:57.489 06:28:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:11:57.489 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:11:57.489 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:11:57.489 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:11:57.489 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:57.489 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:57.489 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:57.746 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:58.003 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd3 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:58.261 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:11:58.525 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:11:58.525 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:11:58.525 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:11:58.525 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:58.525 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:58.525 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:11:58.526 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:58.526 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:58.526 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:58.526 06:28:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:58.831 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 
)) 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:59.089 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:59.347 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:59.604 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:59.605 06:28:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count 
/var/tmp/spdk-nbd.sock 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:59.862 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:12:00.119 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:12:00.120 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:12:00.120 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:12:00.120 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:12:00.120 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:12:00.376 malloc_lvol_verify 00:12:00.376 06:28:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:12:00.633 c719a3df-4694-4b5d-8477-fdeef6b19c78 00:12:00.891 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:12:00.891 4cd43971-8c3b-4176-98d5-4fb3241a23d4 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:12:01.148 /dev/nbd0 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:12:01.148 mke2fs 1.46.5 (30-Dec-2021) 00:12:01.148 Discarding device blocks: 0/4096 done 00:12:01.148 Creating filesystem with 4096 1k blocks and 1024 inodes 
00:12:01.148 00:12:01.148 Allocating group tables: 0/1 done 00:12:01.148 Writing inode tables: 0/1 done 00:12:01.148 Creating journal (1024 blocks): done 00:12:01.148 Writing superblocks and filesystem accounting information: 0/1 done 00:12:01.148 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:01.148 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1069641 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1069641 ']' 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1069641 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1069641 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1069641' 00:12:01.406 killing process with pid 1069641 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1069641 00:12:01.406 06:28:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1069641 00:12:01.973 06:28:15 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:12:01.973 00:12:01.973 real 0m23.330s 00:12:01.973 user 0m28.783s 00:12:01.973 sys 0m13.157s 00:12:01.973 06:28:15 
blockdev_general.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:01.973 06:28:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:12:01.973 ************************************ 00:12:01.973 END TEST bdev_nbd 00:12:01.973 ************************************ 00:12:01.973 06:28:15 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:12:01.973 06:28:15 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:12:01.973 06:28:15 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:12:01.973 06:28:15 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:12:01.973 06:28:15 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:01.973 06:28:15 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:01.973 06:28:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:01.973 ************************************ 00:12:01.973 START TEST bdev_fio 00:12:01.973 ************************************ 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:12:01.973 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:12:01.973 
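The bdev_fio trace that follows assembles a per-bdev fio job file and then launches fio through the SPDK bdev plugin. A minimal shell sketch of that flow, assuming the workspace paths and fio options seen in this run (the authoritative logic lives in test/bdev/blockdev.sh and autotest_common.sh):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    FIO_JOB=$SPDK/test/bdev/bdev.fio
    # Append one [job_<bdev>] section per bdev name reported by the target
    # (Malloc0, Malloc1p0 ... Malloc2p7, TestPT, raid0, concat0, raid1, AIO0);
    # only a subset is shown here for brevity.
    for b in Malloc0 Malloc1p0 raid0 concat0 raid1 AIO0; do
        echo "[job_${b}]"    >> "$FIO_JOB"
        echo "filename=${b}" >> "$FIO_JOB"
    done
    # Drive the SPDK bdev layer directly by preloading the spdk_bdev fio plugin.
    LD_PRELOAD=$SPDK/build/fio/spdk_bdev /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        --verify_state_save=0 \
        --spdk_json_conf=$SPDK/test/bdev/bdev.json \
        --spdk_mem=0 --aux-path=$SPDK/../output \
        "$FIO_JOB"
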
06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:12:01.973 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:12:02.231 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:02.232 06:28:15 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:12:02.232 ************************************ 00:12:02.232 START TEST bdev_fio_rw_verify 00:12:02.232 ************************************ 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:12:02.232 06:28:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:02.490 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 
4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.490 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:02.491 fio-3.35 00:12:02.491 Starting 16 threads 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:02.749 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:02.749 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:02.749 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:14.954 00:12:14.954 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1074587: Thu Jul 25 06:28:26 2024 00:12:14.954 read: IOPS=98.5k, BW=385MiB/s (404MB/s)(3850MiB/10001msec) 00:12:14.954 slat (nsec): min=1911, max=285385, avg=32998.76, stdev=13050.67 00:12:14.954 clat (usec): min=8, max=1860, avg=263.30, stdev=119.64 00:12:14.954 lat (usec): min=20, max=1907, avg=296.30, stdev=126.84 00:12:14.954 clat percentiles (usec): 00:12:14.954 | 50.000th=[ 258], 99.000th=[ 519], 99.900th=[ 635], 99.990th=[ 873], 00:12:14.954 | 99.999th=[ 1549] 00:12:14.954 write: IOPS=156k, BW=609MiB/s 
(639MB/s)(6008MiB/9863msec); 0 zone resets 00:12:14.954 slat (usec): min=7, max=4539, avg=44.58, stdev=13.92 00:12:14.954 clat (usec): min=10, max=5253, avg=308.40, stdev=140.05 00:12:14.954 lat (usec): min=33, max=5297, avg=352.98, stdev=146.96 00:12:14.954 clat percentiles (usec): 00:12:14.954 | 50.000th=[ 297], 99.000th=[ 660], 99.900th=[ 865], 99.990th=[ 971], 00:12:14.954 | 99.999th=[ 1467] 00:12:14.954 bw ( KiB/s): min=514904, max=795251, per=98.99%, avg=617416.58, stdev=4761.44, samples=304 00:12:14.954 iops : min=128726, max=198808, avg=154353.89, stdev=1190.32, samples=304 00:12:14.954 lat (usec) : 10=0.01%, 20=0.01%, 50=0.80%, 100=5.38%, 250=36.43% 00:12:14.954 lat (usec) : 500=51.02%, 750=6.05%, 1000=0.30% 00:12:14.954 lat (msec) : 2=0.01%, 4=0.01%, 10=0.01% 00:12:14.954 cpu : usr=99.25%, sys=0.35%, ctx=688, majf=0, minf=1907 00:12:14.954 IO depths : 1=12.5%, 2=24.9%, 4=50.1%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:12:14.954 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:14.954 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:14.954 issued rwts: total=985557,1537967,0,0 short=0,0,0,0 dropped=0,0,0,0 00:12:14.954 latency : target=0, window=0, percentile=100.00%, depth=8 00:12:14.954 00:12:14.954 Run status group 0 (all jobs): 00:12:14.954 READ: bw=385MiB/s (404MB/s), 385MiB/s-385MiB/s (404MB/s-404MB/s), io=3850MiB (4037MB), run=10001-10001msec 00:12:14.954 WRITE: bw=609MiB/s (639MB/s), 609MiB/s-609MiB/s (639MB/s-639MB/s), io=6008MiB (6300MB), run=9863-9863msec 00:12:14.955 00:12:14.955 real 0m11.804s 00:12:14.955 user 2m54.062s 00:12:14.955 sys 0m1.414s 00:12:14.955 06:28:27 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:14.955 06:28:27 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:12:14.955 ************************************ 00:12:14.955 END TEST bdev_fio_rw_verify 00:12:14.955 ************************************ 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:12:14.955 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:12:14.956 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "338b91d5-bbb4-4bfd-9c0b-9e616b856977"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "338b91d5-bbb4-4bfd-9c0b-9e616b856977",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ba827301-7bb7-50cd-a869-e535bc7f38ac"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ba827301-7bb7-50cd-a869-e535bc7f38ac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "015171f9-9c93-50bc-8d41-b8f5f4d3f310"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "015171f9-9c93-50bc-8d41-b8f5f4d3f310",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "0b91056f-0745-523b-ad1e-c775be7cdfc4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0b91056f-0745-523b-ad1e-c775be7cdfc4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "7859a930-eb52-5db1-8e5d-3a7bf7412197"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7859a930-eb52-5db1-8e5d-3a7bf7412197",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "681189be-89a4-5933-91d0-800d58ed2e4c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "681189be-89a4-5933-91d0-800d58ed2e4c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "405082da-e865-5482-a281-de2c65fdf73b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "405082da-e865-5482-a281-de2c65fdf73b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "9f6c1041-2e25-552d-be66-0e27f5e03785"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9f6c1041-2e25-552d-be66-0e27f5e03785",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "3c8a065c-0683-589e-8cbc-6b884c6bc769"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3c8a065c-0683-589e-8cbc-6b884c6bc769",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "67e03149-8029-5c88-b6e6-1da2182d2d96"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67e03149-8029-5c88-b6e6-1da2182d2d96",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f63574e3-b425-5ad5-98b0-bea75a97252d"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f63574e3-b425-5ad5-98b0-bea75a97252d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "cddf302a-7a8a-56a7-b023-6509cdf80886"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cddf302a-7a8a-56a7-b023-6509cdf80886",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "89ce9af9-561c-49e7-aca9-9fbbb191be80"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "89ce9af9-561c-49e7-aca9-9fbbb191be80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "89ce9af9-561c-49e7-aca9-9fbbb191be80",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' 
"uuid": "43257330-dde0-4dc5-85b6-35d151195ad7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "9d463d02-a610-4aec-b938-f7f1abd74b2b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "59443b36-1612-4d13-a201-d1fc3f25d882"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "59443b36-1612-4d13-a201-d1fc3f25d882",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "59443b36-1612-4d13-a201-d1fc3f25d882",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "900b82ef-32b6-4277-85ae-4f5ba50121d8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "120be681-7867-4763-ae92-ad071d53a99d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "a81b42bc-e16a-4331-aa98-a1823ca77013"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "a81b42bc-e16a-4331-aa98-a1823ca77013",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a81b42bc-e16a-4331-aa98-a1823ca77013",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' 
"num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "1e039826-4eae-4772-9258-8b49162586bd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "f90e272f-2ac8-456d-9a9a-1c910abcd216",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "012db33f-d156-4d62-9701-2c720581426b"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "012db33f-d156-4d62-9701-2c720581426b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:12:14.956 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:12:14.956 Malloc1p0 00:12:14.956 Malloc1p1 00:12:14.956 Malloc2p0 00:12:14.956 Malloc2p1 00:12:14.956 Malloc2p2 00:12:14.956 Malloc2p3 00:12:14.956 Malloc2p4 00:12:14.956 Malloc2p5 00:12:14.956 Malloc2p6 00:12:14.956 Malloc2p7 00:12:14.956 TestPT 00:12:14.956 raid0 00:12:14.956 concat0 ]] 00:12:14.956 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:12:14.957 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "338b91d5-bbb4-4bfd-9c0b-9e616b856977"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "338b91d5-bbb4-4bfd-9c0b-9e616b856977",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "ba827301-7bb7-50cd-a869-e535bc7f38ac"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ba827301-7bb7-50cd-a869-e535bc7f38ac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "015171f9-9c93-50bc-8d41-b8f5f4d3f310"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "015171f9-9c93-50bc-8d41-b8f5f4d3f310",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "0b91056f-0745-523b-ad1e-c775be7cdfc4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0b91056f-0745-523b-ad1e-c775be7cdfc4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "7859a930-eb52-5db1-8e5d-3a7bf7412197"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7859a930-eb52-5db1-8e5d-3a7bf7412197",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' 
"681189be-89a4-5933-91d0-800d58ed2e4c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "681189be-89a4-5933-91d0-800d58ed2e4c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "405082da-e865-5482-a281-de2c65fdf73b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "405082da-e865-5482-a281-de2c65fdf73b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "9f6c1041-2e25-552d-be66-0e27f5e03785"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9f6c1041-2e25-552d-be66-0e27f5e03785",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "3c8a065c-0683-589e-8cbc-6b884c6bc769"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3c8a065c-0683-589e-8cbc-6b884c6bc769",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "67e03149-8029-5c88-b6e6-1da2182d2d96"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "67e03149-8029-5c88-b6e6-1da2182d2d96",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f63574e3-b425-5ad5-98b0-bea75a97252d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f63574e3-b425-5ad5-98b0-bea75a97252d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "cddf302a-7a8a-56a7-b023-6509cdf80886"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cddf302a-7a8a-56a7-b023-6509cdf80886",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "89ce9af9-561c-49e7-aca9-9fbbb191be80"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' 
"num_blocks": 131072,' ' "uuid": "89ce9af9-561c-49e7-aca9-9fbbb191be80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "89ce9af9-561c-49e7-aca9-9fbbb191be80",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "43257330-dde0-4dc5-85b6-35d151195ad7",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "9d463d02-a610-4aec-b938-f7f1abd74b2b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "59443b36-1612-4d13-a201-d1fc3f25d882"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "59443b36-1612-4d13-a201-d1fc3f25d882",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "59443b36-1612-4d13-a201-d1fc3f25d882",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "900b82ef-32b6-4277-85ae-4f5ba50121d8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "120be681-7867-4763-ae92-ad071d53a99d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' 
"a81b42bc-e16a-4331-aa98-a1823ca77013"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "a81b42bc-e16a-4331-aa98-a1823ca77013",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a81b42bc-e16a-4331-aa98-a1823ca77013",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "1e039826-4eae-4772-9258-8b49162586bd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "f90e272f-2ac8-456d-9a9a-1c910abcd216",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "012db33f-d156-4d62-9701-2c720581426b"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "012db33f-d156-4d62-9701-2c720581426b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:12:14.957 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.957 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:12:14.957 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:12:14.957 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' 
"${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:14.958 06:28:27 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:12:14.958 ************************************ 00:12:14.958 START TEST bdev_fio_trim 00:12:14.958 ************************************ 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1343 -- # local asan_lib= 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:12:14.958 06:28:27 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:12:14.958 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 
job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:12:14.958 fio-3.35 00:12:14.958 Starting 14 threads 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:14.958 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.958 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:14.959 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:14.959 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:14.959 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:27.170 00:12:27.170 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1076837: Thu Jul 25 06:28:38 2024 00:12:27.170 write: IOPS=141k, BW=552MiB/s (579MB/s)(5521MiB/10001msec); 0 zone resets 00:12:27.170 slat (usec): min=9, max=504, avg=35.42, stdev= 8.85 00:12:27.170 clat (usec): min=44, max=3270, avg=245.01, stdev=81.34 00:12:27.170 lat (usec): min=59, max=3306, avg=280.43, stdev=84.17 00:12:27.170 clat percentiles (usec): 00:12:27.170 | 50.000th=[ 239], 99.000th=[ 424], 99.900th=[ 545], 99.990th=[ 660], 00:12:27.170 | 99.999th=[ 1004] 00:12:27.170 bw ( KiB/s): min=496384, max=734502, per=100.00%, avg=567227.79, stdev=5301.58, samples=266 00:12:27.170 iops : min=124096, max=183623, avg=141806.68, stdev=1325.38, samples=266 00:12:27.170 trim: IOPS=141k, BW=552MiB/s (579MB/s)(5521MiB/10001msec); 0 zone resets 00:12:27.170 slat (usec): min=5, max=530, avg=24.06, stdev= 5.99 00:12:27.170 clat (usec): min=16, max=3307, avg=280.60, stdev=84.17 00:12:27.170 lat (usec): min=36, max=3322, avg=304.66, stdev=86.16 00:12:27.170 clat percentiles (usec): 00:12:27.170 | 50.000th=[ 273], 99.000th=[ 465], 99.900th=[ 603], 99.990th=[ 725], 00:12:27.170 | 99.999th=[ 1090] 00:12:27.170 bw ( KiB/s): min=496384, max=734502, per=100.00%, avg=567227.79, stdev=5301.59, samples=266 00:12:27.170 iops : min=124096, max=183623, avg=141806.79, stdev=1325.37, samples=266 00:12:27.170 lat (usec) : 20=0.01%, 50=0.01%, 100=0.69%, 250=46.96%, 500=52.00% 00:12:27.170 lat (usec) : 750=0.35%, 1000=0.01% 00:12:27.170 lat (msec) : 2=0.01%, 4=0.01% 00:12:27.170 cpu : usr=99.60%, sys=0.00%, ctx=660, majf=0, minf=1046 00:12:27.170 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:12:27.170 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:27.170 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:27.171 issued rwts: total=0,1413444,1413446,0 short=0,0,0,0 dropped=0,0,0,0 00:12:27.171 latency : target=0, window=0, percentile=100.00%, 
depth=8 00:12:27.171 00:12:27.171 Run status group 0 (all jobs): 00:12:27.171 WRITE: bw=552MiB/s (579MB/s), 552MiB/s-552MiB/s (579MB/s-579MB/s), io=5521MiB (5789MB), run=10001-10001msec 00:12:27.171 TRIM: bw=552MiB/s (579MB/s), 552MiB/s-552MiB/s (579MB/s-579MB/s), io=5521MiB (5789MB), run=10001-10001msec 00:12:27.171 00:12:27.171 real 0m11.448s 00:12:27.171 user 2m33.500s 00:12:27.171 sys 0m0.747s 00:12:27.171 06:28:39 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:27.171 06:28:39 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:12:27.171 ************************************ 00:12:27.171 END TEST bdev_fio_trim 00:12:27.171 ************************************ 00:12:27.171 06:28:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:12:27.171 06:28:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:12:27.171 06:28:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:12:27.171 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:12:27.171 06:28:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:12:27.171 00:12:27.171 real 0m23.674s 00:12:27.171 user 5m27.794s 00:12:27.171 sys 0m2.375s 00:12:27.171 06:28:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:27.171 06:28:39 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:12:27.171 ************************************ 00:12:27.171 END TEST bdev_fio 00:12:27.171 ************************************ 00:12:27.171 06:28:39 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:12:27.171 06:28:39 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:12:27.171 06:28:39 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:12:27.171 06:28:39 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:27.171 06:28:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:27.171 ************************************ 00:12:27.171 START TEST bdev_verify 00:12:27.171 ************************************ 00:12:27.171 06:28:39 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:12:27.171 [2024-07-25 06:28:39.267827] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
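The trim pass above is driven entirely by the bdev JSON dumped earlier: blockdev.sh keeps only bdevs whose supported_io_types.unmap is true (AIO0 and raid1 report unmap=false and are therefore absent from the job list), emits one [job_<bdev>] section with a matching filename= line per surviving bdev, and hands the file to fio with the SPDK bdev ioengine preloaded from build/fio/spdk_bdev. A minimal standalone sketch of the same flow follows; the ./spdk prefix, the ./bdev.json path and the rpc.py query are illustrative assumptions, since the harness filters an already-captured ${bdevs[@]} array rather than a running target.

  # Sketch only: paths and the rpc.py step are assumptions for illustration.
  cfg=./bdev.json                    # SPDK bdev JSON config (same role as bdev.json above)
  job=./trim.fio
  plugin=./spdk/build/fio/spdk_bdev  # fio ioengine plugin built with SPDK

  : > "$job"
  # Keep only bdevs that advertise unmap support, one [job_<name>] section each,
  # mirroring the jq filter used by blockdev.sh above.
  ./spdk/scripts/rpc.py bdev_get_bdevs \
    | jq -r '.[] | select(.supported_io_types.unmap == true) | .name' \
    | while read -r b; do
        printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> "$job"
      done

  # Run the trim workload through the SPDK bdev ioengine, as the log above does.
  LD_PRELOAD="$plugin" fio --ioengine=spdk_bdev --spdk_json_conf="$cfg" \
      --thread=1 --rw=trimwrite --bs=4k --iodepth=8 --runtime=10 "$job"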
00:12:27.171 [2024-07-25 06:28:39.267881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1078558 ] 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:27.171 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:27.171 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:27.171 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:27.171 [2024-07-25 06:28:39.402307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:12:27.171 [2024-07-25 06:28:39.446670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:27.171 [2024-07-25 06:28:39.446675] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:27.171 [2024-07-25 06:28:39.578903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:27.171 [2024-07-25 06:28:39.578947] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:12:27.171 [2024-07-25 06:28:39.578961] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:12:27.171 [2024-07-25 06:28:39.586916] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:27.171 [2024-07-25 06:28:39.586941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:27.171 [2024-07-25 06:28:39.594929] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:27.171 [2024-07-25 06:28:39.594951] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:27.171 [2024-07-25 06:28:39.665783] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:27.171 [2024-07-25 06:28:39.665830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:27.171 [2024-07-25 06:28:39.665847] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27c7fb0 00:12:27.171 [2024-07-25 06:28:39.665859] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:27.171 [2024-07-25 06:28:39.667127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:27.171 [2024-07-25 06:28:39.667161] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:12:27.171 Running I/O for 5 seconds... 
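The verify stage running here uses the bdevperf example application rather than fio: build/examples/bdevperf loads the same bdev.json, drives every bdev with a queue depth of 128 and 4096-byte I/Os in the verify workload for 5 seconds, and runs on core mask 0x3, which is why two reactors start and each bdev appears twice in the result table that follows (Core Mask 0x1 and 0x2). A standalone equivalent, assuming an SPDK tree checked out and built at ./spdk:

  # Sketch only: the ./spdk prefix is an assumed build location.
  #   --json   : bdev configuration to load at startup
  #   -q 128   : queue depth per job
  #   -o 4096  : I/O size in bytes (the big_io stage later uses -o 65536)
  #   -w verify: write a data pattern, read it back, and check it
  #   -t 5     : run time in seconds
  #   -m 0x3   : core mask (two reactors, so each bdev is driven from both cores)
  #   -C       : passed through unchanged, exactly as the test script does
  ./spdk/build/examples/bdevperf --json ./spdk/test/bdev/bdev.json \
      -q 128 -o 4096 -w verify -t 5 -C -m 0x3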
00:12:32.498 00:12:32.498 Latency(us) 00:12:32.498 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:32.498 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x1000 00:12:32.498 Malloc0 : 5.21 1105.68 4.32 0.00 0.00 115523.20 678.30 234881.02 00:12:32.498 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x1000 length 0x1000 00:12:32.498 Malloc0 : 5.20 1083.23 4.23 0.00 0.00 117924.02 557.06 377487.36 00:12:32.498 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x800 00:12:32.498 Malloc1p0 : 5.21 564.78 2.21 0.00 0.00 225334.96 3316.12 219781.53 00:12:32.498 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x800 length 0x800 00:12:32.498 Malloc1p0 : 5.20 566.00 2.21 0.00 0.00 224900.93 3316.12 198810.01 00:12:32.498 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x800 00:12:32.498 Malloc1p1 : 5.22 564.43 2.20 0.00 0.00 224768.41 3329.23 216426.09 00:12:32.498 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x800 length 0x800 00:12:32.498 Malloc1p1 : 5.20 565.76 2.21 0.00 0.00 224256.75 3329.23 192937.98 00:12:32.498 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p0 : 5.22 564.07 2.20 0.00 0.00 224200.62 3355.44 207198.62 00:12:32.498 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p0 : 5.21 565.52 2.21 0.00 0.00 223622.91 3355.44 189582.54 00:12:32.498 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p1 : 5.22 563.72 2.20 0.00 0.00 223625.63 3434.09 202165.45 00:12:32.498 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p1 : 5.21 565.30 2.21 0.00 0.00 222993.33 3381.66 183710.52 00:12:32.498 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p2 : 5.23 563.37 2.20 0.00 0.00 223041.63 3381.66 197132.29 00:12:32.498 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p2 : 5.21 565.07 2.21 0.00 0.00 222365.64 3355.44 175321.91 00:12:32.498 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p3 : 5.23 563.02 2.20 0.00 0.00 222457.99 3355.44 191260.26 00:12:32.498 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p3 : 5.21 564.73 2.21 0.00 0.00 221780.72 3329.23 170288.74 00:12:32.498 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p4 : 5.23 562.81 2.20 0.00 0.00 221861.79 
3237.48 188743.68 00:12:32.498 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p4 : 5.22 564.38 2.20 0.00 0.00 221233.52 3250.59 167772.16 00:12:32.498 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p5 : 5.23 562.60 2.20 0.00 0.00 221271.35 2831.16 187065.96 00:12:32.498 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p5 : 5.22 564.02 2.20 0.00 0.00 220732.17 2608.33 165255.58 00:12:32.498 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p6 : 5.23 562.38 2.20 0.00 0.00 220762.34 2569.01 183710.52 00:12:32.498 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p6 : 5.22 563.67 2.20 0.00 0.00 220382.80 2595.23 166094.44 00:12:32.498 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x200 00:12:32.498 Malloc2p7 : 5.24 562.17 2.20 0.00 0.00 220340.05 2542.80 182871.65 00:12:32.498 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x200 length 0x200 00:12:32.498 Malloc2p7 : 5.23 563.32 2.20 0.00 0.00 220023.05 2621.44 166094.44 00:12:32.498 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x1000 00:12:32.498 TestPT : 5.24 561.96 2.20 0.00 0.00 219846.20 2424.83 174483.05 00:12:32.498 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x1000 length 0x1000 00:12:32.498 TestPT : 5.26 540.52 2.11 0.00 0.00 227792.07 11219.76 244947.35 00:12:32.498 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x2000 00:12:32.498 raid0 : 5.24 561.75 2.19 0.00 0.00 219277.22 2503.48 172805.32 00:12:32.498 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x2000 length 0x2000 00:12:32.498 raid0 : 5.27 582.91 2.28 0.00 0.00 211586.44 2555.90 155189.25 00:12:32.498 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x2000 00:12:32.498 concat0 : 5.27 582.46 2.28 0.00 0.00 211033.34 2555.90 171966.46 00:12:32.498 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x2000 length 0x2000 00:12:32.498 concat0 : 5.27 582.68 2.28 0.00 0.00 211185.55 2503.48 163577.86 00:12:32.498 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.498 Verification LBA range: start 0x0 length 0x1000 00:12:32.498 raid1 : 5.28 582.09 2.27 0.00 0.00 210690.19 2700.08 177838.49 00:12:32.498 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:12:32.499 Verification LBA range: start 0x1000 length 0x1000 00:12:32.499 raid1 : 5.27 582.42 2.28 0.00 0.00 210828.88 3106.41 170288.74 00:12:32.499 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:12:32.499 Verification LBA range: 
start 0x0 length 0x4e2
00:12:32.499 AIO0 : 5.28 581.70 2.27 0.00 0.00 210458.28 1264.84 181193.93
00:12:32.499 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:12:32.499 Verification LBA range: start 0x4e2 length 0x4e2
00:12:32.499 AIO0 : 5.28 582.10 2.27 0.00 0.00 210473.04 1205.86 176999.63
00:12:32.499 ===================================================================================================================
00:12:32.499 Total : 19210.59 75.04 0.00 0.00 208002.07 557.06 377487.36
00:12:32.499
00:12:32.499 real 0m6.303s
00:12:32.499 user 0m11.809s
00:12:32.499 sys 0m0.384s
00:12:32.499 06:28:45 blockdev_general.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable
00:12:32.499 06:28:45 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:12:32.499 ************************************
00:12:32.499 END TEST bdev_verify
00:12:32.499 ************************************
00:12:32.499 06:28:45 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:12:32.499 06:28:45 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']'
00:12:32.499 06:28:45 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable
00:12:32.499 06:28:45 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:12:32.499 ************************************
00:12:32.499 START TEST bdev_verify_big_io
00:12:32.499 ************************************
00:12:32.499 06:28:45 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:12:32.499 [2024-07-25 06:28:45.711987] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
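For reference: the bdev_verify_big_io run traced here (like the bdev_verify run before it) drives bdevperf against the bdev set described by test/bdev/bdev.json. The contents of that file are not captured in this log; the sketch below only illustrates the general --json config shape that the bdev_json_nonenclosed and bdev_json_nonarray cases later in this log check for (a top-level JSON object whose "subsystems" key is an array). The /tmp path, the Malloc parameters, and running bdevperf by hand are assumptions, not part of the recorded test run.

# Illustrative config shape only -- not the actual test/bdev/bdev.json used by the autotest run.
cat > /tmp/bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF
# A verify run of the same shape as the one traced above (path relative to the SPDK repo):
./build/examples/bdevperf --json /tmp/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3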
00:12:32.499 [2024-07-25 06:28:45.712112] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1079701 ] 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:32.499 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:32.499 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:32.499 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:32.499 [2024-07-25 06:28:45.926416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:12:32.499 [2024-07-25 06:28:45.971195] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:32.499 [2024-07-25 06:28:45.971200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:32.759 [2024-07-25 06:28:46.109092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:32.759 [2024-07-25 06:28:46.109142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:12:32.759 [2024-07-25 06:28:46.109156] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:12:32.759 [2024-07-25 06:28:46.117102] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:32.759 [2024-07-25 06:28:46.117127] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:32.759 [2024-07-25 06:28:46.125114] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:32.759 [2024-07-25 06:28:46.125136] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:32.759 [2024-07-25 06:28:46.196395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:32.759 [2024-07-25 06:28:46.196444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:32.759 [2024-07-25 06:28:46.196461] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1934fb0 00:12:32.759 [2024-07-25 06:28:46.196472] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:32.759 [2024-07-25 06:28:46.197744] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:32.759 [2024-07-25 06:28:46.197771] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:12:33.018 [2024-07-25 06:28:46.368556] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). 
Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.369680] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.371327] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.372445] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.374161] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.375305] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.376995] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.378709] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.379860] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.381556] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.382509] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.383826] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.384679] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.385984] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.386832] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.388159] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:12:33.018 [2024-07-25 06:28:46.408396] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:12:33.018 [2024-07-25 06:28:46.410041] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:12:33.018 Running I/O for 5 seconds... 00:12:41.141 00:12:41.141 Latency(us) 00:12:41.141 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:41.141 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x100 00:12:41.141 Malloc0 : 5.66 158.35 9.90 0.00 0.00 792614.71 829.03 2201170.74 00:12:41.141 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x100 length 0x100 00:12:41.141 Malloc0 : 5.91 151.52 9.47 0.00 0.00 828956.50 825.75 2550136.83 00:12:41.141 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x80 00:12:41.141 Malloc1p0 : 6.16 70.83 4.43 0.00 0.00 1650130.51 2870.48 2617245.70 00:12:41.141 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x80 length 0x80 00:12:41.141 Malloc1p0 : 6.36 60.33 3.77 0.00 0.00 1957484.89 2634.55 2966211.79 00:12:41.141 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x80 00:12:41.141 Malloc1p1 : 6.70 38.20 2.39 0.00 0.00 2930312.84 1415.58 5100273.66 00:12:41.141 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x80 length 0x80 00:12:41.141 Malloc1p1 : 6.75 37.93 2.37 0.00 0.00 2944851.78 1422.13 5046586.57 00:12:41.141 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x20 00:12:41.141 Malloc2p0 : 6.16 25.98 1.62 0.00 0.00 1089699.79 586.55 1879048.19 00:12:41.141 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x20 length 0x20 00:12:41.141 Malloc2p0 : 6.16 25.98 1.62 0.00 0.00 1082419.09 589.82 1650878.05 00:12:41.141 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA 
range: start 0x0 length 0x20 00:12:41.141 Malloc2p1 : 6.16 25.97 1.62 0.00 0.00 1080872.34 573.44 1852204.65 00:12:41.141 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x20 length 0x20 00:12:41.141 Malloc2p1 : 6.16 25.97 1.62 0.00 0.00 1073154.67 589.82 1630745.40 00:12:41.141 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x20 00:12:41.141 Malloc2p2 : 6.16 25.96 1.62 0.00 0.00 1070947.34 576.72 1825361.10 00:12:41.141 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x20 length 0x20 00:12:41.141 Malloc2p2 : 6.26 28.10 1.76 0.00 0.00 998462.32 579.99 1603901.85 00:12:41.141 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x20 00:12:41.141 Malloc2p3 : 6.16 25.95 1.62 0.00 0.00 1061391.41 583.27 1798517.56 00:12:41.141 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x20 length 0x20 00:12:41.141 Malloc2p3 : 6.27 28.08 1.76 0.00 0.00 990459.78 579.99 1583769.19 00:12:41.141 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x0 length 0x20 00:12:41.141 Malloc2p4 : 6.17 25.95 1.62 0.00 0.00 1051952.69 573.44 1771674.01 00:12:41.141 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.141 Verification LBA range: start 0x20 length 0x20 00:12:41.141 Malloc2p4 : 6.27 28.07 1.75 0.00 0.00 982472.53 583.27 1556925.64 00:12:41.142 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x20 00:12:41.142 Malloc2p5 : 6.26 28.10 1.76 0.00 0.00 974433.55 566.89 1744830.46 00:12:41.142 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x20 length 0x20 00:12:41.142 Malloc2p5 : 6.27 28.07 1.75 0.00 0.00 973995.46 586.55 1536792.99 00:12:41.142 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x20 00:12:41.142 Malloc2p6 : 6.26 28.10 1.76 0.00 0.00 966194.71 593.10 1717986.92 00:12:41.142 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x20 length 0x20 00:12:41.142 Malloc2p6 : 6.27 28.06 1.75 0.00 0.00 965145.42 586.55 1516660.33 00:12:41.142 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x20 00:12:41.142 Malloc2p7 : 6.27 28.09 1.76 0.00 0.00 957848.25 579.99 1691143.37 00:12:41.142 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x20 length 0x20 00:12:41.142 Malloc2p7 : 6.27 28.05 1.75 0.00 0.00 956445.57 583.27 1489816.78 00:12:41.142 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x100 00:12:41.142 TestPT : 6.75 38.21 2.39 0.00 0.00 2675063.69 104857.60 3946001.20 00:12:41.142 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x100 length 0x100 00:12:41.142 TestPT : 6.54 36.71 2.29 0.00 0.00 2821491.70 100663.30 3704409.29 00:12:41.142 Job: raid0 
(Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x200 00:12:41.142 raid0 : 6.82 42.23 2.64 0.00 0.00 2347626.64 1461.45 4563402.75 00:12:41.142 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x200 length 0x200 00:12:41.142 raid0 : 6.83 42.20 2.64 0.00 0.00 2363812.47 1441.79 4482872.12 00:12:41.142 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x200 00:12:41.142 concat0 : 6.82 49.26 3.08 0.00 0.00 1988655.95 1435.24 4375497.93 00:12:41.142 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x200 length 0x200 00:12:41.142 concat0 : 6.83 46.85 2.93 0.00 0.00 2100503.21 1454.90 4321810.84 00:12:41.142 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x100 00:12:41.142 raid1 : 6.76 62.77 3.92 0.00 0.00 1535549.87 1861.22 4214436.66 00:12:41.142 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x100 length 0x100 00:12:41.142 raid1 : 6.82 60.99 3.81 0.00 0.00 1577265.01 1874.33 4160749.57 00:12:41.142 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x0 length 0x4e 00:12:41.142 AIO0 : 6.83 67.10 4.19 0.00 0.00 857163.91 530.84 2670932.79 00:12:41.142 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:12:41.142 Verification LBA range: start 0x4e length 0x4e 00:12:41.142 AIO0 : 6.83 69.71 4.36 0.00 0.00 822372.57 737.28 2389075.56 00:12:41.142 =================================================================================================================== 00:12:41.142 Total : 1467.68 91.73 0.00 0.00 1428236.79 530.84 5100273.66 00:12:41.142 00:12:41.142 real 0m8.103s 00:12:41.142 user 0m15.120s 00:12:41.142 sys 0m0.459s 00:12:41.142 06:28:53 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:41.142 06:28:53 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:12:41.142 ************************************ 00:12:41.142 END TEST bdev_verify_big_io 00:12:41.142 ************************************ 00:12:41.142 06:28:53 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:12:41.142 06:28:53 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:12:41.142 06:28:53 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:41.142 06:28:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:41.142 ************************************ 00:12:41.142 START TEST bdev_write_zeroes 00:12:41.142 ************************************ 00:12:41.142 06:28:53 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:12:41.142 [2024-07-25 06:28:53.858842] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:12:41.142 [2024-07-25 06:28:53.858899] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1081165 ] 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:41.142 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:41.142 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:41.142 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:41.142 [2024-07-25 06:28:53.993207] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:41.142 [2024-07-25 06:28:54.037050] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:41.142 [2024-07-25 06:28:54.178263] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:41.142 [2024-07-25 06:28:54.178313] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:12:41.142 [2024-07-25 06:28:54.178327] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:12:41.142 [2024-07-25 06:28:54.186268] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:41.142 [2024-07-25 06:28:54.186293] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:41.143 [2024-07-25 06:28:54.194279] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:41.143 [2024-07-25 06:28:54.194301] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:41.143 [2024-07-25 06:28:54.265586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:41.143 [2024-07-25 06:28:54.265636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:41.143 [2024-07-25 06:28:54.265651] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb9880 00:12:41.143 [2024-07-25 06:28:54.265663] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:41.143 [2024-07-25 06:28:54.266920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:41.143 [2024-07-25 06:28:54.266950] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:12:41.143 Running I/O for 1 seconds... 
00:12:42.079 00:12:42.079 Latency(us) 00:12:42.080 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:42.080 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc0 : 1.04 5400.85 21.10 0.00 0.00 23675.34 606.21 39426.46 00:12:42.080 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc1p0 : 1.04 5393.68 21.07 0.00 0.00 23669.40 822.48 38587.60 00:12:42.080 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc1p1 : 1.05 5386.62 21.04 0.00 0.00 23657.45 822.48 37958.45 00:12:42.080 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p0 : 1.05 5379.48 21.01 0.00 0.00 23641.00 825.75 37119.59 00:12:42.080 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p1 : 1.05 5372.47 20.99 0.00 0.00 23622.90 845.41 36280.73 00:12:42.080 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p2 : 1.05 5365.45 20.96 0.00 0.00 23605.70 822.48 35441.87 00:12:42.080 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p3 : 1.05 5358.36 20.93 0.00 0.00 23587.62 825.75 34603.01 00:12:42.080 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p4 : 1.05 5351.40 20.90 0.00 0.00 23572.11 825.75 33764.15 00:12:42.080 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p5 : 1.05 5344.46 20.88 0.00 0.00 23556.15 825.75 32925.29 00:12:42.080 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p6 : 1.06 5337.43 20.85 0.00 0.00 23538.98 819.20 32086.43 00:12:42.080 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 Malloc2p7 : 1.06 5330.52 20.82 0.00 0.00 23518.75 832.31 31247.56 00:12:42.080 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 TestPT : 1.06 5323.60 20.80 0.00 0.00 23501.00 871.63 30408.70 00:12:42.080 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 raid0 : 1.06 5315.66 20.76 0.00 0.00 23479.87 1494.22 28940.70 00:12:42.080 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 concat0 : 1.06 5307.87 20.73 0.00 0.00 23437.90 1481.11 27472.69 00:12:42.080 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 raid1 : 1.06 5298.14 20.70 0.00 0.00 23384.57 2372.40 24956.11 00:12:42.080 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:12:42.080 AIO0 : 1.06 5292.18 20.67 0.00 0.00 23301.30 956.83 24222.11 00:12:42.080 =================================================================================================================== 00:12:42.080 Total : 85558.18 334.21 0.00 0.00 23546.88 606.21 39426.46 00:12:42.339 00:12:42.339 real 0m2.067s 00:12:42.339 user 0m1.673s 00:12:42.339 sys 0m0.330s 00:12:42.339 06:28:55 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:42.339 06:28:55 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:12:42.339 ************************************ 00:12:42.339 END TEST bdev_write_zeroes 00:12:42.339 ************************************ 00:12:42.599 06:28:55 blockdev_general 
-- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:12:42.599 06:28:55 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:12:42.599 06:28:55 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:42.599 06:28:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:42.599 ************************************ 00:12:42.599 START TEST bdev_json_nonenclosed 00:12:42.599 ************************************ 00:12:42.599 06:28:55 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:12:42.599 [2024-07-25 06:28:56.016211] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:12:42.599 [2024-07-25 06:28:56.016269] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1081455 ] 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:12:42.599 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:42.599 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:42.599 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:42.599 [2024-07-25 06:28:56.154037] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:42.859 [2024-07-25 06:28:56.197277] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:42.859 [2024-07-25 06:28:56.197344] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:12:42.859 [2024-07-25 06:28:56.197360] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:12:42.859 [2024-07-25 06:28:56.197372] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:12:42.859 00:12:42.859 real 0m0.314s 00:12:42.859 user 0m0.155s 00:12:42.859 sys 0m0.157s 00:12:42.859 06:28:56 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:42.859 06:28:56 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:12:42.859 ************************************ 00:12:42.859 END TEST bdev_json_nonenclosed 00:12:42.859 ************************************ 00:12:42.859 06:28:56 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:12:42.859 06:28:56 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:12:42.859 06:28:56 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:42.859 06:28:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:42.859 ************************************ 00:12:42.859 START TEST bdev_json_nonarray 00:12:42.859 ************************************ 00:12:42.859 06:28:56 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:12:42.859 [2024-07-25 06:28:56.407969] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:12:42.859 [2024-07-25 06:28:56.408027] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1081512 ] 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:43.119 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:43.119 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.119 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:43.120 [2024-07-25 06:28:56.529933] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:43.120 [2024-07-25 06:28:56.584872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:43.120 [2024-07-25 06:28:56.584942] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:12:43.120 [2024-07-25 06:28:56.584958] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:12:43.120 [2024-07-25 06:28:56.584969] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:12:43.120 00:12:43.120 real 0m0.309s 00:12:43.120 user 0m0.155s 00:12:43.120 sys 0m0.153s 00:12:43.120 06:28:56 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:43.120 06:28:56 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:12:43.120 ************************************ 00:12:43.120 END TEST bdev_json_nonarray 00:12:43.120 ************************************ 00:12:43.380 06:28:56 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:12:43.380 06:28:56 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:12:43.380 06:28:56 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:43.380 06:28:56 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:43.380 06:28:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:43.380 ************************************ 00:12:43.380 START TEST bdev_qos 00:12:43.380 ************************************ 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # qos_test_suite '' 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=1081744 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 1081744' 00:12:43.380 Process qos testing pid: 1081744 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; 
killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 1081744 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # '[' -z 1081744 ']' 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:43.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:43.380 06:28:56 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:43.380 [2024-07-25 06:28:56.802512] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:12:43.380 [2024-07-25 06:28:56.802567] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1081744 ] 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.0 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.1 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.2 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.3 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.4 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.5 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.6 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:01.7 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.0 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.1 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.2 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.3 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.4 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.5 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.6 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of 
QAT devices 00:12:43.380 EAL: Requested device 0000:3d:02.7 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.0 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.1 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.2 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.3 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.4 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.5 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.6 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:01.7 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.0 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.1 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.2 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.3 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.4 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.5 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.6 cannot be used 00:12:43.380 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:12:43.380 EAL: Requested device 0000:3f:02.7 cannot be used 00:12:43.380 [2024-07-25 06:28:56.924883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:43.639 [2024-07-25 06:28:56.969634] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@864 -- # return 0 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:44.208 Malloc_0 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_0 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- 
common/autotest_common.sh@901 -- # local i 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.208 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:44.208 [ 00:12:44.208 { 00:12:44.208 "name": "Malloc_0", 00:12:44.208 "aliases": [ 00:12:44.208 "092cb2ae-df11-4d76-8f60-d29ef4bc45b5" 00:12:44.208 ], 00:12:44.208 "product_name": "Malloc disk", 00:12:44.208 "block_size": 512, 00:12:44.208 "num_blocks": 262144, 00:12:44.208 "uuid": "092cb2ae-df11-4d76-8f60-d29ef4bc45b5", 00:12:44.208 "assigned_rate_limits": { 00:12:44.208 "rw_ios_per_sec": 0, 00:12:44.208 "rw_mbytes_per_sec": 0, 00:12:44.208 "r_mbytes_per_sec": 0, 00:12:44.208 "w_mbytes_per_sec": 0 00:12:44.208 }, 00:12:44.208 "claimed": false, 00:12:44.208 "zoned": false, 00:12:44.208 "supported_io_types": { 00:12:44.208 "read": true, 00:12:44.208 "write": true, 00:12:44.208 "unmap": true, 00:12:44.208 "flush": true, 00:12:44.208 "reset": true, 00:12:44.208 "nvme_admin": false, 00:12:44.208 "nvme_io": false, 00:12:44.208 "nvme_io_md": false, 00:12:44.208 "write_zeroes": true, 00:12:44.208 "zcopy": true, 00:12:44.208 "get_zone_info": false, 00:12:44.208 "zone_management": false, 00:12:44.208 "zone_append": false, 00:12:44.208 "compare": false, 00:12:44.208 "compare_and_write": false, 00:12:44.208 "abort": true, 00:12:44.208 "seek_hole": false, 00:12:44.208 "seek_data": false, 00:12:44.208 "copy": true, 00:12:44.208 "nvme_iov_md": false 00:12:44.208 }, 00:12:44.467 "memory_domains": [ 00:12:44.467 { 00:12:44.467 "dma_device_id": "system", 00:12:44.467 "dma_device_type": 1 00:12:44.467 }, 00:12:44.467 { 00:12:44.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.467 "dma_device_type": 2 00:12:44.467 } 00:12:44.467 ], 00:12:44.467 "driver_specific": {} 00:12:44.467 } 00:12:44.467 ] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:44.467 Null_1 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Null_1 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:44.467 06:28:57 
blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:44.467 [ 00:12:44.467 { 00:12:44.467 "name": "Null_1", 00:12:44.467 "aliases": [ 00:12:44.467 "7b1ff975-9346-4765-b27d-1e5a2499ee49" 00:12:44.467 ], 00:12:44.467 "product_name": "Null disk", 00:12:44.467 "block_size": 512, 00:12:44.467 "num_blocks": 262144, 00:12:44.467 "uuid": "7b1ff975-9346-4765-b27d-1e5a2499ee49", 00:12:44.467 "assigned_rate_limits": { 00:12:44.467 "rw_ios_per_sec": 0, 00:12:44.467 "rw_mbytes_per_sec": 0, 00:12:44.467 "r_mbytes_per_sec": 0, 00:12:44.467 "w_mbytes_per_sec": 0 00:12:44.467 }, 00:12:44.467 "claimed": false, 00:12:44.467 "zoned": false, 00:12:44.467 "supported_io_types": { 00:12:44.467 "read": true, 00:12:44.467 "write": true, 00:12:44.467 "unmap": false, 00:12:44.467 "flush": false, 00:12:44.467 "reset": true, 00:12:44.467 "nvme_admin": false, 00:12:44.467 "nvme_io": false, 00:12:44.467 "nvme_io_md": false, 00:12:44.467 "write_zeroes": true, 00:12:44.467 "zcopy": false, 00:12:44.467 "get_zone_info": false, 00:12:44.467 "zone_management": false, 00:12:44.467 "zone_append": false, 00:12:44.467 "compare": false, 00:12:44.467 "compare_and_write": false, 00:12:44.467 "abort": true, 00:12:44.467 "seek_hole": false, 00:12:44.467 "seek_data": false, 00:12:44.467 "copy": false, 00:12:44.467 "nvme_iov_md": false 00:12:44.467 }, 00:12:44.467 "driver_specific": {} 00:12:44.467 } 00:12:44.467 ] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:12:44.467 06:28:57 
blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:12:44.467 06:28:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:12:44.467 Running I/O for 60 seconds... 00:12:49.735 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 67912.95 271651.81 0.00 0.00 274432.00 0.00 0.00 ' 00:12:49.735 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:12:49.735 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=67912.95 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 67912 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=67912 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=16000 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 16000 -gt 1000 ']' 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 16000 Malloc_0 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 16000 IOPS Malloc_0 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:49.736 06:29:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:49.736 ************************************ 00:12:49.736 START TEST bdev_qos_iops 00:12:49.736 ************************************ 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # run_qos_test 16000 IOPS Malloc_0 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=16000 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:12:49.736 06:29:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 
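What the trace is doing at this point: it measures the unthrottled read rate of Malloc_0 with scripts/iostat.py (the grep/tail/awk pipeline above), picks an IOPS cap well below that rate (16000 rw_ios_per_sec against the ~67912 IOPS just measured), applies it with bdev_set_qos_limit, and the check that follows verifies the re-measured rate stays within 90-110% of the cap (the 14400/17600 bounds visible just below). A minimal stand-alone sketch of that flow; the rpc() wrapper and the hard-coded limit are illustrative stand-ins for the harness's rpc_cmd helper and its own limit derivation, while the RPC name and the iostat.py pipeline are taken from the trace:

  #!/usr/bin/env bash
  # Sketch only: rpc() stands in for the harness's rpc_cmd helper; paths come from the trace above.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  rpc() { "$SPDK/scripts/rpc.py" "$@"; }

  measure_iops() {
      # column 2 of the iostat.py row is the per-second I/O figure the harness reads with awk
      "$SPDK/scripts/iostat.py" -d -i 1 -t 5 | grep "$1" | tail -1 | awk '{print $2}'
  }

  bdev=Malloc_0
  unlimited=$(measure_iops "$bdev")          # ~67912 in the run above
  limit=16000                                # cap chosen well below the unlimited rate

  rpc bdev_set_qos_limit --rw_ios_per_sec "$limit" "$bdev"

  result=$(measure_iops "$bdev"); result=${result%.*}
  lower=$(( limit * 90 / 100 ))              # 14400 in the trace
  upper=$(( limit * 110 / 100 ))             # 17600 in the trace
  if [ "$result" -ge "$lower" ] && [ "$result" -le "$upper" ]; then
      echo "QoS held: $result IOPS within [$lower, $upper]"
  else
      echo "QoS violated: $result IOPS outside [$lower, $upper]"; exit 1
  fi
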
00:12:55.005 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 15996.31 63985.22 0.00 0.00 65152.00 0.00 0.00 ' 00:12:55.005 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:12:55.005 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:12:55.005 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=15996.31 00:12:55.005 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 15996 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=15996 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=14400 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=17600 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 15996 -lt 14400 ']' 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 15996 -gt 17600 ']' 00:12:55.006 00:12:55.006 real 0m5.247s 00:12:55.006 user 0m0.109s 00:12:55.006 sys 0m0.048s 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:55.006 06:29:08 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:12:55.006 ************************************ 00:12:55.006 END TEST bdev_qos_iops 00:12:55.006 ************************************ 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:12:55.006 06:29:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 22584.75 90339.00 0.00 0.00 92160.00 0.00 0.00 ' 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=92160.00 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 92160 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=92160 00:13:00.309 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=9 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 9 -lt 2 ']' 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 9 Null_1 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 9 BANDWIDTH Null_1 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:00.310 06:29:13 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:00.310 ************************************ 00:13:00.310 START TEST bdev_qos_bw 00:13:00.310 ************************************ 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # run_qos_test 9 BANDWIDTH Null_1 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=9 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:13:00.310 06:29:13 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 2303.90 9215.58 0.00 0.00 9376.00 0.00 0.00 ' 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=9376.00 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 9376 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=9376 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=9216 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=8294 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=10137 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 9376 -lt 8294 ']' 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 9376 -gt 10137 ']' 00:13:05.571 00:13:05.571 real 0m5.252s 00:13:05.571 user 0m0.110s 00:13:05.571 sys 0m0.046s 00:13:05.571 
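The bandwidth leg traced here follows the same pattern with units changed: the harness reads the kilobyte column of iostat.py with awk '{print $6}', caps Null_1 at 9 MB/s via --rw_mbytes_per_sec, and compares the re-measured figure against 9*1024 = 9216 kB with the same +/-10% window (the 8294/10137 bounds above). A short sketch under the same assumptions as the IOPS sketch earlier:

  bdev=Null_1
  mb_limit=9
  rpc bdev_set_qos_limit --rw_mbytes_per_sec "$mb_limit" "$bdev"

  # same iostat.py pipeline, but keep the kilobyte column the harness compares against
  kb_rate=$("$SPDK/scripts/iostat.py" -d -i 1 -t 5 | grep "$bdev" | tail -1 | awk '{print $6}')
  kb_rate=${kb_rate%.*}

  kb_limit=$(( mb_limit * 1024 ))            # 9216 in the trace
  lower=$(( kb_limit * 90 / 100 ))           # 8294
  upper=$(( kb_limit * 110 / 100 ))          # 10137
  [ "$kb_rate" -ge "$lower" ] && [ "$kb_rate" -le "$upper" ] \
      && echo "bandwidth QoS held: ${kb_rate} kB" \
      || { echo "bandwidth QoS violated: ${kb_rate} kB"; exit 1; }
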
06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:13:05.571 ************************************ 00:13:05.571 END TEST bdev_qos_bw 00:13:05.571 ************************************ 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:05.571 06:29:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:05.571 ************************************ 00:13:05.571 START TEST bdev_qos_ro_bw 00:13:05.571 ************************************ 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Malloc_0 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:13:05.571 06:29:18 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 511.90 2047.58 0.00 0.00 2060.00 0.00 0.00 ' 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2060.00 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2060 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2060 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:13:10.836 06:29:24 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # upper_limit=2252 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -lt 1843 ']' 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -gt 2252 ']' 00:13:10.836 00:13:10.836 real 0m5.177s 00:13:10.836 user 0m0.111s 00:13:10.836 sys 0m0.040s 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:10.836 06:29:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:13:10.836 ************************************ 00:13:10.836 END TEST bdev_qos_ro_bw 00:13:10.836 ************************************ 00:13:10.836 06:29:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:13:10.836 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:10.836 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:11.402 00:13:11.402 Latency(us) 00:13:11.402 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:11.402 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:13:11.402 Malloc_0 : 26.73 22457.15 87.72 0.00 0.00 11289.86 1861.22 503316.48 00:13:11.402 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:13:11.402 Null_1 : 26.87 22605.77 88.30 0.00 0.00 11293.94 707.79 136734.31 00:13:11.402 =================================================================================================================== 00:13:11.402 Total : 45062.92 176.03 0.00 0.00 11291.91 707.79 503316.48 00:13:11.402 0 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 1081744 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # '[' -z 1081744 ']' 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # kill -0 1081744 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # uname 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1081744 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1081744' 00:13:11.402 killing process with pid 1081744 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- 
common/autotest_common.sh@969 -- # kill 1081744 00:13:11.402 Received shutdown signal, test time was about 26.939238 seconds 00:13:11.402 00:13:11.402 Latency(us) 00:13:11.402 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:11.402 =================================================================================================================== 00:13:11.402 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:13:11.402 06:29:24 blockdev_general.bdev_qos -- common/autotest_common.sh@974 -- # wait 1081744 00:13:11.660 06:29:25 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 00:13:11.661 00:13:11.661 real 0m28.352s 00:13:11.661 user 0m29.027s 00:13:11.661 sys 0m0.823s 00:13:11.661 06:29:25 blockdev_general.bdev_qos -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:11.661 06:29:25 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:13:11.661 ************************************ 00:13:11.661 END TEST bdev_qos 00:13:11.661 ************************************ 00:13:11.661 06:29:25 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:13:11.661 06:29:25 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:11.661 06:29:25 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:11.661 06:29:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:11.661 ************************************ 00:13:11.661 START TEST bdev_qd_sampling 00:13:11.661 ************************************ 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # qd_sampling_test_suite '' 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=1086602 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 1086602' 00:13:11.661 Process bdev QD sampling period testing pid: 1086602 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 1086602 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@831 -- # '[' -z 1086602 ']' 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:11.661 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:11.661 06:29:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:11.919 [2024-07-25 06:29:25.230927] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
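The qd_sampling suite starting here reuses the harness's standard pattern: launch bdevperf in RPC-wait mode (-z) with the arguments shown above, wait for the RPC socket to answer, configure bdevs over rpc.py, and only then trigger the workload with bdevperf.py perform_tests. A rough equivalent, with the polling loop as an illustrative stand-in for the waitforlisten/waitforbdev helpers:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK/build/examples/bdevperf" -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' &
  BDEVPERF_PID=$!

  # poll until the app is up and has finished examining bdevs
  until "$SPDK/scripts/rpc.py" bdev_wait_for_examine >/dev/null 2>&1; do sleep 0.5; done

  "$SPDK/scripts/rpc.py" bdev_malloc_create -b Malloc_QD 128 512

  "$SPDK/examples/bdev/bdevperf/bdevperf.py" perform_tests   # triggers the configured randread pass

  kill "$BDEVPERF_PID"
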
00:13:11.919 [2024-07-25 06:29:25.230982] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1086602 ] 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:11.919 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.919 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:11.919 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:11.920 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:11.920 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:11.920 [2024-07-25 06:29:25.365689] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:13:11.920 [2024-07-25 06:29:25.411365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:11.920 [2024-07-25 06:29:25.411371] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@864 -- # return 0 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:12.853 Malloc_QD 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_QD 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # local i 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:12.853 [ 00:13:12.853 { 00:13:12.853 "name": "Malloc_QD", 00:13:12.853 "aliases": [ 00:13:12.853 "ce152845-5625-47e0-a3cb-99c001aabceb" 00:13:12.853 ], 00:13:12.853 "product_name": "Malloc disk", 00:13:12.853 "block_size": 512, 00:13:12.853 "num_blocks": 262144, 00:13:12.853 "uuid": "ce152845-5625-47e0-a3cb-99c001aabceb", 00:13:12.853 "assigned_rate_limits": { 00:13:12.853 "rw_ios_per_sec": 0, 00:13:12.853 "rw_mbytes_per_sec": 0, 00:13:12.853 "r_mbytes_per_sec": 0, 00:13:12.853 "w_mbytes_per_sec": 0 00:13:12.853 }, 00:13:12.853 "claimed": false, 00:13:12.853 "zoned": false, 00:13:12.853 "supported_io_types": { 00:13:12.853 "read": true, 00:13:12.853 "write": true, 00:13:12.853 "unmap": true, 00:13:12.853 "flush": true, 00:13:12.853 "reset": true, 00:13:12.853 "nvme_admin": false, 00:13:12.853 "nvme_io": false, 00:13:12.853 "nvme_io_md": false, 00:13:12.853 "write_zeroes": true, 00:13:12.853 "zcopy": true, 00:13:12.853 "get_zone_info": false, 00:13:12.853 "zone_management": false, 00:13:12.853 "zone_append": false, 00:13:12.853 "compare": false, 00:13:12.853 "compare_and_write": false, 00:13:12.853 "abort": true, 00:13:12.853 "seek_hole": false, 00:13:12.853 "seek_data": false, 00:13:12.853 "copy": true, 00:13:12.853 "nvme_iov_md": false 00:13:12.853 }, 00:13:12.853 "memory_domains": [ 00:13:12.853 { 00:13:12.853 "dma_device_id": "system", 00:13:12.853 "dma_device_type": 1 00:13:12.853 }, 00:13:12.853 { 00:13:12.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.853 "dma_device_type": 2 00:13:12.853 } 00:13:12.853 ], 00:13:12.853 "driver_specific": {} 00:13:12.853 } 00:13:12.853 ] 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@907 -- # return 0 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:13:12.853 06:29:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:13:12.853 Running I/O for 5 seconds... 
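The sampling check that follows is a two-step affair: enable a 10 ms queue-depth polling period on Malloc_QD, let the running workload accumulate samples, then read the sampled fields back out of bdev_get_iostat, which is exactly what the JSON and the jq call below show. A minimal sketch, reusing the rpc() wrapper assumed earlier and the field names visible in the output:

  rpc bdev_set_qd_sampling_period Malloc_QD 10          # sample queue depth every 10 ms

  sleep 2                                               # let the running randread pass accumulate samples
  iostats=$(rpc bdev_get_iostat -b Malloc_QD)

  # the polling period must read back as configured; queue_depth, io_time and
  # weighted_io_time are the sampled values carried in the JSON below
  echo "$iostats" | jq -r '.bdevs[0].queue_depth_polling_period'
  echo "$iostats" | jq -r '.bdevs[0].queue_depth, .bdevs[0].io_time, .bdevs[0].weighted_io_time'
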
00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:13:14.751 "tick_rate": 2500000000, 00:13:14.751 "ticks": 14203896291432512, 00:13:14.751 "bdevs": [ 00:13:14.751 { 00:13:14.751 "name": "Malloc_QD", 00:13:14.751 "bytes_read": 800109056, 00:13:14.751 "num_read_ops": 195332, 00:13:14.751 "bytes_written": 0, 00:13:14.751 "num_write_ops": 0, 00:13:14.751 "bytes_unmapped": 0, 00:13:14.751 "num_unmap_ops": 0, 00:13:14.751 "bytes_copied": 0, 00:13:14.751 "num_copy_ops": 0, 00:13:14.751 "read_latency_ticks": 2451050118376, 00:13:14.751 "max_read_latency_ticks": 14856164, 00:13:14.751 "min_read_latency_ticks": 239318, 00:13:14.751 "write_latency_ticks": 0, 00:13:14.751 "max_write_latency_ticks": 0, 00:13:14.751 "min_write_latency_ticks": 0, 00:13:14.751 "unmap_latency_ticks": 0, 00:13:14.751 "max_unmap_latency_ticks": 0, 00:13:14.751 "min_unmap_latency_ticks": 0, 00:13:14.751 "copy_latency_ticks": 0, 00:13:14.751 "max_copy_latency_ticks": 0, 00:13:14.751 "min_copy_latency_ticks": 0, 00:13:14.751 "io_error": {}, 00:13:14.751 "queue_depth_polling_period": 10, 00:13:14.751 "queue_depth": 512, 00:13:14.751 "io_time": 30, 00:13:14.751 "weighted_io_time": 15360 00:13:14.751 } 00:13:14.751 ] 00:13:14.751 }' 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:14.751 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:14.751 00:13:14.751 Latency(us) 00:13:14.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:14.751 Job: Malloc_QD (Core Mask 0x1, workload: randread, 
depth: 256, IO size: 4096) 00:13:14.751 Malloc_QD : 1.99 50753.29 198.26 0.00 0.00 5031.80 1336.93 5347.74 00:13:14.751 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:13:14.751 Malloc_QD : 1.99 51109.26 199.65 0.00 0.00 4997.24 871.63 5950.67 00:13:14.751 =================================================================================================================== 00:13:14.751 Total : 101862.56 397.90 0.00 0.00 5014.45 871.63 5950.67 00:13:15.008 0 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 1086602 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # '[' -z 1086602 ']' 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # kill -0 1086602 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # uname 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1086602 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1086602' 00:13:15.008 killing process with pid 1086602 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@969 -- # kill 1086602 00:13:15.008 Received shutdown signal, test time was about 2.078966 seconds 00:13:15.008 00:13:15.008 Latency(us) 00:13:15.008 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:15.008 =================================================================================================================== 00:13:15.008 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:13:15.008 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@974 -- # wait 1086602 00:13:15.267 06:29:28 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:13:15.267 00:13:15.267 real 0m3.394s 00:13:15.267 user 0m6.748s 00:13:15.267 sys 0m0.398s 00:13:15.267 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:15.267 06:29:28 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:13:15.267 ************************************ 00:13:15.267 END TEST bdev_qd_sampling 00:13:15.267 ************************************ 00:13:15.267 06:29:28 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:13:15.267 06:29:28 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:15.267 06:29:28 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:15.267 06:29:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:15.267 ************************************ 00:13:15.267 START TEST bdev_error 00:13:15.267 ************************************ 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # error_test_suite '' 00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 
00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=1087169 00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 1087169' 00:13:15.267 Process error testing pid: 1087169 00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:13:15.267 06:29:28 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 1087169 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 1087169 ']' 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:15.267 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:15.267 06:29:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:15.267 [2024-07-25 06:29:28.704658] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:13:15.267 [2024-07-25 06:29:28.704712] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1087169 ] 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:15.267 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:15.267 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.267 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:15.268 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:15.268 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:15.526 [2024-07-25 06:29:28.828267] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:15.526 [2024-07-25 06:29:28.872986] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:13:16.094 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:13:16.094 06:29:29 blockdev_general.bdev_error -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.094 Dev_1 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.094 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.094 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.353 [ 00:13:16.353 { 00:13:16.353 "name": "Dev_1", 00:13:16.353 "aliases": [ 00:13:16.353 "cebf9c91-f980-4304-aba8-79911b5ae9b2" 00:13:16.353 ], 00:13:16.353 "product_name": "Malloc disk", 00:13:16.353 "block_size": 512, 00:13:16.353 "num_blocks": 262144, 00:13:16.353 "uuid": "cebf9c91-f980-4304-aba8-79911b5ae9b2", 00:13:16.353 "assigned_rate_limits": { 00:13:16.353 "rw_ios_per_sec": 0, 00:13:16.353 "rw_mbytes_per_sec": 0, 00:13:16.353 "r_mbytes_per_sec": 0, 00:13:16.353 "w_mbytes_per_sec": 0 00:13:16.353 }, 00:13:16.353 "claimed": false, 00:13:16.353 "zoned": false, 00:13:16.353 "supported_io_types": { 00:13:16.353 "read": true, 00:13:16.353 "write": true, 00:13:16.353 "unmap": true, 00:13:16.353 "flush": true, 00:13:16.353 "reset": true, 00:13:16.353 "nvme_admin": false, 00:13:16.353 "nvme_io": false, 00:13:16.353 "nvme_io_md": false, 00:13:16.353 "write_zeroes": true, 00:13:16.353 "zcopy": true, 00:13:16.353 "get_zone_info": false, 00:13:16.353 "zone_management": false, 00:13:16.353 "zone_append": false, 00:13:16.353 "compare": false, 00:13:16.353 "compare_and_write": false, 00:13:16.353 "abort": true, 00:13:16.353 "seek_hole": false, 00:13:16.353 "seek_data": false, 00:13:16.353 "copy": true, 00:13:16.353 "nvme_iov_md": false 00:13:16.353 }, 00:13:16.353 "memory_domains": [ 00:13:16.353 { 00:13:16.353 "dma_device_id": "system", 00:13:16.353 "dma_device_type": 1 00:13:16.353 }, 00:13:16.353 { 00:13:16.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.353 "dma_device_type": 2 00:13:16.353 } 00:13:16.353 ], 00:13:16.353 "driver_specific": {} 00:13:16.353 } 00:13:16.353 ] 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:13:16.353 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 
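The error suite builds its stack in the order just traced: a Malloc backing bdev Dev_1, an error bdev layered on it by bdev_error_create (which appears further down as EE_Dev_1), and a second plain Malloc bdev Dev_2; failures are then injected into EE_Dev_1 and a workload pass is driven before teardown. Condensed into one sequence, with rpc() as assumed in the earlier sketches:

  rpc bdev_malloc_create -b Dev_1 128 512      # 128 MB backing bdev, 512-byte blocks
  rpc bdev_error_create Dev_1                  # exposes the error-injection bdev EE_Dev_1 on top of Dev_1
  rpc bdev_malloc_create -b Dev_2 128 512      # control bdev that never fails

  # fail the next 5 I/Os of any type on EE_Dev_1, then trigger the configured workload
  rpc bdev_error_inject_error EE_Dev_1 all failure -n 5
  "$SPDK/examples/bdev/bdevperf/bdevperf.py" -t 1 perform_tests   # flags as passed by the harness above

  # teardown mirrors the trace: drop the error bdev first, then its backing Malloc bdev
  rpc bdev_error_delete EE_Dev_1
  rpc bdev_malloc_delete Dev_1
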
00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.353 true 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.353 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.353 Dev_2 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.353 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.353 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.353 [ 00:13:16.353 { 00:13:16.353 "name": "Dev_2", 00:13:16.353 "aliases": [ 00:13:16.353 "a98624b8-d70d-4f52-8da2-fe687f4ba30b" 00:13:16.353 ], 00:13:16.353 "product_name": "Malloc disk", 00:13:16.353 "block_size": 512, 00:13:16.353 "num_blocks": 262144, 00:13:16.353 "uuid": "a98624b8-d70d-4f52-8da2-fe687f4ba30b", 00:13:16.353 "assigned_rate_limits": { 00:13:16.354 "rw_ios_per_sec": 0, 00:13:16.354 "rw_mbytes_per_sec": 0, 00:13:16.354 "r_mbytes_per_sec": 0, 00:13:16.354 "w_mbytes_per_sec": 0 00:13:16.354 }, 00:13:16.354 "claimed": false, 00:13:16.354 "zoned": false, 00:13:16.354 "supported_io_types": { 00:13:16.354 "read": true, 00:13:16.354 "write": true, 00:13:16.354 "unmap": true, 00:13:16.354 "flush": true, 00:13:16.354 "reset": true, 00:13:16.354 "nvme_admin": false, 00:13:16.354 "nvme_io": false, 00:13:16.354 "nvme_io_md": false, 00:13:16.354 "write_zeroes": true, 00:13:16.354 "zcopy": true, 00:13:16.354 "get_zone_info": false, 00:13:16.354 "zone_management": false, 00:13:16.354 "zone_append": false, 00:13:16.354 "compare": false, 00:13:16.354 "compare_and_write": false, 00:13:16.354 "abort": true, 00:13:16.354 "seek_hole": false, 00:13:16.354 "seek_data": false, 00:13:16.354 "copy": true, 00:13:16.354 "nvme_iov_md": false 00:13:16.354 }, 00:13:16.354 "memory_domains": [ 00:13:16.354 { 00:13:16.354 "dma_device_id": "system", 00:13:16.354 "dma_device_type": 1 00:13:16.354 }, 00:13:16.354 { 00:13:16.354 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.354 "dma_device_type": 2 00:13:16.354 } 00:13:16.354 ], 00:13:16.354 "driver_specific": {} 00:13:16.354 } 00:13:16.354 ] 00:13:16.354 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.354 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:13:16.354 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:13:16.354 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.354 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:16.354 06:29:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.354 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:13:16.354 06:29:29 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:13:16.354 Running I/O for 5 seconds... 00:13:17.290 06:29:30 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 1087169 00:13:17.290 06:29:30 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 1087169' 00:13:17.290 Process is existed as continue on error is set. Pid: 1087169 00:13:17.290 06:29:30 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:13:17.290 06:29:30 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:17.290 06:29:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:17.290 06:29:30 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:17.290 06:29:30 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:13:17.290 06:29:30 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:17.290 06:29:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:17.290 06:29:30 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:17.290 06:29:30 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:13:17.549 Timeout while waiting for response: 00:13:17.549 00:13:17.549 00:13:21.775 00:13:21.775 Latency(us) 00:13:21.775 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:21.775 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:13:21.775 EE_Dev_1 : 0.91 40318.64 157.49 5.52 0.00 393.51 122.06 645.53 00:13:21.775 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:13:21.775 Dev_2 : 5.00 87183.23 340.56 0.00 0.00 180.26 61.03 19398.66 00:13:21.775 =================================================================================================================== 00:13:21.775 Total : 127501.87 498.05 5.52 0.00 196.74 61.03 19398.66 00:13:22.341 06:29:35 blockdev_general.bdev_error -- bdev/blockdev.sh@498 -- # killprocess 1087169 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # '[' -z 1087169 ']' 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # kill -0 1087169 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # uname 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # 
'[' Linux = Linux ']' 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1087169 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1087169' 00:13:22.341 killing process with pid 1087169 00:13:22.341 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@969 -- # kill 1087169 00:13:22.341 Received shutdown signal, test time was about 5.000000 seconds 00:13:22.341 00:13:22.341 Latency(us) 00:13:22.342 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:22.342 =================================================================================================================== 00:13:22.342 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:13:22.342 06:29:35 blockdev_general.bdev_error -- common/autotest_common.sh@974 -- # wait 1087169 00:13:22.600 06:29:36 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=1088486 00:13:22.600 06:29:36 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 1088486' 00:13:22.600 Process error testing pid: 1088486 00:13:22.600 06:29:36 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:13:22.600 06:29:36 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 1088486 00:13:22.600 06:29:36 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 1088486 ']' 00:13:22.600 06:29:36 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:22.600 06:29:36 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:22.600 06:29:36 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:22.600 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:22.600 06:29:36 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:22.600 06:29:36 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:22.600 [2024-07-25 06:29:36.115947] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
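The second bdevperf instance started above is the negative-path half of the bdev_error test: it is launched with -z (wait for RPC) so the test can build the error stack before any traffic is generated. The same stack can be reproduced by hand with roughly the following RPC sequence against the default /var/tmp/spdk.sock socket (a minimal sketch based on the rpc_cmd calls visible in this log; paths are relative to the SPDK tree):

    # Dev_1 is the base malloc bdev; bdev_error_create layers EE_Dev_1 on top of it
    scripts/rpc.py bdev_malloc_create -b Dev_1 128 512
    scripts/rpc.py bdev_error_create Dev_1
    # Dev_2 is a second, error-free malloc bdev used for comparison
    scripts/rpc.py bdev_malloc_create -b Dev_2 128 512
    # fail the next 5 I/Os of any type on the error bdev
    scripts/rpc.py bdev_error_inject_error EE_Dev_1 all failure -n 5
    # kick off the configured bdevperf job
    examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests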
00:13:22.600 [2024-07-25 06:29:36.116010] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1088486 ] 00:13:22.858 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.858 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:22.858 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.858 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:22.858 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.858 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:22.858 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.858 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:22.858 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.858 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:22.859 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:22.859 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:22.859 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:22.859 [2024-07-25 06:29:36.238979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:22.859 [2024-07-25 06:29:36.284074] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:13:23.794 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.794 Dev_1 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.794 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.794 [ 00:13:23.794 { 00:13:23.794 "name": "Dev_1", 00:13:23.794 "aliases": [ 
00:13:23.794 "be5f8bac-d030-450a-8fe8-8e950eee242e" 00:13:23.794 ], 00:13:23.794 "product_name": "Malloc disk", 00:13:23.794 "block_size": 512, 00:13:23.794 "num_blocks": 262144, 00:13:23.794 "uuid": "be5f8bac-d030-450a-8fe8-8e950eee242e", 00:13:23.794 "assigned_rate_limits": { 00:13:23.794 "rw_ios_per_sec": 0, 00:13:23.794 "rw_mbytes_per_sec": 0, 00:13:23.794 "r_mbytes_per_sec": 0, 00:13:23.794 "w_mbytes_per_sec": 0 00:13:23.794 }, 00:13:23.794 "claimed": false, 00:13:23.794 "zoned": false, 00:13:23.794 "supported_io_types": { 00:13:23.794 "read": true, 00:13:23.794 "write": true, 00:13:23.794 "unmap": true, 00:13:23.794 "flush": true, 00:13:23.794 "reset": true, 00:13:23.794 "nvme_admin": false, 00:13:23.794 "nvme_io": false, 00:13:23.794 "nvme_io_md": false, 00:13:23.794 "write_zeroes": true, 00:13:23.794 "zcopy": true, 00:13:23.794 "get_zone_info": false, 00:13:23.794 "zone_management": false, 00:13:23.794 "zone_append": false, 00:13:23.794 "compare": false, 00:13:23.794 "compare_and_write": false, 00:13:23.794 "abort": true, 00:13:23.794 "seek_hole": false, 00:13:23.794 "seek_data": false, 00:13:23.794 "copy": true, 00:13:23.794 "nvme_iov_md": false 00:13:23.794 }, 00:13:23.794 "memory_domains": [ 00:13:23.794 { 00:13:23.794 "dma_device_id": "system", 00:13:23.794 "dma_device_type": 1 00:13:23.794 }, 00:13:23.794 { 00:13:23.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.794 "dma_device_type": 2 00:13:23.794 } 00:13:23.794 ], 00:13:23.794 "driver_specific": {} 00:13:23.794 } 00:13:23.794 ] 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:13:23.794 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.794 true 00:13:23.794 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.794 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.795 Dev_2 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.795 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.795 06:29:37 
blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.795 [ 00:13:23.795 { 00:13:23.795 "name": "Dev_2", 00:13:23.795 "aliases": [ 00:13:23.795 "159d06a7-b5f6-42fa-aa96-5b60452d12d0" 00:13:23.795 ], 00:13:23.795 "product_name": "Malloc disk", 00:13:23.795 "block_size": 512, 00:13:23.795 "num_blocks": 262144, 00:13:23.795 "uuid": "159d06a7-b5f6-42fa-aa96-5b60452d12d0", 00:13:23.795 "assigned_rate_limits": { 00:13:23.795 "rw_ios_per_sec": 0, 00:13:23.795 "rw_mbytes_per_sec": 0, 00:13:23.795 "r_mbytes_per_sec": 0, 00:13:23.795 "w_mbytes_per_sec": 0 00:13:23.795 }, 00:13:23.795 "claimed": false, 00:13:23.795 "zoned": false, 00:13:23.795 "supported_io_types": { 00:13:23.795 "read": true, 00:13:23.795 "write": true, 00:13:23.795 "unmap": true, 00:13:23.795 "flush": true, 00:13:23.795 "reset": true, 00:13:23.795 "nvme_admin": false, 00:13:23.795 "nvme_io": false, 00:13:23.795 "nvme_io_md": false, 00:13:23.795 "write_zeroes": true, 00:13:23.795 "zcopy": true, 00:13:23.795 "get_zone_info": false, 00:13:23.795 "zone_management": false, 00:13:23.795 "zone_append": false, 00:13:23.795 "compare": false, 00:13:23.795 "compare_and_write": false, 00:13:23.795 "abort": true, 00:13:23.795 "seek_hole": false, 00:13:23.795 "seek_data": false, 00:13:23.795 "copy": true, 00:13:23.795 "nvme_iov_md": false 00:13:23.795 }, 00:13:23.795 "memory_domains": [ 00:13:23.795 { 00:13:23.795 "dma_device_id": "system", 00:13:23.795 "dma_device_type": 1 00:13:23.795 }, 00:13:23.795 { 00:13:23.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.795 "dma_device_type": 2 00:13:23.795 } 00:13:23.795 ], 00:13:23.795 "driver_specific": {} 00:13:23.795 } 00:13:23.795 ] 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:13:23.795 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.795 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 1088486 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # local es=0 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # valid_exec_arg wait 1088486 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@638 -- # local arg=wait 00:13:23.795 06:29:37 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # type -t wait 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case 
"$(type -t "$arg")" in 00:13:23.795 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # wait 1088486 00:13:23.795 Running I/O for 5 seconds... 00:13:23.795 task offset: 19752 on job bdev=EE_Dev_1 fails 00:13:23.795 00:13:23.795 Latency(us) 00:13:23.795 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:23.795 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:13:23.795 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:13:23.795 EE_Dev_1 : 0.00 31294.45 122.24 7112.38 0.00 349.72 120.42 622.59 00:13:23.795 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:13:23.795 Dev_2 : 0.00 19595.84 76.55 0.00 0.00 608.67 130.25 1127.22 00:13:23.795 =================================================================================================================== 00:13:23.795 Total : 50890.29 198.79 7112.38 0.00 490.17 120.42 1127.22 00:13:23.795 [2024-07-25 06:29:37.293017] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:13:23.795 request: 00:13:23.795 { 00:13:23.795 "method": "perform_tests", 00:13:23.795 "req_id": 1 00:13:23.795 } 00:13:23.795 Got JSON-RPC error response 00:13:23.795 response: 00:13:23.795 { 00:13:23.795 "code": -32603, 00:13:23.795 "message": "bdevperf failed with error Operation not permitted" 00:13:23.795 } 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # es=255 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # es=127 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@663 -- # case "$es" in 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@670 -- # es=1 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:13:24.055 00:13:24.055 real 0m8.885s 00:13:24.055 user 0m9.309s 00:13:24.055 sys 0m0.794s 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:24.055 06:29:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:13:24.055 ************************************ 00:13:24.055 END TEST bdev_error 00:13:24.055 ************************************ 00:13:24.055 06:29:37 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:13:24.055 06:29:37 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:24.055 06:29:37 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:24.055 06:29:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:24.314 ************************************ 00:13:24.314 START TEST bdev_stat 00:13:24.314 ************************************ 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # stat_test_suite '' 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=1088777 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 1088777' 00:13:24.314 Process Bdev IO statistics testing pid: 1088777 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 
4096 -w randread -t 10 -C '' 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 1088777 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # '[' -z 1088777 ']' 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:24.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:24.314 06:29:37 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:24.314 [2024-07-25 06:29:37.681022] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:13:24.314 [2024-07-25 06:29:37.681083] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1088777 ] 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:13:24.314 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:24.314 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:24.314 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:24.314 [2024-07-25 06:29:37.818035] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:13:24.314 [2024-07-25 06:29:37.862152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:24.314 [2024-07-25 06:29:37.862162] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@864 -- # return 0 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:25.250 Malloc_STAT 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:13:25.250 06:29:38 
blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_STAT 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # local i 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:25.250 [ 00:13:25.250 { 00:13:25.250 "name": "Malloc_STAT", 00:13:25.250 "aliases": [ 00:13:25.250 "447875b6-d7e4-4648-8a5e-22aa2bb6489a" 00:13:25.250 ], 00:13:25.250 "product_name": "Malloc disk", 00:13:25.250 "block_size": 512, 00:13:25.250 "num_blocks": 262144, 00:13:25.250 "uuid": "447875b6-d7e4-4648-8a5e-22aa2bb6489a", 00:13:25.250 "assigned_rate_limits": { 00:13:25.250 "rw_ios_per_sec": 0, 00:13:25.250 "rw_mbytes_per_sec": 0, 00:13:25.250 "r_mbytes_per_sec": 0, 00:13:25.250 "w_mbytes_per_sec": 0 00:13:25.250 }, 00:13:25.250 "claimed": false, 00:13:25.250 "zoned": false, 00:13:25.250 "supported_io_types": { 00:13:25.250 "read": true, 00:13:25.250 "write": true, 00:13:25.250 "unmap": true, 00:13:25.250 "flush": true, 00:13:25.250 "reset": true, 00:13:25.250 "nvme_admin": false, 00:13:25.250 "nvme_io": false, 00:13:25.250 "nvme_io_md": false, 00:13:25.250 "write_zeroes": true, 00:13:25.250 "zcopy": true, 00:13:25.250 "get_zone_info": false, 00:13:25.250 "zone_management": false, 00:13:25.250 "zone_append": false, 00:13:25.250 "compare": false, 00:13:25.250 "compare_and_write": false, 00:13:25.250 "abort": true, 00:13:25.250 "seek_hole": false, 00:13:25.250 "seek_data": false, 00:13:25.250 "copy": true, 00:13:25.250 "nvme_iov_md": false 00:13:25.250 }, 00:13:25.250 "memory_domains": [ 00:13:25.250 { 00:13:25.250 "dma_device_id": "system", 00:13:25.250 "dma_device_type": 1 00:13:25.250 }, 00:13:25.250 { 00:13:25.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.250 "dma_device_type": 2 00:13:25.250 } 00:13:25.250 ], 00:13:25.250 "driver_specific": {} 00:13:25.250 } 00:13:25.250 ] 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- common/autotest_common.sh@907 -- # return 0 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:13:25.250 06:29:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:13:25.250 Running I/O for 10 seconds... 
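While the 10-second run above is in flight, the stat test takes a whole-device iostat snapshot, then a per-channel snapshot, then a second whole-device snapshot, and asserts that the summed per-channel read count falls between the two totals. A rough equivalent of that check, using the same RPC and the jq paths that appear in the output below (sketch only; the shell variable names are illustrative):

    io1=$(scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    per_ch=$(scripts/rpc.py bdev_get_iostat -b Malloc_STAT -c | jq -r '.channels[0].num_read_ops + .channels[1].num_read_ops')
    io2=$(scripts/rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # the per-channel sum must be no smaller than the first snapshot
    # and no larger than the second one
    [ "$per_ch" -ge "$io1" ] && [ "$per_ch" -le "$io2" ]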
00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:13:27.154 "tick_rate": 2500000000, 00:13:27.154 "ticks": 14203927166354610, 00:13:27.154 "bdevs": [ 00:13:27.154 { 00:13:27.154 "name": "Malloc_STAT", 00:13:27.154 "bytes_read": 798011904, 00:13:27.154 "num_read_ops": 194820, 00:13:27.154 "bytes_written": 0, 00:13:27.154 "num_write_ops": 0, 00:13:27.154 "bytes_unmapped": 0, 00:13:27.154 "num_unmap_ops": 0, 00:13:27.154 "bytes_copied": 0, 00:13:27.154 "num_copy_ops": 0, 00:13:27.154 "read_latency_ticks": 2433846399990, 00:13:27.154 "max_read_latency_ticks": 15505562, 00:13:27.154 "min_read_latency_ticks": 292198, 00:13:27.154 "write_latency_ticks": 0, 00:13:27.154 "max_write_latency_ticks": 0, 00:13:27.154 "min_write_latency_ticks": 0, 00:13:27.154 "unmap_latency_ticks": 0, 00:13:27.154 "max_unmap_latency_ticks": 0, 00:13:27.154 "min_unmap_latency_ticks": 0, 00:13:27.154 "copy_latency_ticks": 0, 00:13:27.154 "max_copy_latency_ticks": 0, 00:13:27.154 "min_copy_latency_ticks": 0, 00:13:27.154 "io_error": {} 00:13:27.154 } 00:13:27.154 ] 00:13:27.154 }' 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=194820 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:13:27.154 "tick_rate": 2500000000, 00:13:27.154 "ticks": 14203927318465200, 00:13:27.154 "name": "Malloc_STAT", 00:13:27.154 "channels": [ 00:13:27.154 { 00:13:27.154 "thread_id": 2, 00:13:27.154 "bytes_read": 409993216, 00:13:27.154 "num_read_ops": 100096, 00:13:27.154 "bytes_written": 0, 00:13:27.154 "num_write_ops": 0, 00:13:27.154 "bytes_unmapped": 0, 00:13:27.154 "num_unmap_ops": 0, 
00:13:27.154 "bytes_copied": 0, 00:13:27.154 "num_copy_ops": 0, 00:13:27.154 "read_latency_ticks": 1254711299784, 00:13:27.154 "max_read_latency_ticks": 13334956, 00:13:27.154 "min_read_latency_ticks": 8184532, 00:13:27.154 "write_latency_ticks": 0, 00:13:27.154 "max_write_latency_ticks": 0, 00:13:27.154 "min_write_latency_ticks": 0, 00:13:27.154 "unmap_latency_ticks": 0, 00:13:27.154 "max_unmap_latency_ticks": 0, 00:13:27.154 "min_unmap_latency_ticks": 0, 00:13:27.154 "copy_latency_ticks": 0, 00:13:27.154 "max_copy_latency_ticks": 0, 00:13:27.154 "min_copy_latency_ticks": 0 00:13:27.154 }, 00:13:27.154 { 00:13:27.154 "thread_id": 3, 00:13:27.154 "bytes_read": 413138944, 00:13:27.154 "num_read_ops": 100864, 00:13:27.154 "bytes_written": 0, 00:13:27.154 "num_write_ops": 0, 00:13:27.154 "bytes_unmapped": 0, 00:13:27.154 "num_unmap_ops": 0, 00:13:27.154 "bytes_copied": 0, 00:13:27.154 "num_copy_ops": 0, 00:13:27.154 "read_latency_ticks": 1256186106556, 00:13:27.154 "max_read_latency_ticks": 15505562, 00:13:27.154 "min_read_latency_ticks": 8227068, 00:13:27.154 "write_latency_ticks": 0, 00:13:27.154 "max_write_latency_ticks": 0, 00:13:27.154 "min_write_latency_ticks": 0, 00:13:27.154 "unmap_latency_ticks": 0, 00:13:27.154 "max_unmap_latency_ticks": 0, 00:13:27.154 "min_unmap_latency_ticks": 0, 00:13:27.154 "copy_latency_ticks": 0, 00:13:27.154 "max_copy_latency_ticks": 0, 00:13:27.154 "min_copy_latency_ticks": 0 00:13:27.154 } 00:13:27.154 ] 00:13:27.154 }' 00:13:27.154 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # io_count_per_channel1=100096 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=100096 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=100864 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=200960 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:27.413 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:13:27.413 "tick_rate": 2500000000, 00:13:27.413 "ticks": 14203927608063236, 00:13:27.413 "bdevs": [ 00:13:27.413 { 00:13:27.413 "name": "Malloc_STAT", 00:13:27.413 "bytes_read": 871412224, 00:13:27.413 "num_read_ops": 212740, 00:13:27.413 "bytes_written": 0, 00:13:27.414 "num_write_ops": 0, 00:13:27.414 "bytes_unmapped": 0, 00:13:27.414 "num_unmap_ops": 0, 00:13:27.414 "bytes_copied": 0, 00:13:27.414 "num_copy_ops": 0, 00:13:27.414 "read_latency_ticks": 2658422205080, 00:13:27.414 "max_read_latency_ticks": 15505562, 00:13:27.414 "min_read_latency_ticks": 292198, 00:13:27.414 "write_latency_ticks": 0, 00:13:27.414 "max_write_latency_ticks": 0, 00:13:27.414 "min_write_latency_ticks": 0, 00:13:27.414 "unmap_latency_ticks": 0, 00:13:27.414 "max_unmap_latency_ticks": 0, 00:13:27.414 "min_unmap_latency_ticks": 0, 00:13:27.414 "copy_latency_ticks": 0, 00:13:27.414 "max_copy_latency_ticks": 0, 00:13:27.414 
"min_copy_latency_ticks": 0, 00:13:27.414 "io_error": {} 00:13:27.414 } 00:13:27.414 ] 00:13:27.414 }' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=212740 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 200960 -lt 194820 ']' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 200960 -gt 212740 ']' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:27.414 00:13:27.414 Latency(us) 00:13:27.414 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:27.414 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:13:27.414 Malloc_STAT : 2.16 50959.24 199.06 0.00 0.00 5011.28 1579.42 5347.74 00:13:27.414 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:13:27.414 Malloc_STAT : 2.16 51305.81 200.41 0.00 0.00 4978.11 1356.60 6212.81 00:13:27.414 =================================================================================================================== 00:13:27.414 Total : 102265.05 399.47 0.00 0.00 4994.64 1356.60 6212.81 00:13:27.414 0 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 1088777 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # '[' -z 1088777 ']' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # kill -0 1088777 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # uname 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1088777 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1088777' 00:13:27.414 killing process with pid 1088777 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@969 -- # kill 1088777 00:13:27.414 Received shutdown signal, test time was about 2.244297 seconds 00:13:27.414 00:13:27.414 Latency(us) 00:13:27.414 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:27.414 =================================================================================================================== 00:13:27.414 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:13:27.414 06:29:40 blockdev_general.bdev_stat -- common/autotest_common.sh@974 -- # wait 1088777 00:13:27.673 06:29:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:13:27.673 00:13:27.673 real 0m3.505s 00:13:27.673 user 0m7.008s 00:13:27.673 sys 0m0.448s 00:13:27.673 06:29:41 blockdev_general.bdev_stat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:27.673 06:29:41 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:13:27.673 ************************************ 00:13:27.673 END TEST bdev_stat 00:13:27.673 ************************************ 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:13:27.673 06:29:41 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:13:27.673 00:13:27.673 real 1m54.127s 00:13:27.673 user 7m24.756s 00:13:27.673 sys 0m21.826s 00:13:27.673 06:29:41 blockdev_general -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:27.673 06:29:41 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:27.673 ************************************ 00:13:27.673 END TEST blockdev_general 00:13:27.673 ************************************ 00:13:27.673 06:29:41 -- spdk/autotest.sh@194 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:13:27.673 06:29:41 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:13:27.673 06:29:41 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:27.932 06:29:41 -- common/autotest_common.sh@10 -- # set +x 00:13:27.932 ************************************ 00:13:27.932 START TEST bdev_raid 00:13:27.932 ************************************ 00:13:27.932 06:29:41 bdev_raid -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:13:27.932 * Looking for test storage... 
00:13:27.932 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:13:27.932 06:29:41 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:13:27.932 06:29:41 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:13:27.932 06:29:41 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:13:27.932 06:29:41 bdev_raid -- bdev/bdev_raid.sh@927 -- # mkdir -p /raidtest 00:13:27.932 06:29:41 bdev_raid -- bdev/bdev_raid.sh@928 -- # trap 'cleanup; exit 1' EXIT 00:13:27.932 06:29:41 bdev_raid -- bdev/bdev_raid.sh@930 -- # base_blocklen=512 00:13:27.932 06:29:41 bdev_raid -- bdev/bdev_raid.sh@932 -- # run_test raid0_resize_superblock_test raid_resize_superblock_test 0 00:13:27.932 06:29:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:27.932 06:29:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:27.932 06:29:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:27.932 ************************************ 00:13:27.932 START TEST raid0_resize_superblock_test 00:13:27.932 ************************************ 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 0 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=0 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=1089393 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 1089393' 00:13:27.932 Process raid pid: 1089393 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 1089393 /var/tmp/spdk-raid.sock 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1089393 ']' 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:27.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:27.932 06:29:41 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.190 [2024-07-25 06:29:41.488861] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
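The raid0_resize_superblock test starting here drives a dedicated bdev_svc app over /var/tmp/spdk-raid.sock. The configuration it builds in the lines that follow can be condensed into this RPC sequence (a sketch of the calls shown below, not a separate script; lvol sizes and the raid strip size are the values used by the test):

    rpc='scripts/rpc.py -s /var/tmp/spdk-raid.sock'
    $rpc bdev_malloc_create -b malloc0 512 512          # backing malloc bdev, 512-byte blocks
    $rpc bdev_passthru_create -b malloc0 -p pt0         # passthru so the lvstore can be hot-removed later
    $rpc bdev_lvol_create_lvstore pt0 lvs0
    $rpc bdev_lvol_create -l lvs0 lvol0 64
    $rpc bdev_lvol_create -l lvs0 lvol1 64
    $rpc bdev_raid_create -n Raid -r 0 -z 64 -b 'lvs0/lvol0 lvs0/lvol1' -s   # raid0 with superblock
    $rpc bdev_lvol_resize lvs0/lvol0 100
    $rpc bdev_lvol_resize lvs0/lvol1 100
    # after the resize the raid bdev should grow from 245760 to 393216 blocks
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'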
00:13:28.190 [2024-07-25 06:29:41.488917] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.190 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:28.190 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:28.191 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:28.191 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:28.191 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:28.191 [2024-07-25 06:29:41.625972] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:28.191 [2024-07-25 06:29:41.670969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:28.191 [2024-07-25 06:29:41.735556] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:28.191 [2024-07-25 06:29:41.735583] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.124 06:29:42 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:29.124 06:29:42 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:13:29.124 06:29:42 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:13:29.382 malloc0 00:13:29.382 06:29:42 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:13:29.382 [2024-07-25 06:29:42.936805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:13:29.382 [2024-07-25 06:29:42.936850] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:29.382 [2024-07-25 06:29:42.936872] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225b150 00:13:29.382 [2024-07-25 06:29:42.936884] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:29.640 [2024-07-25 06:29:42.938359] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:29.640 [2024-07-25 06:29:42.938387] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:13:29.640 pt0 00:13:29.640 06:29:42 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:13:29.898 30439f93-e241-4114-bf93-7c31238d1abf 00:13:29.898 06:29:43 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:13:30.156 8a864667-d92e-4f0f-aea1-4d1cf3716c03 00:13:30.156 06:29:43 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:13:30.156 7611d86a-9220-42d8-bd5a-f7b5c538b7fc 00:13:30.156 06:29:43 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:13:30.156 06:29:43 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@884 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 0 -z 64 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:13:30.721 [2024-07-25 06:29:44.174983] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 8a864667-d92e-4f0f-aea1-4d1cf3716c03 is claimed 00:13:30.721 [2024-07-25 06:29:44.175067] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 7611d86a-9220-42d8-bd5a-f7b5c538b7fc is claimed 00:13:30.721 [2024-07-25 06:29:44.175201] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24271f0 00:13:30.721 [2024-07-25 06:29:44.175212] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 245760, blocklen 512 00:13:30.721 [2024-07-25 06:29:44.175391] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2427890 00:13:30.721 [2024-07-25 06:29:44.175533] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24271f0 00:13:30.721 [2024-07-25 06:29:44.175543] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x24271f0 00:13:30.721 [2024-07-25 06:29:44.175641] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.721 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:13:30.721 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:13:30.978 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:13:30.978 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:13:30.978 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:13:31.236 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:13:31.236 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:31.236 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:31.306 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:31.306 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # jq '.[].num_blocks' 00:13:31.564 [2024-07-25 06:29:44.872986] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.564 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:31.564 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:31.564 06:29:44 
bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # (( 245760 == 245760 )) 00:13:31.564 06:29:44 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:13:31.564 [2024-07-25 06:29:45.101541] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:31.564 [2024-07-25 06:29:45.101562] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '8a864667-d92e-4f0f-aea1-4d1cf3716c03' was resized: old size 131072, new size 204800 00:13:31.821 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:13:31.821 [2024-07-25 06:29:45.330089] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:31.821 [2024-07-25 06:29:45.330107] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '7611d86a-9220-42d8-bd5a-f7b5c538b7fc' was resized: old size 131072, new size 204800 00:13:31.821 [2024-07-25 06:29:45.330128] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 245760 to 393216 00:13:31.821 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:13:31.822 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:13:32.079 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:13:32.079 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:13:32.079 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:13:32.337 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:13:32.337 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:32.337 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:32.337 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:32.337 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # jq '.[].num_blocks' 00:13:32.594 [2024-07-25 06:29:46.007968] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:32.594 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:32.594 06:29:45 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:32.594 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # (( 393216 == 393216 )) 00:13:32.594 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:13:32.852 [2024-07-25 06:29:46.232372] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:13:32.852 [2024-07-25 06:29:46.232426] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:13:32.852 [2024-07-25 06:29:46.232435] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:32.852 [2024-07-25 06:29:46.232446] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:13:32.852 [2024-07-25 06:29:46.232520] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:32.852 [2024-07-25 06:29:46.232547] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:32.852 [2024-07-25 06:29:46.232558] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24271f0 name Raid, state offline 00:13:32.852 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:13:33.109 [2024-07-25 06:29:46.456930] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:13:33.109 [2024-07-25 06:29:46.456967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:33.110 [2024-07-25 06:29:46.456984] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240d990 00:13:33.110 [2024-07-25 06:29:46.456996] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:33.110 [2024-07-25 06:29:46.458458] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:33.110 [2024-07-25 06:29:46.458485] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:13:33.110 [2024-07-25 06:29:46.459646] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 8a864667-d92e-4f0f-aea1-4d1cf3716c03 00:13:33.110 [2024-07-25 06:29:46.459677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 8a864667-d92e-4f0f-aea1-4d1cf3716c03 is claimed 00:13:33.110 [2024-07-25 06:29:46.459755] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 7611d86a-9220-42d8-bd5a-f7b5c538b7fc 00:13:33.110 [2024-07-25 06:29:46.459772] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 7611d86a-9220-42d8-bd5a-f7b5c538b7fc is claimed 00:13:33.110 [2024-07-25 06:29:46.459871] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev 7611d86a-9220-42d8-bd5a-f7b5c538b7fc (2) smaller than existing raid bdev Raid (3) 00:13:33.110 [2024-07-25 06:29:46.459899] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2410ea0 00:13:33.110 [2024-07-25 06:29:46.459906] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 393216, blocklen 512 00:13:33.110 [2024-07-25 06:29:46.460060] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2427890 00:13:33.110 [2024-07-25 06:29:46.460196] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2410ea0 00:13:33.110 [2024-07-25 06:29:46.460206] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2410ea0 00:13:33.110 [2024-07-25 06:29:46.460306] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:33.110 pt0 00:13:33.110 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:33.110 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b Raid 00:13:33.110 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:33.110 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # jq '.[].num_blocks' 00:13:33.367 [2024-07-25 06:29:46.685770] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # (( 393216 == 393216 )) 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 1089393 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1089393 ']' 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1089393 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1089393 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:33.367 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:33.368 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1089393' 00:13:33.368 killing process with pid 1089393 00:13:33.368 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 1089393 00:13:33.368 [2024-07-25 06:29:46.761582] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:33.368 [2024-07-25 06:29:46.761626] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:33.368 [2024-07-25 06:29:46.761659] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:33.368 [2024-07-25 06:29:46.761669] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2410ea0 name Raid, state offline 00:13:33.368 06:29:46 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 1089393 00:13:33.368 [2024-07-25 06:29:46.839613] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:33.626 06:29:47 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:13:33.626 00:13:33.626 real 0m5.583s 00:13:33.626 user 0m9.152s 00:13:33.626 sys 0m1.158s 00:13:33.626 06:29:47 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:33.626 06:29:47 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.626 ************************************ 00:13:33.626 END TEST raid0_resize_superblock_test 00:13:33.626 ************************************ 00:13:33.626 06:29:47 bdev_raid -- bdev/bdev_raid.sh@933 -- # run_test raid1_resize_superblock_test raid_resize_superblock_test 1 00:13:33.626 06:29:47 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:33.626 06:29:47 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 
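The block counts asserted by raid0_resize_superblock_test above follow directly from the sizes passed to the RPCs. A rough sketch of the arithmetic, assuming 512-byte blocks (as the blocklen lines show) and assuming the 8192-block gap per base bdev is the region reserved when the raid is created with -s (superblock); this interpretation of the gap is not stated in the log:

# Each lvol is created with "bdev_lvol_create -l lvs0 lvolN 64" (64 MiB).
echo $(( 64 * 1024 * 1024 / 512 ))    # 131072 blocks per lvol, the "old size" in the resize notice
# raid0 over two base bdevs, with 8192 blocks per base assumed reserved:
echo $(( 2 * (131072 - 8192) ))       # 245760 blocks, the first "(( 245760 == 245760 ))" check
# After "bdev_lvol_resize lvs0/lvolN 100" (100 MiB per lvol):
echo $(( 100 * 1024 * 1024 / 512 ))   # 204800 blocks, the "new size" in the resize notice
echo $(( 2 * (204800 - 8192) ))       # 393216 blocks, the final "(( 393216 == 393216 ))" check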
00:13:33.626 06:29:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:33.626 ************************************ 00:13:33.626 START TEST raid1_resize_superblock_test 00:13:33.626 ************************************ 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 1 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=1 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=1090499 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 1090499' 00:13:33.626 Process raid pid: 1090499 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 1090499 /var/tmp/spdk-raid.sock 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1090499 ']' 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:33.626 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:33.626 06:29:47 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.626 [2024-07-25 06:29:47.153749] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:13:33.626 [2024-07-25 06:29:47.153804] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:33.885 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:33.885 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:33.885 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:33.885 [2024-07-25 06:29:47.293690] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:33.885 [2024-07-25 06:29:47.338454] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:33.885 [2024-07-25 06:29:47.401320] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.885 [2024-07-25 06:29:47.401355] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:34.838 06:29:48 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:34.838 06:29:48 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:13:34.838 06:29:48 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:13:35.109 malloc0 00:13:35.109 06:29:48 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:13:35.109 [2024-07-25 06:29:48.614846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:13:35.109 [2024-07-25 06:29:48.614890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.109 [2024-07-25 06:29:48.614911] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1765150 00:13:35.109 [2024-07-25 06:29:48.614923] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.109 [2024-07-25 06:29:48.616378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.109 [2024-07-25 06:29:48.616407] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:13:35.109 pt0 00:13:35.109 06:29:48 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:13:35.676 eec9773a-f9dd-4034-848a-0a6236cf9b2f 00:13:35.676 06:29:48 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:13:35.676 e81ebe72-2837-4114-8679-c4566714a555 00:13:35.676 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:13:35.933 a0cc48cf-90bd-4aba-bce9-2d88150d865e 00:13:35.933 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:13:35.933 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@885 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 1 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:13:36.191 [2024-07-25 06:29:49.604714] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev e81ebe72-2837-4114-8679-c4566714a555 is claimed 00:13:36.191 [2024-07-25 06:29:49.604785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev a0cc48cf-90bd-4aba-bce9-2d88150d865e is claimed 00:13:36.191 [2024-07-25 06:29:49.604912] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19311f0 00:13:36.191 [2024-07-25 06:29:49.604922] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 122880, blocklen 512 00:13:36.191 [2024-07-25 06:29:49.605097] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1931890 00:13:36.191 [2024-07-25 06:29:49.605256] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19311f0 00:13:36.191 [2024-07-25 06:29:49.605266] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x19311f0 00:13:36.191 [2024-07-25 06:29:49.605363] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:36.191 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:13:36.191 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:13:36.449 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:13:36.449 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:13:36.449 06:29:49 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:13:36.707 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:13:36.707 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:36.707 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:36.707 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:36.707 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # jq '.[].num_blocks' 00:13:36.964 [2024-07-25 06:29:50.294721] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:36.964 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:36.964 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:13:36.964 06:29:50 
bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # (( 122880 == 122880 )) 00:13:36.964 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:13:37.222 [2024-07-25 06:29:50.523259] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:37.222 [2024-07-25 06:29:50.523278] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'e81ebe72-2837-4114-8679-c4566714a555' was resized: old size 131072, new size 204800 00:13:37.222 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:13:37.222 [2024-07-25 06:29:50.747785] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:37.222 [2024-07-25 06:29:50.747804] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'a0cc48cf-90bd-4aba-bce9-2d88150d865e' was resized: old size 131072, new size 204800 00:13:37.222 [2024-07-25 06:29:50.747825] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 122880 to 196608 00:13:37.480 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:13:37.480 06:29:50 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:13:37.480 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:13:37.480 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:13:37.480 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:13:37.738 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:13:37.738 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:37.738 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:37.738 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:37.738 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # jq '.[].num_blocks' 00:13:37.995 [2024-07-25 06:29:51.437710] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:37.995 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:37.995 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:13:37.995 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # (( 196608 == 196608 )) 00:13:37.995 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:13:38.253 [2024-07-25 06:29:51.666121] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:13:38.253 [2024-07-25 06:29:51.666181] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:13:38.253 [2024-07-25 06:29:51.666206] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:13:38.253 [2024-07-25 06:29:51.666316] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:38.253 [2024-07-25 06:29:51.666443] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.253 [2024-07-25 06:29:51.666497] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.253 [2024-07-25 06:29:51.666509] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19311f0 name Raid, state offline 00:13:38.253 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:13:38.510 [2024-07-25 06:29:51.890681] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:13:38.510 [2024-07-25 06:29:51.890718] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:38.510 [2024-07-25 06:29:51.890735] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191c4f0 00:13:38.510 [2024-07-25 06:29:51.890746] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:38.510 [2024-07-25 06:29:51.892191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:38.510 [2024-07-25 06:29:51.892220] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:13:38.510 [2024-07-25 06:29:51.893375] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev e81ebe72-2837-4114-8679-c4566714a555 00:13:38.510 [2024-07-25 06:29:51.893408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev e81ebe72-2837-4114-8679-c4566714a555 is claimed 00:13:38.510 [2024-07-25 06:29:51.893486] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev a0cc48cf-90bd-4aba-bce9-2d88150d865e 00:13:38.510 [2024-07-25 06:29:51.893503] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev a0cc48cf-90bd-4aba-bce9-2d88150d865e is claimed 00:13:38.510 [2024-07-25 06:29:51.893605] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev a0cc48cf-90bd-4aba-bce9-2d88150d865e (2) smaller than existing raid bdev Raid (3) 00:13:38.510 [2024-07-25 06:29:51.893633] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x191aea0 00:13:38.510 [2024-07-25 06:29:51.893645] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:38.510 [2024-07-25 06:29:51.893795] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1931890 00:13:38.510 [2024-07-25 06:29:51.893928] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x191aea0 00:13:38.510 [2024-07-25 06:29:51.893937] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x191aea0 00:13:38.510 [2024-07-25 06:29:51.894034] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:38.510 pt0 00:13:38.510 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:38.510 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b Raid 00:13:38.510 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:38.510 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # jq '.[].num_blocks' 00:13:38.767 [2024-07-25 06:29:52.119678] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:38.767 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:38.767 06:29:51 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # (( 196608 == 196608 )) 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 1090499 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1090499 ']' 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1090499 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1090499 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1090499' 00:13:38.767 killing process with pid 1090499 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 1090499 00:13:38.767 [2024-07-25 06:29:52.195826] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:38.767 [2024-07-25 06:29:52.195874] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.767 [2024-07-25 06:29:52.195913] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.767 [2024-07-25 06:29:52.195922] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x191aea0 name Raid, state offline 00:13:38.767 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 1090499 00:13:38.767 [2024-07-25 06:29:52.275488] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:39.025 06:29:52 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:13:39.025 00:13:39.025 real 0m5.356s 00:13:39.025 user 0m8.680s 00:13:39.025 sys 0m1.163s 00:13:39.025 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:39.025 06:29:52 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.025 ************************************ 00:13:39.025 END TEST raid1_resize_superblock_test 00:13:39.025 ************************************ 00:13:39.025 06:29:52 bdev_raid -- bdev/bdev_raid.sh@935 -- # uname -s 00:13:39.025 06:29:52 bdev_raid -- bdev/bdev_raid.sh@935 -- # '[' Linux = Linux ']' 00:13:39.025 06:29:52 bdev_raid -- bdev/bdev_raid.sh@935 -- # modprobe -n nbd 00:13:39.025 06:29:52 bdev_raid -- bdev/bdev_raid.sh@936 -- # has_nbd=true 
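The raid1 variant above exercises the same flow with -r 1 and no strip size; its 122880 -> 196608 block counts match a single base bdev's usable size, since raid1 mirrors rather than stripes. Condensed from the rpc.py calls visible in the log (socket path and option values as shown there; the comments are an interpretation, not part of the script), the sequence is roughly:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_malloc_create -b malloc0 512 512                        # backing malloc bdev
$rpc bdev_passthru_create -b malloc0 -p pt0                       # passthru on top of malloc0
$rpc bdev_lvol_create_lvstore pt0 lvs0
$rpc bdev_lvol_create -l lvs0 lvol0 64                            # two 64 MiB lvols
$rpc bdev_lvol_create -l lvs0 lvol1 64
$rpc bdev_raid_create -n Raid -r 1 -b 'lvs0/lvol0 lvs0/lvol1' -s  # raid1 with on-disk superblock
$rpc bdev_lvol_resize lvs0/lvol0 100                              # grow both bases; the raid follows
$rpc bdev_lvol_resize lvs0/lvol1 100
$rpc bdev_passthru_delete pt0                                     # closes lvs0; the raid goes offline
$rpc bdev_passthru_create -b malloc0 -p pt0                       # raid is reassembled from its superblocks at the new size
$rpc bdev_get_bdevs -b Raid                                       # num_blocks: 122880 before the resize, 196608 after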
00:13:39.025 06:29:52 bdev_raid -- bdev/bdev_raid.sh@937 -- # modprobe nbd 00:13:39.025 06:29:52 bdev_raid -- bdev/bdev_raid.sh@938 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:13:39.025 06:29:52 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:39.025 06:29:52 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:39.025 06:29:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:39.025 ************************************ 00:13:39.025 START TEST raid_function_test_raid0 00:13:39.025 ************************************ 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # raid_function_test raid0 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1091359 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1091359' 00:13:39.025 Process raid pid: 1091359 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1091359 /var/tmp/spdk-raid.sock 00:13:39.025 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # '[' -z 1091359 ']' 00:13:39.026 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:39.026 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:39.026 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:39.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:39.026 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:39.026 06:29:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:13:39.284 [2024-07-25 06:29:52.616926] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:13:39.284 [2024-07-25 06:29:52.616983] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:39.284 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:39.284 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:39.284 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:39.284 [2024-07-25 06:29:52.759333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:39.284 [2024-07-25 06:29:52.804178] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:39.542 [2024-07-25 06:29:52.873910] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.542 [2024-07-25 06:29:52.873945] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # return 0 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:13:40.105 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:13:40.362 [2024-07-25 06:29:53.762400] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:13:40.362 [2024-07-25 06:29:53.763438] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:13:40.362 [2024-07-25 06:29:53.763493] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe8f400 00:13:40.362 [2024-07-25 06:29:53.763503] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:40.362 [2024-07-25 06:29:53.763731] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe92f60 00:13:40.362 [2024-07-25 06:29:53.763829] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe8f400 00:13:40.362 [2024-07-25 06:29:53.763838] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xe8f400 00:13:40.362 [2024-07-25 06:29:53.763930] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
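raid_function_test_raid0 assembles its array from a batched rpcs.txt that is cat'd into rpc.py above; the batch itself is not echoed into the log. A hypothetical minimal equivalent, assuming two 32 MiB malloc base bdevs (consistent with the 131072-block, 512-byte array registered above) and an assumed 64 KiB strip size, might be:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Assumed contents of the rpcs.txt batch (sizes and strip size are guesses, names from the log):
$rpc bdev_malloc_create -b Base_1 32 512
$rpc bdev_malloc_create -b Base_2 32 512
$rpc bdev_raid_create -n raid -r raid0 -z 64 -b 'Base_1 Base_2'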
00:13:40.362 Base_1 00:13:40.362 Base_2 00:13:40.362 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:13:40.362 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:40.362 06:29:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:13:40.620 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:13:40.878 [2024-07-25 06:29:54.231635] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe8fd10 00:13:40.878 /dev/nbd0 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # local i 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@873 -- # break 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:40.878 1+0 records in 00:13:40.878 1+0 records out 00:13:40.878 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220341 s, 18.6 MB/s 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- 
common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # size=4096 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@889 -- # return 0 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:40.878 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:13:41.136 { 00:13:41.136 "nbd_device": "/dev/nbd0", 00:13:41.136 "bdev_name": "raid" 00:13:41.136 } 00:13:41.136 ]' 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:13:41.136 { 00:13:41.136 "nbd_device": "/dev/nbd0", 00:13:41.136 "bdev_name": "raid" 00:13:41.136 } 00:13:41.136 ]' 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:13:41.136 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:13:41.137 4096+0 records in 00:13:41.137 4096+0 records out 00:13:41.137 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0289692 s, 72.4 MB/s 00:13:41.137 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:13:41.394 4096+0 records in 00:13:41.394 4096+0 records out 00:13:41.394 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.186225 s, 11.3 MB/s 00:13:41.394 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:13:41.394 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:41.394 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:13:41.395 128+0 records in 00:13:41.395 128+0 records out 00:13:41.395 65536 bytes (66 kB, 64 KiB) copied, 0.000848502 s, 77.2 MB/s 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:13:41.395 2035+0 records in 00:13:41.395 2035+0 records out 00:13:41.395 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0117613 s, 88.6 MB/s 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 
/dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:13:41.395 456+0 records in 00:13:41.395 456+0 records out 00:13:41.395 233472 bytes (233 kB, 228 KiB) copied, 0.00269313 s, 86.7 MB/s 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:41.395 06:29:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:13:41.653 [2024-07-25 06:29:55.160688] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # 
nbd_get_count /var/tmp/spdk-raid.sock 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:41.653 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1091359 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # '[' -z 1091359 ']' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # kill -0 1091359 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # uname 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:41.911 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1091359 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1091359' 00:13:42.170 killing process with pid 1091359 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@969 -- # kill 1091359 00:13:42.170 [2024-07-25 06:29:55.505846] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:42.170 [2024-07-25 06:29:55.505901] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:42.170 [2024-07-25 06:29:55.505937] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:42.170 [2024-07-25 06:29:55.505947] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe8f400 name raid, state offline 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@974 -- # wait 1091359 00:13:42.170 [2024-07-25 06:29:55.521706] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:13:42.170 00:13:42.170 real 0m3.140s 00:13:42.170 user 0m4.166s 
00:13:42.170 sys 0m1.180s 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:42.170 06:29:55 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:13:42.170 ************************************ 00:13:42.170 END TEST raid_function_test_raid0 00:13:42.170 ************************************ 00:13:42.429 06:29:55 bdev_raid -- bdev/bdev_raid.sh@939 -- # run_test raid_function_test_concat raid_function_test concat 00:13:42.429 06:29:55 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:42.429 06:29:55 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:42.429 06:29:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:42.429 ************************************ 00:13:42.429 START TEST raid_function_test_concat 00:13:42.429 ************************************ 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # raid_function_test concat 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1091975 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1091975' 00:13:42.429 Process raid pid: 1091975 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1091975 /var/tmp/spdk-raid.sock 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # '[' -z 1091975 ']' 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:42.429 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:42.429 06:29:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:13:42.429 [2024-07-25 06:29:55.846716] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
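Each of these sub-tests starts its own bdev_svc app on a private RPC socket with raid debug logging enabled, and the harness blocks until that socket answers before sending any RPCs; killprocess tears the app down again at the end. A simplified sketch of that launch-and-wait pattern (paths and flags as used in this run; the real waitforlisten helper adds timeouts and retries, and polling rpc_get_methods here is just one cheap liveness check):

    app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # start the app that will host the raid bdev, remember its pid for teardown
    $app -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    # wait until the app is up and listening on the UNIX domain socket
    until $rpc rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done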
00:13:42.429 [2024-07-25 06:29:55.846775] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:42.429 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:42.429 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:42.429 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:42.429 [2024-07-25 06:29:55.983855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:42.687 [2024-07-25 06:29:56.027095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:42.687 [2024-07-25 06:29:56.089194] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:42.687 [2024-07-25 06:29:56.089229] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # return 0 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:13:43.253 06:29:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:13:43.511 [2024-07-25 06:29:56.980228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:13:43.511 [2024-07-25 06:29:56.981267] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:13:43.511 [2024-07-25 06:29:56.981325] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24cb400 00:13:43.511 [2024-07-25 06:29:56.981335] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:43.511 [2024-07-25 06:29:56.981568] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24cef60 00:13:43.511 [2024-07-25 06:29:56.981671] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24cb400 00:13:43.511 [2024-07-25 06:29:56.981680] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x24cb400 00:13:43.511 [2024-07-25 06:29:56.981775] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:13:43.511 Base_1 00:13:43.511 Base_2 00:13:43.511 06:29:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:13:43.511 06:29:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:43.511 06:29:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:13:43.769 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:13:44.334 [2024-07-25 06:29:57.710168] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24cbd10 00:13:44.334 /dev/nbd0 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # local i 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:13:44.334 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@873 -- # break 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:44.335 1+0 records in 00:13:44.335 1+0 records out 00:13:44.335 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228091 s, 18.0 MB/s 00:13:44.335 
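The dd/blkdiscard/cmp output that follows (and the identical pass in the raid0 test above) comes from the raid_unmap_data_verify step of bdev_raid.sh. A simplified sketch of what it does, using the block size, file path and offset/count pairs seen in this run:

    # write a random reference image through the raid's NBD device, then
    # punch holes at a few offsets and check the device still matches the file
    blksize=512
    dd if=/dev/urandom of=/raidtest/raidrandtest bs=$blksize count=4096
    dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=$blksize count=4096 oflag=direct
    blockdev --flushbufs /dev/nbd0
    cmp -b -n $((4096 * blksize)) /raidtest/raidrandtest /dev/nbd0
    offs=(0 1028 321); nums=(128 2035 456)
    for i in 0 1 2; do
        # zero the same range in the reference file that blkdiscard unmaps on the device,
        # since discarded blocks are expected to read back as zeroes
        dd if=/dev/zero of=/raidtest/raidrandtest bs=$blksize seek=${offs[$i]} count=${nums[$i]} conv=notrunc
        blkdiscard -o $((offs[$i] * blksize)) -l $((nums[$i] * blksize)) /dev/nbd0
        blockdev --flushbufs /dev/nbd0
        cmp -b -n $((4096 * blksize)) /raidtest/raidrandtest /dev/nbd0
    done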
06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # size=4096 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@889 -- # return 0 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:44.335 06:29:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:13:44.592 { 00:13:44.592 "nbd_device": "/dev/nbd0", 00:13:44.592 "bdev_name": "raid" 00:13:44.592 } 00:13:44.592 ]' 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:13:44.592 { 00:13:44.592 "nbd_device": "/dev/nbd0", 00:13:44.592 "bdev_name": "raid" 00:13:44.592 } 00:13:44.592 ]' 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:44.592 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # 
blksize=512 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:13:44.593 4096+0 records in 00:13:44.593 4096+0 records out 00:13:44.593 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.028479 s, 73.6 MB/s 00:13:44.593 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:13:44.851 4096+0 records in 00:13:44.851 4096+0 records out 00:13:44.851 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.180994 s, 11.6 MB/s 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:13:44.851 128+0 records in 00:13:44.851 128+0 records out 00:13:44.851 65536 bytes (66 kB, 64 KiB) copied, 0.00048093 s, 136 MB/s 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:13:44.851 2035+0 records in 00:13:44.851 2035+0 records out 00:13:44.851 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0116975 s, 89.1 MB/s 00:13:44.851 06:29:58 
bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:13:44.851 456+0 records in 00:13:44.851 456+0 records out 00:13:44.851 233472 bytes (233 kB, 228 KiB) copied, 0.00267987 s, 87.1 MB/s 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:44.851 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:13:45.108 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:13:45.108 [2024-07-25 06:29:58.631177] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:45.108 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@45 -- # return 0 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:13:45.109 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:13:45.366 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:13:45.367 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:13:45.367 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:45.367 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1091975 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # '[' -z 1091975 ']' 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # kill -0 1091975 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # uname 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1091975 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1091975' 00:13:45.625 killing process with pid 1091975 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@969 -- # kill 1091975 00:13:45.625 [2024-07-25 06:29:58.989164] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:45.625 [2024-07-25 06:29:58.989228] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:45.625 [2024-07-25 06:29:58.989268] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:45.625 [2024-07-25 06:29:58.989279] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24cb400 name raid, state offline 00:13:45.625 06:29:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@974 -- # wait 1091975 00:13:45.625 [2024-07-25 06:29:59.005177] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:45.883 
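The nbd_get_count and killprocess steps above are the common teardown for these function tests: the exported NBD list must be empty after nbd_stop_disk, and only then is the bdev_svc app stopped and reaped. In outline (a sketch; $raid_pid is the pid recorded at startup, 1091975 in this run):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # nbd_get_disks returns a JSON array; count any remaining /dev/nbd entries
    count=$($rpc nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    if [ "$count" -ne 0 ]; then exit 1; fi
    # stop the app that hosted the raid bdev and wait for it to exit
    kill -0 "$raid_pid" && kill "$raid_pid"
    wait "$raid_pid"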
06:29:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:13:45.883 00:13:45.883 real 0m3.397s 00:13:45.883 user 0m4.639s 00:13:45.883 sys 0m1.238s 00:13:45.883 06:29:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:45.883 06:29:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:13:45.883 ************************************ 00:13:45.883 END TEST raid_function_test_concat 00:13:45.883 ************************************ 00:13:45.883 06:29:59 bdev_raid -- bdev/bdev_raid.sh@942 -- # run_test raid0_resize_test raid_resize_test 0 00:13:45.883 06:29:59 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:45.883 06:29:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:45.883 06:29:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:45.883 ************************************ 00:13:45.883 START TEST raid0_resize_test 00:13:45.883 ************************************ 00:13:45.883 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 0 00:13:45.883 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=0 00:13:45.883 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:13:45.883 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:13:45.883 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=1092731 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 1092731' 00:13:45.884 Process raid pid: 1092731 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 1092731 /var/tmp/spdk-raid.sock 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # '[' -z 1092731 ']' 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:45.884 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:45.884 06:29:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.884 [2024-07-25 06:29:59.329917] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
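The two resize tests that follow are the same raid_resize_test function invoked with a level argument (0 here, 1 afterwards). Apart from the level passed to bdev_raid_create, the RPC flow is identical: create two 32 MiB null bdevs, build a raid on them, resize each base bdev to 64 MiB, and check the raid's num_blocks after each step. A compressed sketch of that flow with the socket and sizes used in this run:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_null_create Base_1 32 512
    $rpc bdev_null_create Base_2 32 512
    $rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid   # '-r 1' without -z for the raid1 variant
    $rpc bdev_null_resize Base_1 64
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'             # still limited by the smaller base bdev
    $rpc bdev_null_resize Base_2 64
    $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'             # now reflects the larger size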
00:13:45.884 [2024-07-25 06:29:59.329973] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:45.884 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:45.884 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:45.884 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:46.142 [2024-07-25 06:29:59.465924] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:46.142 [2024-07-25 06:29:59.509846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:46.142 [2024-07-25 06:29:59.572964] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:46.142 [2024-07-25 06:29:59.573001] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:46.707 06:30:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:46.707 06:30:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@864 -- # return 0 00:13:46.707 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:13:46.967 Base_1 00:13:46.967 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:13:46.967 Base_2 00:13:46.967 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 0 -eq 0 ']' 00:13:46.967 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:13:47.226 [2024-07-25 06:30:00.614946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:13:47.226 [2024-07-25 06:30:00.616023] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:13:47.226 [2024-07-25 06:30:00.616072] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa6d040 00:13:47.226 [2024-07-25 06:30:00.616081] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:47.226 [2024-07-25 06:30:00.616336] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8bdf80 00:13:47.226 [2024-07-25 06:30:00.616418] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa6d040 00:13:47.226 [2024-07-25 06:30:00.616427] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xa6d040 00:13:47.226 
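The expected_size values asserted below follow a simple rule that the observed num_blocks confirm: a two-member raid0 exposes twice the smaller base bdev's size, while raid1 exposes the smaller base bdev's size once, so resizing only Base_1 leaves the raid unchanged and resizing Base_2 as well lets it grow. A sketch of the arithmetic being checked (sizes in MiB as in this run; the script's own helper wording differs slightly):

    bdev_size_mb=32; new_bdev_size_mb=64; blksize=512
    # after resizing only Base_1: the smaller member still pins the raid size
    if [ "$raid_level" -eq 0 ]; then expected_size=$((2 * bdev_size_mb)); else expected_size=$bdev_size_mb; fi
    # after resizing Base_2 as well: both members are 64 MiB
    if [ "$raid_level" -eq 0 ]; then expected_size=$((2 * new_bdev_size_mb)); else expected_size=$new_bdev_size_mb; fi
    # each check converts num_blocks from bdev_get_bdevs back to MiB and compares
    raid_size_mb=$((blkcnt * blksize / 1048576))
    if [ "$raid_size_mb" != "$expected_size" ]; then exit 1; fi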
[2024-07-25 06:30:00.616522] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:47.226 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:13:47.525 [2024-07-25 06:30:00.783375] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:47.525 [2024-07-25 06:30:00.783391] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:13:47.525 true 00:13:47.525 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:47.525 06:30:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:13:47.525 [2024-07-25 06:30:01.016136] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:47.525 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=131072 00:13:47.525 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=64 00:13:47.525 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 0 -eq 0 ']' 00:13:47.525 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # expected_size=64 00:13:47.526 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 64 '!=' 64 ']' 00:13:47.526 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:13:47.802 [2024-07-25 06:30:01.192435] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:47.802 [2024-07-25 06:30:01.192454] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:13:47.802 [2024-07-25 06:30:01.192476] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:13:47.802 true 00:13:47.802 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:47.802 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:13:48.061 [2024-07-25 06:30:01.425211] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=262144 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=128 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 0 -eq 0 ']' 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@393 -- # expected_size=128 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 128 '!=' 128 ']' 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 1092731 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # '[' -z 1092731 ']' 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # kill -0 1092731 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # uname 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # '[' 
Linux = Linux ']' 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1092731 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1092731' 00:13:48.061 killing process with pid 1092731 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@969 -- # kill 1092731 00:13:48.061 [2024-07-25 06:30:01.501889] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:48.061 [2024-07-25 06:30:01.501940] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:48.061 [2024-07-25 06:30:01.501980] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:48.061 [2024-07-25 06:30:01.501990] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa6d040 name Raid, state offline 00:13:48.061 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@974 -- # wait 1092731 00:13:48.061 [2024-07-25 06:30:01.503172] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:48.320 06:30:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:13:48.320 00:13:48.320 real 0m2.395s 00:13:48.320 user 0m3.533s 00:13:48.320 sys 0m0.583s 00:13:48.320 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:48.320 06:30:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.320 ************************************ 00:13:48.320 END TEST raid0_resize_test 00:13:48.320 ************************************ 00:13:48.320 06:30:01 bdev_raid -- bdev/bdev_raid.sh@943 -- # run_test raid1_resize_test raid_resize_test 1 00:13:48.320 06:30:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:13:48.320 06:30:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:48.320 06:30:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:48.320 ************************************ 00:13:48.320 START TEST raid1_resize_test 00:13:48.320 ************************************ 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 1 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=1 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=1093266 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 1093266' 00:13:48.320 Process raid pid: 1093266 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 1093266 /var/tmp/spdk-raid.sock 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@831 -- # '[' -z 1093266 ']' 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:48.320 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:48.320 06:30:01 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.320 [2024-07-25 06:30:01.802012] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:13:48.320 [2024-07-25 06:30:01.802066] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:48.579 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:48.579 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:48.579 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:48.579 [2024-07-25 06:30:01.939828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:48.579 [2024-07-25 06:30:01.984462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:48.579 [2024-07-25 06:30:02.047669] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:48.579 [2024-07-25 06:30:02.047692] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:49.513 06:30:02 bdev_raid.raid1_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:49.513 06:30:02 bdev_raid.raid1_resize_test -- common/autotest_common.sh@864 -- # return 0 00:13:49.513 06:30:02 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:13:49.513 Base_1 00:13:49.513 06:30:02 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_null_create Base_2 32 512 00:13:49.772 Base_2 00:13:49.772 06:30:03 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 1 -eq 0 ']' 00:13:49.772 06:30:03 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@367 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 1 -b 'Base_1 Base_2' -n Raid 00:13:50.335 [2024-07-25 06:30:03.649698] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:13:50.335 [2024-07-25 06:30:03.650818] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:13:50.335 [2024-07-25 06:30:03.650869] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x102c040 00:13:50.335 [2024-07-25 06:30:03.650878] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:50.335 [2024-07-25 06:30:03.651120] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe7d210 00:13:50.335 [2024-07-25 06:30:03.651215] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x102c040 00:13:50.335 [2024-07-25 06:30:03.651224] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x102c040 00:13:50.335 [2024-07-25 06:30:03.651320] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.335 06:30:03 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:13:50.335 [2024-07-25 06:30:03.886307] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:50.335 [2024-07-25 06:30:03.886328] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:13:50.335 true 00:13:50.593 06:30:03 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:50.593 06:30:03 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:13:50.593 [2024-07-25 06:30:04.115042] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:50.593 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=65536 00:13:50.593 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=32 00:13:50.593 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 1 -eq 0 ']' 00:13:50.593 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@379 -- # expected_size=32 00:13:50.593 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 32 '!=' 32 ']' 00:13:50.593 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:13:50.851 [2024-07-25 06:30:04.339472] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:13:50.851 [2024-07-25 06:30:04.339487] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:13:50.851 [2024-07-25 06:30:04.339509] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 65536 to 131072 00:13:50.851 true 00:13:50.851 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:13:50.851 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:13:51.109 [2024-07-25 06:30:04.552171] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=131072 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=64 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 1 -eq 0 ']' 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@395 -- # expected_size=64 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 64 '!=' 64 ']' 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 1093266 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@950 -- # '[' -z 1093266 ']' 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@954 -- # kill -0 1093266 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # uname 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1093266 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1093266' 00:13:51.109 killing process with pid 1093266 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@969 -- # kill 1093266 00:13:51.109 [2024-07-25 06:30:04.627813] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:51.109 [2024-07-25 06:30:04.627864] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:51.109 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@974 -- # wait 1093266 00:13:51.109 [2024-07-25 06:30:04.628186] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:51.109 [2024-07-25 06:30:04.628198] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x102c040 name Raid, state offline 00:13:51.109 [2024-07-25 06:30:04.629093] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:51.368 06:30:04 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:13:51.368 00:13:51.368 real 0m3.041s 00:13:51.368 user 0m4.751s 00:13:51.368 sys 0m0.652s 00:13:51.368 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:51.368 06:30:04 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.368 ************************************ 00:13:51.368 END TEST raid1_resize_test 00:13:51.368 ************************************ 00:13:51.368 06:30:04 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:13:51.368 06:30:04 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:13:51.368 06:30:04 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:13:51.368 06:30:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:51.368 06:30:04 bdev_raid 
-- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:51.368 06:30:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:51.368 ************************************ 00:13:51.368 START TEST raid_state_function_test 00:13:51.368 ************************************ 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 false 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1093937 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1093937' 00:13:51.368 Process raid pid: 1093937 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1093937 /var/tmp/spdk-raid.sock 00:13:51.368 06:30:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1093937 ']' 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:51.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:51.368 06:30:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.627 [2024-07-25 06:30:04.939688] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:13:51.627 [2024-07-25 06:30:04.939742] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.0 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.1 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.2 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.3 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.4 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.5 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.6 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:01.7 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.0 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.1 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.2 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.3 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.4 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.5 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.6 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3d:02.7 cannot be used 00:13:51.627 qat_pci_device_allocate(): 
Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.0 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.1 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.2 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.3 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.4 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.5 cannot be used 00:13:51.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.627 EAL: Requested device 0000:3f:01.6 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:01.7 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.0 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.1 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.2 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.3 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.4 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.5 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.6 cannot be used 00:13:51.628 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:13:51.628 EAL: Requested device 0000:3f:02.7 cannot be used 00:13:51.628 [2024-07-25 06:30:05.076194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:51.628 [2024-07-25 06:30:05.120453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:51.886 [2024-07-25 06:30:05.183223] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.886 [2024-07-25 06:30:05.183250] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:52.455 06:30:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:52.455 06:30:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:13:52.455 06:30:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:52.714 [2024-07-25 06:30:06.043414] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:52.714 [2024-07-25 06:30:06.043450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:52.714 [2024-07-25 06:30:06.043459] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:52.714 [2024-07-25 06:30:06.043470] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.714 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.974 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.974 "name": "Existed_Raid", 00:13:52.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.974 "strip_size_kb": 64, 00:13:52.974 "state": "configuring", 00:13:52.974 "raid_level": "raid0", 00:13:52.974 "superblock": false, 00:13:52.974 "num_base_bdevs": 2, 00:13:52.974 "num_base_bdevs_discovered": 0, 00:13:52.974 "num_base_bdevs_operational": 2, 00:13:52.974 "base_bdevs_list": [ 00:13:52.974 { 00:13:52.974 "name": "BaseBdev1", 00:13:52.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.974 "is_configured": false, 00:13:52.974 "data_offset": 0, 00:13:52.974 "data_size": 0 00:13:52.974 }, 00:13:52.974 { 00:13:52.974 "name": "BaseBdev2", 00:13:52.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.974 "is_configured": false, 00:13:52.974 "data_offset": 0, 00:13:52.974 "data_size": 0 00:13:52.974 } 00:13:52.974 ] 00:13:52.974 }' 00:13:52.974 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.974 06:30:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.542 06:30:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:53.542 [2024-07-25 06:30:07.074012] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.542 [2024-07-25 06:30:07.074036] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a4470 name Existed_Raid, state configuring 00:13:53.542 06:30:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:53.802 [2024-07-25 06:30:07.298613] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:53.802 [2024-07-25 06:30:07.298640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:53.802 [2024-07-25 06:30:07.298649] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:53.802 [2024-07-25 06:30:07.298659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:53.802 06:30:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:54.061 [2024-07-25 06:30:07.532665] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:54.061 BaseBdev1 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:54.061 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.320 06:30:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:54.887 [ 00:13:54.887 { 00:13:54.887 "name": "BaseBdev1", 00:13:54.887 "aliases": [ 00:13:54.887 "6f26e73e-68e6-454a-b9cc-a23277539a36" 00:13:54.887 ], 00:13:54.887 "product_name": "Malloc disk", 00:13:54.887 "block_size": 512, 00:13:54.887 "num_blocks": 65536, 00:13:54.887 "uuid": "6f26e73e-68e6-454a-b9cc-a23277539a36", 00:13:54.887 "assigned_rate_limits": { 00:13:54.887 "rw_ios_per_sec": 0, 00:13:54.887 "rw_mbytes_per_sec": 0, 00:13:54.887 "r_mbytes_per_sec": 0, 00:13:54.887 "w_mbytes_per_sec": 0 00:13:54.887 }, 00:13:54.887 "claimed": true, 00:13:54.887 "claim_type": "exclusive_write", 00:13:54.887 "zoned": false, 00:13:54.887 "supported_io_types": { 00:13:54.887 "read": true, 00:13:54.887 "write": true, 00:13:54.887 "unmap": true, 00:13:54.887 "flush": true, 00:13:54.887 "reset": true, 00:13:54.887 "nvme_admin": false, 00:13:54.887 "nvme_io": false, 00:13:54.887 "nvme_io_md": false, 00:13:54.887 "write_zeroes": true, 00:13:54.887 "zcopy": true, 00:13:54.887 "get_zone_info": false, 00:13:54.887 "zone_management": false, 00:13:54.887 "zone_append": false, 00:13:54.887 "compare": false, 00:13:54.887 "compare_and_write": false, 00:13:54.887 "abort": true, 00:13:54.887 "seek_hole": false, 00:13:54.887 "seek_data": false, 00:13:54.887 "copy": true, 00:13:54.887 "nvme_iov_md": false 00:13:54.887 }, 00:13:54.887 "memory_domains": [ 00:13:54.887 { 00:13:54.887 "dma_device_id": "system", 00:13:54.887 "dma_device_type": 1 00:13:54.887 }, 00:13:54.887 { 00:13:54.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.887 "dma_device_type": 2 00:13:54.887 } 00:13:54.887 ], 00:13:54.887 
"driver_specific": {} 00:13:54.887 } 00:13:54.887 ] 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:54.887 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.888 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.888 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.888 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.888 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.888 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.147 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.147 "name": "Existed_Raid", 00:13:55.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.147 "strip_size_kb": 64, 00:13:55.147 "state": "configuring", 00:13:55.147 "raid_level": "raid0", 00:13:55.147 "superblock": false, 00:13:55.147 "num_base_bdevs": 2, 00:13:55.147 "num_base_bdevs_discovered": 1, 00:13:55.147 "num_base_bdevs_operational": 2, 00:13:55.147 "base_bdevs_list": [ 00:13:55.147 { 00:13:55.147 "name": "BaseBdev1", 00:13:55.147 "uuid": "6f26e73e-68e6-454a-b9cc-a23277539a36", 00:13:55.147 "is_configured": true, 00:13:55.147 "data_offset": 0, 00:13:55.147 "data_size": 65536 00:13:55.147 }, 00:13:55.147 { 00:13:55.147 "name": "BaseBdev2", 00:13:55.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.147 "is_configured": false, 00:13:55.147 "data_offset": 0, 00:13:55.147 "data_size": 0 00:13:55.147 } 00:13:55.147 ] 00:13:55.147 }' 00:13:55.147 06:30:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.147 06:30:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.715 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:55.715 [2024-07-25 06:30:09.181021] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:55.715 [2024-07-25 06:30:09.181054] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11a3ce0 name Existed_Raid, state configuring 00:13:55.715 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:13:55.974 [2024-07-25 06:30:09.349486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:55.974 [2024-07-25 06:30:09.350852] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:55.974 [2024-07-25 06:30:09.350882] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.974 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.233 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.233 "name": "Existed_Raid", 00:13:56.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.233 "strip_size_kb": 64, 00:13:56.233 "state": "configuring", 00:13:56.233 "raid_level": "raid0", 00:13:56.233 "superblock": false, 00:13:56.233 "num_base_bdevs": 2, 00:13:56.233 "num_base_bdevs_discovered": 1, 00:13:56.233 "num_base_bdevs_operational": 2, 00:13:56.233 "base_bdevs_list": [ 00:13:56.233 { 00:13:56.233 "name": "BaseBdev1", 00:13:56.233 "uuid": "6f26e73e-68e6-454a-b9cc-a23277539a36", 00:13:56.233 "is_configured": true, 00:13:56.233 "data_offset": 0, 00:13:56.233 "data_size": 65536 00:13:56.233 }, 00:13:56.233 { 00:13:56.233 "name": "BaseBdev2", 00:13:56.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.233 "is_configured": false, 00:13:56.233 "data_offset": 0, 00:13:56.233 "data_size": 0 00:13:56.233 } 00:13:56.233 ] 00:13:56.233 }' 00:13:56.233 06:30:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.233 06:30:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.492 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:56.749 [2024-07-25 06:30:10.239033] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:56.749 [2024-07-25 06:30:10.239072] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1357120 00:13:56.750 [2024-07-25 06:30:10.239080] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:56.750 [2024-07-25 06:30:10.239259] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x134e050 00:13:56.750 [2024-07-25 06:30:10.239373] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1357120 00:13:56.750 [2024-07-25 06:30:10.239382] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1357120 00:13:56.750 [2024-07-25 06:30:10.239535] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:56.750 BaseBdev2 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:56.750 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.007 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:57.266 [ 00:13:57.266 { 00:13:57.266 "name": "BaseBdev2", 00:13:57.266 "aliases": [ 00:13:57.266 "9cb9cb44-737e-43be-a24e-49e48ec3d9db" 00:13:57.266 ], 00:13:57.266 "product_name": "Malloc disk", 00:13:57.266 "block_size": 512, 00:13:57.266 "num_blocks": 65536, 00:13:57.266 "uuid": "9cb9cb44-737e-43be-a24e-49e48ec3d9db", 00:13:57.266 "assigned_rate_limits": { 00:13:57.266 "rw_ios_per_sec": 0, 00:13:57.266 "rw_mbytes_per_sec": 0, 00:13:57.266 "r_mbytes_per_sec": 0, 00:13:57.266 "w_mbytes_per_sec": 0 00:13:57.266 }, 00:13:57.266 "claimed": true, 00:13:57.266 "claim_type": "exclusive_write", 00:13:57.266 "zoned": false, 00:13:57.266 "supported_io_types": { 00:13:57.266 "read": true, 00:13:57.266 "write": true, 00:13:57.266 "unmap": true, 00:13:57.266 "flush": true, 00:13:57.266 "reset": true, 00:13:57.266 "nvme_admin": false, 00:13:57.266 "nvme_io": false, 00:13:57.266 "nvme_io_md": false, 00:13:57.266 "write_zeroes": true, 00:13:57.266 "zcopy": true, 00:13:57.266 "get_zone_info": false, 00:13:57.266 "zone_management": false, 00:13:57.266 "zone_append": false, 00:13:57.266 "compare": false, 00:13:57.266 "compare_and_write": false, 00:13:57.266 "abort": true, 00:13:57.266 "seek_hole": false, 00:13:57.266 "seek_data": false, 00:13:57.266 "copy": true, 00:13:57.266 "nvme_iov_md": false 00:13:57.266 }, 00:13:57.266 "memory_domains": [ 00:13:57.266 { 00:13:57.266 "dma_device_id": "system", 00:13:57.266 "dma_device_type": 1 00:13:57.266 }, 00:13:57.266 { 00:13:57.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.266 "dma_device_type": 2 00:13:57.266 } 00:13:57.266 
], 00:13:57.266 "driver_specific": {} 00:13:57.266 } 00:13:57.266 ] 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.266 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.533 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.533 "name": "Existed_Raid", 00:13:57.533 "uuid": "b1df588f-a2a1-4f4d-afea-827c85dcdcba", 00:13:57.533 "strip_size_kb": 64, 00:13:57.533 "state": "online", 00:13:57.533 "raid_level": "raid0", 00:13:57.533 "superblock": false, 00:13:57.533 "num_base_bdevs": 2, 00:13:57.533 "num_base_bdevs_discovered": 2, 00:13:57.533 "num_base_bdevs_operational": 2, 00:13:57.533 "base_bdevs_list": [ 00:13:57.533 { 00:13:57.533 "name": "BaseBdev1", 00:13:57.533 "uuid": "6f26e73e-68e6-454a-b9cc-a23277539a36", 00:13:57.533 "is_configured": true, 00:13:57.533 "data_offset": 0, 00:13:57.533 "data_size": 65536 00:13:57.533 }, 00:13:57.533 { 00:13:57.533 "name": "BaseBdev2", 00:13:57.533 "uuid": "9cb9cb44-737e-43be-a24e-49e48ec3d9db", 00:13:57.533 "is_configured": true, 00:13:57.533 "data_offset": 0, 00:13:57.533 "data_size": 65536 00:13:57.533 } 00:13:57.533 ] 00:13:57.533 }' 00:13:57.533 06:30:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.533 06:30:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:58.101 06:30:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:58.101 [2024-07-25 06:30:11.606883] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:58.101 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:58.101 "name": "Existed_Raid", 00:13:58.101 "aliases": [ 00:13:58.102 "b1df588f-a2a1-4f4d-afea-827c85dcdcba" 00:13:58.102 ], 00:13:58.102 "product_name": "Raid Volume", 00:13:58.102 "block_size": 512, 00:13:58.102 "num_blocks": 131072, 00:13:58.102 "uuid": "b1df588f-a2a1-4f4d-afea-827c85dcdcba", 00:13:58.102 "assigned_rate_limits": { 00:13:58.102 "rw_ios_per_sec": 0, 00:13:58.102 "rw_mbytes_per_sec": 0, 00:13:58.102 "r_mbytes_per_sec": 0, 00:13:58.102 "w_mbytes_per_sec": 0 00:13:58.102 }, 00:13:58.102 "claimed": false, 00:13:58.102 "zoned": false, 00:13:58.102 "supported_io_types": { 00:13:58.102 "read": true, 00:13:58.102 "write": true, 00:13:58.102 "unmap": true, 00:13:58.102 "flush": true, 00:13:58.102 "reset": true, 00:13:58.102 "nvme_admin": false, 00:13:58.102 "nvme_io": false, 00:13:58.102 "nvme_io_md": false, 00:13:58.102 "write_zeroes": true, 00:13:58.102 "zcopy": false, 00:13:58.102 "get_zone_info": false, 00:13:58.102 "zone_management": false, 00:13:58.102 "zone_append": false, 00:13:58.102 "compare": false, 00:13:58.102 "compare_and_write": false, 00:13:58.102 "abort": false, 00:13:58.102 "seek_hole": false, 00:13:58.102 "seek_data": false, 00:13:58.102 "copy": false, 00:13:58.102 "nvme_iov_md": false 00:13:58.102 }, 00:13:58.102 "memory_domains": [ 00:13:58.102 { 00:13:58.102 "dma_device_id": "system", 00:13:58.102 "dma_device_type": 1 00:13:58.102 }, 00:13:58.102 { 00:13:58.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.102 "dma_device_type": 2 00:13:58.102 }, 00:13:58.102 { 00:13:58.102 "dma_device_id": "system", 00:13:58.102 "dma_device_type": 1 00:13:58.102 }, 00:13:58.102 { 00:13:58.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.102 "dma_device_type": 2 00:13:58.102 } 00:13:58.102 ], 00:13:58.102 "driver_specific": { 00:13:58.102 "raid": { 00:13:58.102 "uuid": "b1df588f-a2a1-4f4d-afea-827c85dcdcba", 00:13:58.102 "strip_size_kb": 64, 00:13:58.102 "state": "online", 00:13:58.102 "raid_level": "raid0", 00:13:58.102 "superblock": false, 00:13:58.102 "num_base_bdevs": 2, 00:13:58.102 "num_base_bdevs_discovered": 2, 00:13:58.102 "num_base_bdevs_operational": 2, 00:13:58.102 "base_bdevs_list": [ 00:13:58.102 { 00:13:58.102 "name": "BaseBdev1", 00:13:58.102 "uuid": "6f26e73e-68e6-454a-b9cc-a23277539a36", 00:13:58.102 "is_configured": true, 00:13:58.102 "data_offset": 0, 00:13:58.102 "data_size": 65536 00:13:58.102 }, 00:13:58.102 { 00:13:58.102 "name": "BaseBdev2", 00:13:58.102 "uuid": "9cb9cb44-737e-43be-a24e-49e48ec3d9db", 00:13:58.102 "is_configured": true, 00:13:58.102 "data_offset": 0, 00:13:58.102 "data_size": 65536 00:13:58.102 } 00:13:58.102 ] 00:13:58.102 } 00:13:58.102 } 00:13:58.102 }' 00:13:58.102 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:13:58.361 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:58.361 BaseBdev2' 00:13:58.361 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:58.361 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:58.361 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:58.361 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:58.361 "name": "BaseBdev1", 00:13:58.361 "aliases": [ 00:13:58.361 "6f26e73e-68e6-454a-b9cc-a23277539a36" 00:13:58.361 ], 00:13:58.361 "product_name": "Malloc disk", 00:13:58.361 "block_size": 512, 00:13:58.361 "num_blocks": 65536, 00:13:58.361 "uuid": "6f26e73e-68e6-454a-b9cc-a23277539a36", 00:13:58.361 "assigned_rate_limits": { 00:13:58.361 "rw_ios_per_sec": 0, 00:13:58.361 "rw_mbytes_per_sec": 0, 00:13:58.361 "r_mbytes_per_sec": 0, 00:13:58.361 "w_mbytes_per_sec": 0 00:13:58.361 }, 00:13:58.361 "claimed": true, 00:13:58.361 "claim_type": "exclusive_write", 00:13:58.361 "zoned": false, 00:13:58.361 "supported_io_types": { 00:13:58.361 "read": true, 00:13:58.361 "write": true, 00:13:58.361 "unmap": true, 00:13:58.361 "flush": true, 00:13:58.361 "reset": true, 00:13:58.361 "nvme_admin": false, 00:13:58.361 "nvme_io": false, 00:13:58.361 "nvme_io_md": false, 00:13:58.361 "write_zeroes": true, 00:13:58.361 "zcopy": true, 00:13:58.361 "get_zone_info": false, 00:13:58.361 "zone_management": false, 00:13:58.361 "zone_append": false, 00:13:58.361 "compare": false, 00:13:58.361 "compare_and_write": false, 00:13:58.361 "abort": true, 00:13:58.361 "seek_hole": false, 00:13:58.361 "seek_data": false, 00:13:58.361 "copy": true, 00:13:58.361 "nvme_iov_md": false 00:13:58.361 }, 00:13:58.361 "memory_domains": [ 00:13:58.361 { 00:13:58.361 "dma_device_id": "system", 00:13:58.361 "dma_device_type": 1 00:13:58.361 }, 00:13:58.361 { 00:13:58.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.361 "dma_device_type": 2 00:13:58.362 } 00:13:58.362 ], 00:13:58.362 "driver_specific": {} 00:13:58.362 }' 00:13:58.362 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.620 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.620 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:58.620 06:30:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.620 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.620 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:58.620 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.621 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.621 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:58.621 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.880 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.880 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:58.880 06:30:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:58.880 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:58.880 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:58.880 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:58.880 "name": "BaseBdev2", 00:13:58.880 "aliases": [ 00:13:58.880 "9cb9cb44-737e-43be-a24e-49e48ec3d9db" 00:13:58.880 ], 00:13:58.880 "product_name": "Malloc disk", 00:13:58.880 "block_size": 512, 00:13:58.880 "num_blocks": 65536, 00:13:58.880 "uuid": "9cb9cb44-737e-43be-a24e-49e48ec3d9db", 00:13:58.880 "assigned_rate_limits": { 00:13:58.880 "rw_ios_per_sec": 0, 00:13:58.880 "rw_mbytes_per_sec": 0, 00:13:58.880 "r_mbytes_per_sec": 0, 00:13:58.880 "w_mbytes_per_sec": 0 00:13:58.880 }, 00:13:58.880 "claimed": true, 00:13:58.880 "claim_type": "exclusive_write", 00:13:58.880 "zoned": false, 00:13:58.880 "supported_io_types": { 00:13:58.880 "read": true, 00:13:58.880 "write": true, 00:13:58.880 "unmap": true, 00:13:58.880 "flush": true, 00:13:58.880 "reset": true, 00:13:58.880 "nvme_admin": false, 00:13:58.880 "nvme_io": false, 00:13:58.880 "nvme_io_md": false, 00:13:58.880 "write_zeroes": true, 00:13:58.880 "zcopy": true, 00:13:58.880 "get_zone_info": false, 00:13:58.880 "zone_management": false, 00:13:58.880 "zone_append": false, 00:13:58.880 "compare": false, 00:13:58.880 "compare_and_write": false, 00:13:58.880 "abort": true, 00:13:58.880 "seek_hole": false, 00:13:58.880 "seek_data": false, 00:13:58.880 "copy": true, 00:13:58.880 "nvme_iov_md": false 00:13:58.880 }, 00:13:58.880 "memory_domains": [ 00:13:58.880 { 00:13:58.880 "dma_device_id": "system", 00:13:58.880 "dma_device_type": 1 00:13:58.880 }, 00:13:58.880 { 00:13:58.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.880 "dma_device_type": 2 00:13:58.880 } 00:13:58.880 ], 00:13:58.880 "driver_specific": {} 00:13:58.880 }' 00:13:58.880 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:59.139 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.398 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.398 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:59.398 06:30:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete 
BaseBdev1 00:13:59.967 [2024-07-25 06:30:13.226992] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:59.967 [2024-07-25 06:30:13.227017] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:59.967 [2024-07-25 06:30:13.227056] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.967 "name": "Existed_Raid", 00:13:59.967 "uuid": "b1df588f-a2a1-4f4d-afea-827c85dcdcba", 00:13:59.967 "strip_size_kb": 64, 00:13:59.967 "state": "offline", 00:13:59.967 "raid_level": "raid0", 00:13:59.967 "superblock": false, 00:13:59.967 "num_base_bdevs": 2, 00:13:59.967 "num_base_bdevs_discovered": 1, 00:13:59.967 "num_base_bdevs_operational": 1, 00:13:59.967 "base_bdevs_list": [ 00:13:59.967 { 00:13:59.967 "name": null, 00:13:59.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.967 "is_configured": false, 00:13:59.967 "data_offset": 0, 00:13:59.967 "data_size": 65536 00:13:59.967 }, 00:13:59.967 { 00:13:59.967 "name": "BaseBdev2", 00:13:59.967 "uuid": "9cb9cb44-737e-43be-a24e-49e48ec3d9db", 00:13:59.967 "is_configured": true, 00:13:59.967 "data_offset": 0, 00:13:59.967 "data_size": 65536 00:13:59.967 } 00:13:59.967 ] 00:13:59.967 }' 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.967 06:30:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 
00:14:00.535 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:00.535 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:00.535 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.535 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:00.822 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:00.822 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:00.822 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:01.087 [2024-07-25 06:30:14.491423] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:01.087 [2024-07-25 06:30:14.491472] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1357120 name Existed_Raid, state offline 00:14:01.087 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:01.087 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:01.087 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.087 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1093937 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1093937 ']' 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1093937 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1093937 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1093937' 00:14:01.346 killing process with pid 1093937 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1093937 00:14:01.346 [2024-07-25 06:30:14.814091] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:01.346 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1093937 00:14:01.346 [2024-07-25 06:30:14.814957] bdev_raid.c:1399:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:14:01.606 06:30:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:01.606 00:14:01.606 real 0m10.119s 00:14:01.606 user 0m18.028s 00:14:01.606 sys 0m1.859s 00:14:01.606 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:01.606 06:30:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.606 ************************************ 00:14:01.606 END TEST raid_state_function_test 00:14:01.606 ************************************ 00:14:01.606 06:30:15 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:14:01.606 06:30:15 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:01.606 06:30:15 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:01.606 06:30:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:01.606 ************************************ 00:14:01.606 START TEST raid_state_function_test_sb 00:14:01.606 ************************************ 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 true 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1096319 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1096319' 00:14:01.606 Process raid pid: 1096319 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1096319 /var/tmp/spdk-raid.sock 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1096319 ']' 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:01.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:01.606 06:30:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.606 [2024-07-25 06:30:15.147317] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
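raid_state_function_test_sb repeats the same raid0 state checks with on-disk superblock metadata enabled, which only changes the create step: the harness passes '-s' to bdev_raid_create. A rough sketch of the launch and that first create, under the same assumptions as above (paths and socket taken from this run; the qat_pci_device_allocate()/EAL lines that follow are QAT probe output on this rig, and the earlier test in this log proceeds past the same messages):

  # Start the bare bdev service used by the state-function tests and point RPCs at its socket.
  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  # One simple way to wait until the RPC socket answers (the harness uses its waitforlisten helper).
  until ./scripts/rpc.py -s /var/tmp/spdk-raid.sock -t 1 rpc_get_methods &>/dev/null; do sleep 0.5; done
  # Create the array with a superblock (-s); both base bdevs are still missing,
  # so it stays in the "configuring" state until BaseBdev1 and BaseBdev2 exist.
  ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 \
    -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
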
00:14:01.606 [2024-07-25 06:30:15.147373] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:14:01.866 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device ... cannot be used (identical message pair repeated for every QAT virtual function 0000:3d:01.0-02.7 and 0000:3f:01.0-02.7; 32 repetitions condensed)
00:14:01.866 [2024-07-25 06:30:15.285125] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:14:01.866 [2024-07-25 06:30:15.330030] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:14:01.866 [2024-07-25 06:30:15.392638] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:14:01.866 [2024-07-25 06:30:15.392690] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:14:02.805 [2024-07-25 06:30:16.249566] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:14:02.805 [2024-07-25 06:30:16.249602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:14:02.805 [2024-07-25 06:30:16.249612] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:14:02.805 [2024-07-25 06:30:16.249623] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.805 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.065 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.065 "name": "Existed_Raid", 00:14:03.065 "uuid": "fa267d50-c93c-41f3-84d7-215697358072", 00:14:03.065 "strip_size_kb": 64, 00:14:03.065 "state": "configuring", 00:14:03.065 "raid_level": "raid0", 00:14:03.065 "superblock": true, 00:14:03.065 "num_base_bdevs": 2, 00:14:03.065 "num_base_bdevs_discovered": 0, 00:14:03.065 "num_base_bdevs_operational": 2, 00:14:03.065 "base_bdevs_list": [ 00:14:03.065 { 00:14:03.065 "name": "BaseBdev1", 00:14:03.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.065 "is_configured": false, 00:14:03.065 "data_offset": 0, 00:14:03.065 "data_size": 0 00:14:03.065 }, 00:14:03.065 { 00:14:03.065 "name": "BaseBdev2", 00:14:03.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.065 "is_configured": false, 00:14:03.065 "data_offset": 0, 00:14:03.065 "data_size": 0 00:14:03.065 } 00:14:03.065 ] 00:14:03.065 }' 00:14:03.065 06:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.065 06:30:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.634 06:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:03.893 [2024-07-25 06:30:17.260107] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:03.893 [2024-07-25 06:30:17.260130] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1008470 name Existed_Raid, state configuring 00:14:03.893 06:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:04.152 [2024-07-25 06:30:17.488720] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:04.152 [2024-07-25 06:30:17.488743] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:04.152 [2024-07-25 06:30:17.488752] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:04.152 [2024-07-25 06:30:17.488762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:04.152 06:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:04.411 [2024-07-25 06:30:17.722726] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:04.411 BaseBdev1 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.411 06:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:04.670 [ 00:14:04.670 { 00:14:04.670 "name": "BaseBdev1", 00:14:04.670 "aliases": [ 00:14:04.670 "9153aacd-e324-43ce-b2b8-e7800cbda38f" 00:14:04.670 ], 00:14:04.670 "product_name": "Malloc disk", 00:14:04.670 "block_size": 512, 00:14:04.670 "num_blocks": 65536, 00:14:04.670 "uuid": "9153aacd-e324-43ce-b2b8-e7800cbda38f", 00:14:04.670 "assigned_rate_limits": { 00:14:04.670 "rw_ios_per_sec": 0, 00:14:04.670 "rw_mbytes_per_sec": 0, 00:14:04.670 "r_mbytes_per_sec": 0, 00:14:04.670 "w_mbytes_per_sec": 0 00:14:04.670 }, 00:14:04.670 "claimed": true, 00:14:04.670 "claim_type": "exclusive_write", 00:14:04.670 "zoned": false, 00:14:04.670 "supported_io_types": { 00:14:04.670 "read": true, 00:14:04.670 "write": true, 00:14:04.670 "unmap": true, 00:14:04.670 "flush": true, 00:14:04.670 "reset": true, 00:14:04.670 "nvme_admin": false, 00:14:04.670 "nvme_io": false, 00:14:04.670 "nvme_io_md": false, 00:14:04.670 "write_zeroes": true, 00:14:04.670 "zcopy": true, 00:14:04.670 "get_zone_info": false, 00:14:04.670 "zone_management": false, 00:14:04.670 "zone_append": false, 00:14:04.670 "compare": false, 00:14:04.670 "compare_and_write": false, 00:14:04.670 "abort": true, 00:14:04.670 "seek_hole": false, 00:14:04.670 "seek_data": false, 00:14:04.670 "copy": true, 00:14:04.670 "nvme_iov_md": false 00:14:04.670 }, 00:14:04.670 "memory_domains": [ 00:14:04.671 { 00:14:04.671 "dma_device_id": "system", 00:14:04.671 "dma_device_type": 1 00:14:04.671 }, 00:14:04.671 { 00:14:04.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.671 "dma_device_type": 2 00:14:04.671 } 00:14:04.671 ], 00:14:04.671 "driver_specific": {} 00:14:04.671 } 00:14:04.671 ] 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.671 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.929 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.929 "name": "Existed_Raid", 00:14:04.929 "uuid": "8014aceb-8de3-47ff-ba24-71a40d252783", 00:14:04.929 "strip_size_kb": 64, 00:14:04.929 "state": "configuring", 00:14:04.929 "raid_level": "raid0", 00:14:04.929 "superblock": true, 00:14:04.929 "num_base_bdevs": 2, 00:14:04.929 "num_base_bdevs_discovered": 1, 00:14:04.929 "num_base_bdevs_operational": 2, 00:14:04.929 "base_bdevs_list": [ 00:14:04.929 { 00:14:04.929 "name": "BaseBdev1", 00:14:04.929 "uuid": "9153aacd-e324-43ce-b2b8-e7800cbda38f", 00:14:04.929 "is_configured": true, 00:14:04.929 "data_offset": 2048, 00:14:04.929 "data_size": 63488 00:14:04.929 }, 00:14:04.929 { 00:14:04.929 "name": "BaseBdev2", 00:14:04.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.929 "is_configured": false, 00:14:04.929 "data_offset": 0, 00:14:04.929 "data_size": 0 00:14:04.929 } 00:14:04.929 ] 00:14:04.929 }' 00:14:04.930 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.930 06:30:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.497 06:30:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:05.756 [2024-07-25 06:30:19.174555] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:05.756 [2024-07-25 06:30:19.174588] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1007ce0 name Existed_Raid, state configuring 00:14:05.756 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:06.014 [2024-07-25 06:30:19.403197] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:06.014 [2024-07-25 06:30:19.404572] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:06.014 [2024-07-25 06:30:19.404603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:06.014 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:06.014 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:06.014 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:14:06.015 06:30:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.015 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.273 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.273 "name": "Existed_Raid", 00:14:06.273 "uuid": "12da685a-6ff5-441b-94ce-6bbeaf3bef68", 00:14:06.273 "strip_size_kb": 64, 00:14:06.273 "state": "configuring", 00:14:06.273 "raid_level": "raid0", 00:14:06.273 "superblock": true, 00:14:06.273 "num_base_bdevs": 2, 00:14:06.273 "num_base_bdevs_discovered": 1, 00:14:06.273 "num_base_bdevs_operational": 2, 00:14:06.273 "base_bdevs_list": [ 00:14:06.273 { 00:14:06.273 "name": "BaseBdev1", 00:14:06.273 "uuid": "9153aacd-e324-43ce-b2b8-e7800cbda38f", 00:14:06.273 "is_configured": true, 00:14:06.273 "data_offset": 2048, 00:14:06.273 "data_size": 63488 00:14:06.273 }, 00:14:06.273 { 00:14:06.273 "name": "BaseBdev2", 00:14:06.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.273 "is_configured": false, 00:14:06.273 "data_offset": 0, 00:14:06.273 "data_size": 0 00:14:06.273 } 00:14:06.273 ] 00:14:06.273 }' 00:14:06.273 06:30:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.273 06:30:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:06.840 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:07.099 [2024-07-25 06:30:20.437054] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:07.099 [2024-07-25 06:30:20.437197] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11bb120 00:14:07.099 [2024-07-25 06:30:20.437210] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:07.099 [2024-07-25 06:30:20.437369] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1009050 00:14:07.099 [2024-07-25 06:30:20.437476] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11bb120 00:14:07.099 [2024-07-25 06:30:20.437485] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11bb120 
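The sequence traced above ends with Existed_Raid leaving the "configuring" state: once BaseBdev2 is registered and claimed, raid_bdev_configure_cont brings the raid online with a superblock, reserving 2048 blocks per member for metadata and leaving 63488 data blocks each. Outside the harness the same flow can be driven with the plain RPC client; the snippet below is an illustrative sketch using the names, sizes, and socket path from this log, not a copy of the test script.

    rpc() { ./scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    # register the raid first; with no members present it is reported in the "configuring" state
    rpc bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    # create the two 32 MiB, 512-byte-block malloc members; each is claimed as soon as it appears
    rpc bdev_malloc_create 32 512 -b BaseBdev1
    rpc bdev_malloc_create 32 512 -b BaseBdev2
    # with both members configured the raid shows up as "online"
    rpc bdev_raid_get_bdevs all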
00:14:07.099 [2024-07-25 06:30:20.437569] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:07.099 BaseBdev2 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:07.099 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:07.357 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:07.357 [ 00:14:07.357 { 00:14:07.357 "name": "BaseBdev2", 00:14:07.357 "aliases": [ 00:14:07.357 "78bc5eca-4862-48a8-b728-c7cf5c01734a" 00:14:07.357 ], 00:14:07.357 "product_name": "Malloc disk", 00:14:07.357 "block_size": 512, 00:14:07.357 "num_blocks": 65536, 00:14:07.357 "uuid": "78bc5eca-4862-48a8-b728-c7cf5c01734a", 00:14:07.357 "assigned_rate_limits": { 00:14:07.357 "rw_ios_per_sec": 0, 00:14:07.357 "rw_mbytes_per_sec": 0, 00:14:07.357 "r_mbytes_per_sec": 0, 00:14:07.357 "w_mbytes_per_sec": 0 00:14:07.357 }, 00:14:07.357 "claimed": true, 00:14:07.357 "claim_type": "exclusive_write", 00:14:07.357 "zoned": false, 00:14:07.357 "supported_io_types": { 00:14:07.357 "read": true, 00:14:07.357 "write": true, 00:14:07.357 "unmap": true, 00:14:07.357 "flush": true, 00:14:07.357 "reset": true, 00:14:07.357 "nvme_admin": false, 00:14:07.357 "nvme_io": false, 00:14:07.357 "nvme_io_md": false, 00:14:07.357 "write_zeroes": true, 00:14:07.357 "zcopy": true, 00:14:07.358 "get_zone_info": false, 00:14:07.358 "zone_management": false, 00:14:07.358 "zone_append": false, 00:14:07.358 "compare": false, 00:14:07.358 "compare_and_write": false, 00:14:07.358 "abort": true, 00:14:07.358 "seek_hole": false, 00:14:07.358 "seek_data": false, 00:14:07.358 "copy": true, 00:14:07.358 "nvme_iov_md": false 00:14:07.358 }, 00:14:07.358 "memory_domains": [ 00:14:07.358 { 00:14:07.358 "dma_device_id": "system", 00:14:07.358 "dma_device_type": 1 00:14:07.358 }, 00:14:07.358 { 00:14:07.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.358 "dma_device_type": 2 00:14:07.358 } 00:14:07.358 ], 00:14:07.358 "driver_specific": {} 00:14:07.358 } 00:14:07.358 ] 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.358 06:30:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:07.358 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.616 06:30:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.616 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.616 "name": "Existed_Raid", 00:14:07.616 "uuid": "12da685a-6ff5-441b-94ce-6bbeaf3bef68", 00:14:07.616 "strip_size_kb": 64, 00:14:07.616 "state": "online", 00:14:07.616 "raid_level": "raid0", 00:14:07.616 "superblock": true, 00:14:07.616 "num_base_bdevs": 2, 00:14:07.616 "num_base_bdevs_discovered": 2, 00:14:07.616 "num_base_bdevs_operational": 2, 00:14:07.616 "base_bdevs_list": [ 00:14:07.617 { 00:14:07.617 "name": "BaseBdev1", 00:14:07.617 "uuid": "9153aacd-e324-43ce-b2b8-e7800cbda38f", 00:14:07.617 "is_configured": true, 00:14:07.617 "data_offset": 2048, 00:14:07.617 "data_size": 63488 00:14:07.617 }, 00:14:07.617 { 00:14:07.617 "name": "BaseBdev2", 00:14:07.617 "uuid": "78bc5eca-4862-48a8-b728-c7cf5c01734a", 00:14:07.617 "is_configured": true, 00:14:07.617 "data_offset": 2048, 00:14:07.617 "data_size": 63488 00:14:07.617 } 00:14:07.617 ] 00:14:07.617 }' 00:14:07.617 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.617 06:30:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:08.183 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:08.441 [2024-07-25 06:30:21.921223] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
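verify_raid_bdev_state and verify_raid_bdev_properties, traced above, both work by dumping the raid descriptor as JSON and checking individual fields with jq. A hedged sketch of an equivalent hand-rolled check, using only the RPCs and field names visible in the dumps in this log, might look like this:

    sock=/var/tmp/spdk-raid.sock
    info=$(./scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    # assert on the fields the test compares: state, level, strip size, member counts
    [ "$(jq -r '.state' <<< "$info")" = online ] || exit 1
    [ "$(jq -r '.raid_level' <<< "$info")" = raid0 ] || exit 1
    [ "$(jq -r '.strip_size_kb' <<< "$info")" -eq 64 ] || exit 1
    [ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq 2 ] || exit 1
    # per-bdev properties (block size, metadata, DIF) come from the generic bdev_get_bdevs dump
    ./scripts/rpc.py -s "$sock" bdev_get_bdevs -b Existed_Raid | jq '.[0].block_size'    # expected: 512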
00:14:08.441 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:08.441 "name": "Existed_Raid", 00:14:08.441 "aliases": [ 00:14:08.441 "12da685a-6ff5-441b-94ce-6bbeaf3bef68" 00:14:08.441 ], 00:14:08.441 "product_name": "Raid Volume", 00:14:08.441 "block_size": 512, 00:14:08.441 "num_blocks": 126976, 00:14:08.441 "uuid": "12da685a-6ff5-441b-94ce-6bbeaf3bef68", 00:14:08.441 "assigned_rate_limits": { 00:14:08.441 "rw_ios_per_sec": 0, 00:14:08.441 "rw_mbytes_per_sec": 0, 00:14:08.441 "r_mbytes_per_sec": 0, 00:14:08.441 "w_mbytes_per_sec": 0 00:14:08.441 }, 00:14:08.441 "claimed": false, 00:14:08.441 "zoned": false, 00:14:08.441 "supported_io_types": { 00:14:08.441 "read": true, 00:14:08.441 "write": true, 00:14:08.441 "unmap": true, 00:14:08.441 "flush": true, 00:14:08.441 "reset": true, 00:14:08.441 "nvme_admin": false, 00:14:08.441 "nvme_io": false, 00:14:08.441 "nvme_io_md": false, 00:14:08.441 "write_zeroes": true, 00:14:08.441 "zcopy": false, 00:14:08.441 "get_zone_info": false, 00:14:08.441 "zone_management": false, 00:14:08.441 "zone_append": false, 00:14:08.441 "compare": false, 00:14:08.441 "compare_and_write": false, 00:14:08.441 "abort": false, 00:14:08.441 "seek_hole": false, 00:14:08.441 "seek_data": false, 00:14:08.441 "copy": false, 00:14:08.441 "nvme_iov_md": false 00:14:08.441 }, 00:14:08.441 "memory_domains": [ 00:14:08.441 { 00:14:08.441 "dma_device_id": "system", 00:14:08.441 "dma_device_type": 1 00:14:08.441 }, 00:14:08.441 { 00:14:08.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.441 "dma_device_type": 2 00:14:08.441 }, 00:14:08.441 { 00:14:08.441 "dma_device_id": "system", 00:14:08.441 "dma_device_type": 1 00:14:08.441 }, 00:14:08.441 { 00:14:08.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.441 "dma_device_type": 2 00:14:08.441 } 00:14:08.441 ], 00:14:08.441 "driver_specific": { 00:14:08.441 "raid": { 00:14:08.441 "uuid": "12da685a-6ff5-441b-94ce-6bbeaf3bef68", 00:14:08.441 "strip_size_kb": 64, 00:14:08.441 "state": "online", 00:14:08.441 "raid_level": "raid0", 00:14:08.441 "superblock": true, 00:14:08.441 "num_base_bdevs": 2, 00:14:08.441 "num_base_bdevs_discovered": 2, 00:14:08.441 "num_base_bdevs_operational": 2, 00:14:08.441 "base_bdevs_list": [ 00:14:08.441 { 00:14:08.441 "name": "BaseBdev1", 00:14:08.441 "uuid": "9153aacd-e324-43ce-b2b8-e7800cbda38f", 00:14:08.441 "is_configured": true, 00:14:08.441 "data_offset": 2048, 00:14:08.441 "data_size": 63488 00:14:08.441 }, 00:14:08.441 { 00:14:08.441 "name": "BaseBdev2", 00:14:08.441 "uuid": "78bc5eca-4862-48a8-b728-c7cf5c01734a", 00:14:08.441 "is_configured": true, 00:14:08.441 "data_offset": 2048, 00:14:08.441 "data_size": 63488 00:14:08.441 } 00:14:08.441 ] 00:14:08.441 } 00:14:08.441 } 00:14:08.441 }' 00:14:08.441 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:08.441 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:08.441 BaseBdev2' 00:14:08.441 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.441 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:08.441 06:30:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:08.700 06:30:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:08.700 "name": "BaseBdev1", 00:14:08.700 "aliases": [ 00:14:08.700 "9153aacd-e324-43ce-b2b8-e7800cbda38f" 00:14:08.700 ], 00:14:08.700 "product_name": "Malloc disk", 00:14:08.700 "block_size": 512, 00:14:08.700 "num_blocks": 65536, 00:14:08.700 "uuid": "9153aacd-e324-43ce-b2b8-e7800cbda38f", 00:14:08.700 "assigned_rate_limits": { 00:14:08.700 "rw_ios_per_sec": 0, 00:14:08.700 "rw_mbytes_per_sec": 0, 00:14:08.700 "r_mbytes_per_sec": 0, 00:14:08.700 "w_mbytes_per_sec": 0 00:14:08.700 }, 00:14:08.700 "claimed": true, 00:14:08.700 "claim_type": "exclusive_write", 00:14:08.700 "zoned": false, 00:14:08.700 "supported_io_types": { 00:14:08.700 "read": true, 00:14:08.700 "write": true, 00:14:08.700 "unmap": true, 00:14:08.700 "flush": true, 00:14:08.700 "reset": true, 00:14:08.700 "nvme_admin": false, 00:14:08.700 "nvme_io": false, 00:14:08.700 "nvme_io_md": false, 00:14:08.700 "write_zeroes": true, 00:14:08.700 "zcopy": true, 00:14:08.700 "get_zone_info": false, 00:14:08.700 "zone_management": false, 00:14:08.700 "zone_append": false, 00:14:08.700 "compare": false, 00:14:08.700 "compare_and_write": false, 00:14:08.700 "abort": true, 00:14:08.700 "seek_hole": false, 00:14:08.700 "seek_data": false, 00:14:08.700 "copy": true, 00:14:08.700 "nvme_iov_md": false 00:14:08.700 }, 00:14:08.700 "memory_domains": [ 00:14:08.700 { 00:14:08.700 "dma_device_id": "system", 00:14:08.700 "dma_device_type": 1 00:14:08.700 }, 00:14:08.700 { 00:14:08.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.700 "dma_device_type": 2 00:14:08.700 } 00:14:08.700 ], 00:14:08.700 "driver_specific": {} 00:14:08.700 }' 00:14:08.700 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.958 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.217 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.217 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.217 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:09.217 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:09.217 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.475 "name": "BaseBdev2", 00:14:09.475 
"aliases": [ 00:14:09.475 "78bc5eca-4862-48a8-b728-c7cf5c01734a" 00:14:09.475 ], 00:14:09.475 "product_name": "Malloc disk", 00:14:09.475 "block_size": 512, 00:14:09.475 "num_blocks": 65536, 00:14:09.475 "uuid": "78bc5eca-4862-48a8-b728-c7cf5c01734a", 00:14:09.475 "assigned_rate_limits": { 00:14:09.475 "rw_ios_per_sec": 0, 00:14:09.475 "rw_mbytes_per_sec": 0, 00:14:09.475 "r_mbytes_per_sec": 0, 00:14:09.475 "w_mbytes_per_sec": 0 00:14:09.475 }, 00:14:09.475 "claimed": true, 00:14:09.475 "claim_type": "exclusive_write", 00:14:09.475 "zoned": false, 00:14:09.475 "supported_io_types": { 00:14:09.475 "read": true, 00:14:09.475 "write": true, 00:14:09.475 "unmap": true, 00:14:09.475 "flush": true, 00:14:09.475 "reset": true, 00:14:09.475 "nvme_admin": false, 00:14:09.475 "nvme_io": false, 00:14:09.475 "nvme_io_md": false, 00:14:09.475 "write_zeroes": true, 00:14:09.475 "zcopy": true, 00:14:09.475 "get_zone_info": false, 00:14:09.475 "zone_management": false, 00:14:09.475 "zone_append": false, 00:14:09.475 "compare": false, 00:14:09.475 "compare_and_write": false, 00:14:09.475 "abort": true, 00:14:09.475 "seek_hole": false, 00:14:09.475 "seek_data": false, 00:14:09.475 "copy": true, 00:14:09.475 "nvme_iov_md": false 00:14:09.475 }, 00:14:09.475 "memory_domains": [ 00:14:09.475 { 00:14:09.475 "dma_device_id": "system", 00:14:09.475 "dma_device_type": 1 00:14:09.475 }, 00:14:09.475 { 00:14:09.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.475 "dma_device_type": 2 00:14:09.475 } 00:14:09.475 ], 00:14:09.475 "driver_specific": {} 00:14:09.475 }' 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.475 06:30:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.475 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.734 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.734 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.734 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.734 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.734 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:09.996 [2024-07-25 06:30:23.360931] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:09.996 [2024-07-25 06:30:23.360954] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:09.996 [2024-07-25 06:30:23.360990] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:09.996 06:30:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.996 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.255 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.255 "name": "Existed_Raid", 00:14:10.255 "uuid": "12da685a-6ff5-441b-94ce-6bbeaf3bef68", 00:14:10.255 "strip_size_kb": 64, 00:14:10.255 "state": "offline", 00:14:10.255 "raid_level": "raid0", 00:14:10.255 "superblock": true, 00:14:10.255 "num_base_bdevs": 2, 00:14:10.255 "num_base_bdevs_discovered": 1, 00:14:10.255 "num_base_bdevs_operational": 1, 00:14:10.255 "base_bdevs_list": [ 00:14:10.255 { 00:14:10.255 "name": null, 00:14:10.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.255 "is_configured": false, 00:14:10.255 "data_offset": 2048, 00:14:10.255 "data_size": 63488 00:14:10.255 }, 00:14:10.255 { 00:14:10.255 "name": "BaseBdev2", 00:14:10.255 "uuid": "78bc5eca-4862-48a8-b728-c7cf5c01734a", 00:14:10.255 "is_configured": true, 00:14:10.255 "data_offset": 2048, 00:14:10.255 "data_size": 63488 00:14:10.255 } 00:14:10.255 ] 00:14:10.255 }' 00:14:10.255 06:30:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.255 06:30:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:10.821 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:10.821 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:10.821 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.821 06:30:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:10.821 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:10.821 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:10.821 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:11.080 [2024-07-25 06:30:24.561121] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:11.080 [2024-07-25 06:30:24.561169] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11bb120 name Existed_Raid, state offline 00:14:11.080 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:11.080 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:11.080 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.080 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1096319 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1096319 ']' 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1096319 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1096319 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1096319' 00:14:11.338 killing process with pid 1096319 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1096319 00:14:11.338 [2024-07-25 06:30:24.873589] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:11.338 06:30:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1096319 00:14:11.338 [2024-07-25 06:30:24.874441] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:11.596 06:30:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:11.596 00:14:11.596 real 0m9.971s 00:14:11.596 user 0m17.741s 00:14:11.596 sys 0m1.886s 00:14:11.596 06:30:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:11.596 06:30:25 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.596 ************************************ 00:14:11.596 END TEST raid_state_function_test_sb 00:14:11.596 ************************************ 00:14:11.596 06:30:25 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:14:11.596 06:30:25 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:14:11.596 06:30:25 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:11.596 06:30:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:11.596 ************************************ 00:14:11.596 START TEST raid_superblock_test 00:14:11.596 ************************************ 00:14:11.596 06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 2 00:14:11.596 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:14:11.596 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:14:11.596 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:14:11.596 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:14:11.596 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1098156 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1098156 /var/tmp/spdk-raid.sock 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1098156 ']' 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:14:11.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable
00:14:11.597 06:30:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:14:11.855 [2024-07-25 06:30:25.202966] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
00:14:11.855 [2024-07-25 06:30:25.203023] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1098156 ]
00:14:11.855 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device ... cannot be used (identical message pair repeated for every QAT virtual function 0000:3d:01.0-02.7 and 0000:3f:01.0-02.7; 32 repetitions condensed)
00:14:11.855 [2024-07-25 06:30:25.330635] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:14:12.113 [2024-07-25 06:30:25.375789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:14:12.113 [2024-07-25 06:30:25.432798] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:14:12.113 [2024-07-25 06:30:25.432830] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 ))
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs ))
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc)
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt)
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid)
00:14:12.679 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
00:14:12.938 malloc1
00:14:12.938 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:13.196 [2024-07-25 06:30:26.539009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:13.196 [2024-07-25 06:30:26.539057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:13.196 [2024-07-25 06:30:26.539075] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2269d70 00:14:13.196 [2024-07-25 06:30:26.539087] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:13.196 [2024-07-25 06:30:26.540499] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:13.196 [2024-07-25 06:30:26.540529] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:13.196 pt1 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:13.196 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:13.455 malloc2 00:14:13.455 06:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:13.455 [2024-07-25 06:30:27.000624] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:13.455 [2024-07-25 06:30:27.000664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:13.455 [2024-07-25 06:30:27.000680] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b8790 00:14:13.455 [2024-07-25 06:30:27.000692] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:13.455 [2024-07-25 06:30:27.001910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:13.455 [2024-07-25 06:30:27.001936] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:13.455 pt2 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:14:13.742 [2024-07-25 06:30:27.225237] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:14:13.742 [2024-07-25 06:30:27.226340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:13.742 [2024-07-25 06:30:27.226471] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x225b1c0 00:14:13.742 [2024-07-25 06:30:27.226483] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:13.742 [2024-07-25 06:30:27.226644] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20b66e0 00:14:13.742 [2024-07-25 06:30:27.226769] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x225b1c0 00:14:13.742 [2024-07-25 06:30:27.226778] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x225b1c0 00:14:13.742 [2024-07-25 06:30:27.226858] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.742 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:14.000 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.000 "name": "raid_bdev1", 00:14:14.000 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:14.000 "strip_size_kb": 64, 00:14:14.000 "state": "online", 00:14:14.000 "raid_level": "raid0", 00:14:14.000 "superblock": true, 00:14:14.000 "num_base_bdevs": 2, 00:14:14.000 "num_base_bdevs_discovered": 2, 00:14:14.000 "num_base_bdevs_operational": 2, 00:14:14.000 "base_bdevs_list": [ 00:14:14.000 { 00:14:14.000 "name": "pt1", 00:14:14.000 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:14.000 "is_configured": true, 00:14:14.000 "data_offset": 2048, 00:14:14.000 "data_size": 63488 00:14:14.000 }, 00:14:14.000 { 00:14:14.000 "name": "pt2", 00:14:14.000 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:14.000 "is_configured": true, 00:14:14.000 "data_offset": 2048, 00:14:14.000 "data_size": 63488 00:14:14.000 } 00:14:14.000 ] 00:14:14.000 }' 00:14:14.000 06:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.000 06:30:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:14.566 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:14.825 [2024-07-25 06:30:28.232093] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:14.825 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:14.825 "name": "raid_bdev1", 00:14:14.825 "aliases": [ 00:14:14.825 "a7d546f9-fc11-48ba-a8e0-0070e184832e" 00:14:14.825 ], 00:14:14.825 "product_name": "Raid Volume", 00:14:14.825 "block_size": 512, 00:14:14.825 "num_blocks": 126976, 00:14:14.825 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:14.825 "assigned_rate_limits": { 00:14:14.825 "rw_ios_per_sec": 0, 00:14:14.825 "rw_mbytes_per_sec": 0, 00:14:14.825 "r_mbytes_per_sec": 0, 00:14:14.825 "w_mbytes_per_sec": 0 00:14:14.825 }, 00:14:14.825 "claimed": false, 00:14:14.825 "zoned": false, 00:14:14.825 "supported_io_types": { 00:14:14.825 "read": true, 00:14:14.825 "write": true, 00:14:14.825 "unmap": true, 00:14:14.825 "flush": true, 00:14:14.825 "reset": true, 00:14:14.825 "nvme_admin": false, 00:14:14.825 "nvme_io": false, 00:14:14.825 "nvme_io_md": false, 00:14:14.825 "write_zeroes": true, 00:14:14.825 "zcopy": false, 00:14:14.825 "get_zone_info": false, 00:14:14.825 "zone_management": false, 00:14:14.825 "zone_append": false, 00:14:14.825 "compare": false, 00:14:14.825 "compare_and_write": false, 00:14:14.825 "abort": false, 00:14:14.825 "seek_hole": false, 00:14:14.825 "seek_data": false, 00:14:14.825 "copy": false, 00:14:14.825 "nvme_iov_md": false 00:14:14.825 }, 00:14:14.825 "memory_domains": [ 00:14:14.825 { 00:14:14.825 "dma_device_id": "system", 00:14:14.825 "dma_device_type": 1 00:14:14.825 }, 00:14:14.825 { 00:14:14.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.825 "dma_device_type": 2 00:14:14.825 }, 00:14:14.825 { 00:14:14.825 "dma_device_id": "system", 00:14:14.825 "dma_device_type": 1 00:14:14.825 }, 00:14:14.825 { 00:14:14.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.825 "dma_device_type": 2 00:14:14.825 } 00:14:14.825 ], 00:14:14.825 "driver_specific": { 00:14:14.825 "raid": { 00:14:14.825 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:14.825 "strip_size_kb": 64, 00:14:14.825 "state": "online", 00:14:14.825 "raid_level": "raid0", 00:14:14.825 "superblock": true, 00:14:14.825 "num_base_bdevs": 2, 00:14:14.825 "num_base_bdevs_discovered": 2, 00:14:14.825 "num_base_bdevs_operational": 2, 00:14:14.825 "base_bdevs_list": [ 00:14:14.825 { 00:14:14.825 "name": "pt1", 00:14:14.825 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:14.825 "is_configured": true, 00:14:14.825 "data_offset": 2048, 00:14:14.825 "data_size": 63488 00:14:14.825 }, 00:14:14.825 { 00:14:14.825 "name": "pt2", 00:14:14.825 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:14:14.825 "is_configured": true, 00:14:14.825 "data_offset": 2048, 00:14:14.825 "data_size": 63488 00:14:14.825 } 00:14:14.825 ] 00:14:14.825 } 00:14:14.825 } 00:14:14.825 }' 00:14:14.825 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:14.825 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:14.825 pt2' 00:14:14.825 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:14.825 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:14.825 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.083 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.083 "name": "pt1", 00:14:15.083 "aliases": [ 00:14:15.083 "00000000-0000-0000-0000-000000000001" 00:14:15.083 ], 00:14:15.083 "product_name": "passthru", 00:14:15.083 "block_size": 512, 00:14:15.083 "num_blocks": 65536, 00:14:15.083 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:15.083 "assigned_rate_limits": { 00:14:15.083 "rw_ios_per_sec": 0, 00:14:15.083 "rw_mbytes_per_sec": 0, 00:14:15.083 "r_mbytes_per_sec": 0, 00:14:15.083 "w_mbytes_per_sec": 0 00:14:15.083 }, 00:14:15.083 "claimed": true, 00:14:15.083 "claim_type": "exclusive_write", 00:14:15.083 "zoned": false, 00:14:15.083 "supported_io_types": { 00:14:15.083 "read": true, 00:14:15.083 "write": true, 00:14:15.083 "unmap": true, 00:14:15.083 "flush": true, 00:14:15.083 "reset": true, 00:14:15.083 "nvme_admin": false, 00:14:15.083 "nvme_io": false, 00:14:15.083 "nvme_io_md": false, 00:14:15.083 "write_zeroes": true, 00:14:15.083 "zcopy": true, 00:14:15.083 "get_zone_info": false, 00:14:15.083 "zone_management": false, 00:14:15.083 "zone_append": false, 00:14:15.084 "compare": false, 00:14:15.084 "compare_and_write": false, 00:14:15.084 "abort": true, 00:14:15.084 "seek_hole": false, 00:14:15.084 "seek_data": false, 00:14:15.084 "copy": true, 00:14:15.084 "nvme_iov_md": false 00:14:15.084 }, 00:14:15.084 "memory_domains": [ 00:14:15.084 { 00:14:15.084 "dma_device_id": "system", 00:14:15.084 "dma_device_type": 1 00:14:15.084 }, 00:14:15.084 { 00:14:15.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.084 "dma_device_type": 2 00:14:15.084 } 00:14:15.084 ], 00:14:15.084 "driver_specific": { 00:14:15.084 "passthru": { 00:14:15.084 "name": "pt1", 00:14:15.084 "base_bdev_name": "malloc1" 00:14:15.084 } 00:14:15.084 } 00:14:15.084 }' 00:14:15.084 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.084 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.084 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.084 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.084 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.342 06:30:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:15.342 06:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.600 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.600 "name": "pt2", 00:14:15.600 "aliases": [ 00:14:15.600 "00000000-0000-0000-0000-000000000002" 00:14:15.600 ], 00:14:15.600 "product_name": "passthru", 00:14:15.600 "block_size": 512, 00:14:15.600 "num_blocks": 65536, 00:14:15.600 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:15.600 "assigned_rate_limits": { 00:14:15.600 "rw_ios_per_sec": 0, 00:14:15.600 "rw_mbytes_per_sec": 0, 00:14:15.600 "r_mbytes_per_sec": 0, 00:14:15.600 "w_mbytes_per_sec": 0 00:14:15.600 }, 00:14:15.600 "claimed": true, 00:14:15.600 "claim_type": "exclusive_write", 00:14:15.600 "zoned": false, 00:14:15.600 "supported_io_types": { 00:14:15.600 "read": true, 00:14:15.600 "write": true, 00:14:15.600 "unmap": true, 00:14:15.600 "flush": true, 00:14:15.600 "reset": true, 00:14:15.600 "nvme_admin": false, 00:14:15.600 "nvme_io": false, 00:14:15.600 "nvme_io_md": false, 00:14:15.600 "write_zeroes": true, 00:14:15.600 "zcopy": true, 00:14:15.600 "get_zone_info": false, 00:14:15.600 "zone_management": false, 00:14:15.600 "zone_append": false, 00:14:15.600 "compare": false, 00:14:15.600 "compare_and_write": false, 00:14:15.600 "abort": true, 00:14:15.600 "seek_hole": false, 00:14:15.600 "seek_data": false, 00:14:15.600 "copy": true, 00:14:15.600 "nvme_iov_md": false 00:14:15.601 }, 00:14:15.601 "memory_domains": [ 00:14:15.601 { 00:14:15.601 "dma_device_id": "system", 00:14:15.601 "dma_device_type": 1 00:14:15.601 }, 00:14:15.601 { 00:14:15.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.601 "dma_device_type": 2 00:14:15.601 } 00:14:15.601 ], 00:14:15.601 "driver_specific": { 00:14:15.601 "passthru": { 00:14:15.601 "name": "pt2", 00:14:15.601 "base_bdev_name": "malloc2" 00:14:15.601 } 00:14:15.601 } 00:14:15.601 }' 00:14:15.601 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.601 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.601 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.601 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:15.859 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:14:16.117 [2024-07-25 06:30:29.575616] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:16.117 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=a7d546f9-fc11-48ba-a8e0-0070e184832e 00:14:16.117 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z a7d546f9-fc11-48ba-a8e0-0070e184832e ']' 00:14:16.117 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:16.375 [2024-07-25 06:30:29.799967] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:16.375 [2024-07-25 06:30:29.799984] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:16.375 [2024-07-25 06:30:29.800036] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:16.375 [2024-07-25 06:30:29.800076] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:16.375 [2024-07-25 06:30:29.800087] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225b1c0 name raid_bdev1, state offline 00:14:16.375 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.375 06:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:14:16.633 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:14:16.633 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:14:16.633 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:16.633 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:16.892 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:16.892 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:17.150 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:17.150 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 
'malloc1 malloc2' -n raid_bdev1 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:17.409 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:14:17.409 [2024-07-25 06:30:30.954972] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:17.409 [2024-07-25 06:30:30.956217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:17.409 [2024-07-25 06:30:30.956267] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:17.409 [2024-07-25 06:30:30.956304] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:17.409 [2024-07-25 06:30:30.956321] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:17.409 [2024-07-25 06:30:30.956330] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225c5a0 name raid_bdev1, state configuring 00:14:17.409 request: 00:14:17.409 { 00:14:17.409 "name": "raid_bdev1", 00:14:17.409 "raid_level": "raid0", 00:14:17.409 "base_bdevs": [ 00:14:17.409 "malloc1", 00:14:17.409 "malloc2" 00:14:17.409 ], 00:14:17.409 "strip_size_kb": 64, 00:14:17.409 "superblock": false, 00:14:17.409 "method": "bdev_raid_create", 00:14:17.409 "req_id": 1 00:14:17.409 } 00:14:17.409 Got JSON-RPC error response 00:14:17.409 response: 00:14:17.409 { 00:14:17.409 "code": -17, 00:14:17.409 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:17.409 } 00:14:17.667 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:14:17.667 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:17.667 06:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:17.667 06:30:30 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:17.667 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.667 06:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:14:17.667 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:14:17.667 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:14:17.667 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:17.925 [2024-07-25 06:30:31.412132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:17.925 [2024-07-25 06:30:31.412182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.925 [2024-07-25 06:30:31.412198] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225af60 00:14:17.925 [2024-07-25 06:30:31.412210] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.925 [2024-07-25 06:30:31.413693] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.925 [2024-07-25 06:30:31.413721] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:17.925 [2024-07-25 06:30:31.413790] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:17.925 [2024-07-25 06:30:31.413812] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:17.925 pt1 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.925 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:18.184 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.184 "name": "raid_bdev1", 00:14:18.184 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:18.184 "strip_size_kb": 64, 00:14:18.184 "state": "configuring", 00:14:18.184 "raid_level": "raid0", 00:14:18.184 
"superblock": true, 00:14:18.184 "num_base_bdevs": 2, 00:14:18.184 "num_base_bdevs_discovered": 1, 00:14:18.184 "num_base_bdevs_operational": 2, 00:14:18.184 "base_bdevs_list": [ 00:14:18.184 { 00:14:18.184 "name": "pt1", 00:14:18.184 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:18.184 "is_configured": true, 00:14:18.184 "data_offset": 2048, 00:14:18.184 "data_size": 63488 00:14:18.184 }, 00:14:18.184 { 00:14:18.184 "name": null, 00:14:18.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:18.184 "is_configured": false, 00:14:18.184 "data_offset": 2048, 00:14:18.184 "data_size": 63488 00:14:18.184 } 00:14:18.184 ] 00:14:18.184 }' 00:14:18.184 06:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.184 06:30:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.751 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:14:18.751 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:14:18.751 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:18.751 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:19.009 [2024-07-25 06:30:32.410777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:19.009 [2024-07-25 06:30:32.410823] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.009 [2024-07-25 06:30:32.410840] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b67f0 00:14:19.009 [2024-07-25 06:30:32.410851] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.009 [2024-07-25 06:30:32.411186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.009 [2024-07-25 06:30:32.411203] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:19.009 [2024-07-25 06:30:32.411261] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:19.009 [2024-07-25 06:30:32.411280] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:19.009 [2024-07-25 06:30:32.411370] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x225f700 00:14:19.009 [2024-07-25 06:30:32.411380] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:19.009 [2024-07-25 06:30:32.411537] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2260910 00:14:19.009 [2024-07-25 06:30:32.411659] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x225f700 00:14:19.009 [2024-07-25 06:30:32.411668] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x225f700 00:14:19.009 [2024-07-25 06:30:32.411756] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:19.009 pt2 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:19.009 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.269 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.269 "name": "raid_bdev1", 00:14:19.269 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:19.269 "strip_size_kb": 64, 00:14:19.269 "state": "online", 00:14:19.269 "raid_level": "raid0", 00:14:19.269 "superblock": true, 00:14:19.269 "num_base_bdevs": 2, 00:14:19.269 "num_base_bdevs_discovered": 2, 00:14:19.269 "num_base_bdevs_operational": 2, 00:14:19.269 "base_bdevs_list": [ 00:14:19.269 { 00:14:19.269 "name": "pt1", 00:14:19.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:19.269 "is_configured": true, 00:14:19.269 "data_offset": 2048, 00:14:19.269 "data_size": 63488 00:14:19.269 }, 00:14:19.269 { 00:14:19.269 "name": "pt2", 00:14:19.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.269 "is_configured": true, 00:14:19.269 "data_offset": 2048, 00:14:19.269 "data_size": 63488 00:14:19.269 } 00:14:19.269 ] 00:14:19.269 }' 00:14:19.269 06:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.269 06:30:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.836 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:14:19.836 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:19.837 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:19.837 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:19.837 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:19.837 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:19.837 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:19.837 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:20.094 [2024-07-25 06:30:33.449756] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.094 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:20.094 
"name": "raid_bdev1", 00:14:20.094 "aliases": [ 00:14:20.094 "a7d546f9-fc11-48ba-a8e0-0070e184832e" 00:14:20.094 ], 00:14:20.094 "product_name": "Raid Volume", 00:14:20.094 "block_size": 512, 00:14:20.094 "num_blocks": 126976, 00:14:20.094 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:20.094 "assigned_rate_limits": { 00:14:20.094 "rw_ios_per_sec": 0, 00:14:20.094 "rw_mbytes_per_sec": 0, 00:14:20.094 "r_mbytes_per_sec": 0, 00:14:20.094 "w_mbytes_per_sec": 0 00:14:20.094 }, 00:14:20.094 "claimed": false, 00:14:20.094 "zoned": false, 00:14:20.094 "supported_io_types": { 00:14:20.094 "read": true, 00:14:20.094 "write": true, 00:14:20.094 "unmap": true, 00:14:20.094 "flush": true, 00:14:20.094 "reset": true, 00:14:20.094 "nvme_admin": false, 00:14:20.094 "nvme_io": false, 00:14:20.094 "nvme_io_md": false, 00:14:20.094 "write_zeroes": true, 00:14:20.094 "zcopy": false, 00:14:20.094 "get_zone_info": false, 00:14:20.094 "zone_management": false, 00:14:20.094 "zone_append": false, 00:14:20.094 "compare": false, 00:14:20.094 "compare_and_write": false, 00:14:20.094 "abort": false, 00:14:20.094 "seek_hole": false, 00:14:20.094 "seek_data": false, 00:14:20.094 "copy": false, 00:14:20.094 "nvme_iov_md": false 00:14:20.094 }, 00:14:20.094 "memory_domains": [ 00:14:20.094 { 00:14:20.094 "dma_device_id": "system", 00:14:20.094 "dma_device_type": 1 00:14:20.094 }, 00:14:20.094 { 00:14:20.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.094 "dma_device_type": 2 00:14:20.094 }, 00:14:20.094 { 00:14:20.094 "dma_device_id": "system", 00:14:20.094 "dma_device_type": 1 00:14:20.094 }, 00:14:20.094 { 00:14:20.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.094 "dma_device_type": 2 00:14:20.094 } 00:14:20.094 ], 00:14:20.094 "driver_specific": { 00:14:20.094 "raid": { 00:14:20.094 "uuid": "a7d546f9-fc11-48ba-a8e0-0070e184832e", 00:14:20.094 "strip_size_kb": 64, 00:14:20.094 "state": "online", 00:14:20.094 "raid_level": "raid0", 00:14:20.094 "superblock": true, 00:14:20.094 "num_base_bdevs": 2, 00:14:20.094 "num_base_bdevs_discovered": 2, 00:14:20.094 "num_base_bdevs_operational": 2, 00:14:20.094 "base_bdevs_list": [ 00:14:20.094 { 00:14:20.094 "name": "pt1", 00:14:20.094 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.094 "is_configured": true, 00:14:20.094 "data_offset": 2048, 00:14:20.094 "data_size": 63488 00:14:20.094 }, 00:14:20.094 { 00:14:20.094 "name": "pt2", 00:14:20.094 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.094 "is_configured": true, 00:14:20.094 "data_offset": 2048, 00:14:20.094 "data_size": 63488 00:14:20.094 } 00:14:20.094 ] 00:14:20.094 } 00:14:20.094 } 00:14:20.094 }' 00:14:20.094 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:20.094 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:20.094 pt2' 00:14:20.094 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.094 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:20.094 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:20.352 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:20.352 "name": "pt1", 00:14:20.352 "aliases": [ 00:14:20.352 "00000000-0000-0000-0000-000000000001" 
00:14:20.352 ], 00:14:20.352 "product_name": "passthru", 00:14:20.352 "block_size": 512, 00:14:20.352 "num_blocks": 65536, 00:14:20.352 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.352 "assigned_rate_limits": { 00:14:20.352 "rw_ios_per_sec": 0, 00:14:20.352 "rw_mbytes_per_sec": 0, 00:14:20.352 "r_mbytes_per_sec": 0, 00:14:20.352 "w_mbytes_per_sec": 0 00:14:20.352 }, 00:14:20.352 "claimed": true, 00:14:20.352 "claim_type": "exclusive_write", 00:14:20.352 "zoned": false, 00:14:20.352 "supported_io_types": { 00:14:20.352 "read": true, 00:14:20.352 "write": true, 00:14:20.352 "unmap": true, 00:14:20.352 "flush": true, 00:14:20.352 "reset": true, 00:14:20.352 "nvme_admin": false, 00:14:20.352 "nvme_io": false, 00:14:20.352 "nvme_io_md": false, 00:14:20.352 "write_zeroes": true, 00:14:20.352 "zcopy": true, 00:14:20.352 "get_zone_info": false, 00:14:20.352 "zone_management": false, 00:14:20.352 "zone_append": false, 00:14:20.352 "compare": false, 00:14:20.352 "compare_and_write": false, 00:14:20.352 "abort": true, 00:14:20.352 "seek_hole": false, 00:14:20.352 "seek_data": false, 00:14:20.352 "copy": true, 00:14:20.352 "nvme_iov_md": false 00:14:20.352 }, 00:14:20.352 "memory_domains": [ 00:14:20.352 { 00:14:20.352 "dma_device_id": "system", 00:14:20.352 "dma_device_type": 1 00:14:20.352 }, 00:14:20.352 { 00:14:20.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.352 "dma_device_type": 2 00:14:20.352 } 00:14:20.352 ], 00:14:20.352 "driver_specific": { 00:14:20.352 "passthru": { 00:14:20.352 "name": "pt1", 00:14:20.352 "base_bdev_name": "malloc1" 00:14:20.352 } 00:14:20.352 } 00:14:20.352 }' 00:14:20.352 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.352 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.352 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:20.352 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.352 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.609 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:20.609 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.609 06:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.609 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.609 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.609 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.609 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.610 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.610 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:20.610 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:20.867 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:20.867 "name": "pt2", 00:14:20.867 "aliases": [ 00:14:20.867 "00000000-0000-0000-0000-000000000002" 00:14:20.867 ], 00:14:20.867 "product_name": "passthru", 00:14:20.867 "block_size": 512, 00:14:20.867 "num_blocks": 65536, 00:14:20.867 
"uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.867 "assigned_rate_limits": { 00:14:20.867 "rw_ios_per_sec": 0, 00:14:20.867 "rw_mbytes_per_sec": 0, 00:14:20.867 "r_mbytes_per_sec": 0, 00:14:20.867 "w_mbytes_per_sec": 0 00:14:20.867 }, 00:14:20.867 "claimed": true, 00:14:20.867 "claim_type": "exclusive_write", 00:14:20.867 "zoned": false, 00:14:20.867 "supported_io_types": { 00:14:20.867 "read": true, 00:14:20.867 "write": true, 00:14:20.867 "unmap": true, 00:14:20.867 "flush": true, 00:14:20.867 "reset": true, 00:14:20.867 "nvme_admin": false, 00:14:20.867 "nvme_io": false, 00:14:20.867 "nvme_io_md": false, 00:14:20.867 "write_zeroes": true, 00:14:20.867 "zcopy": true, 00:14:20.867 "get_zone_info": false, 00:14:20.867 "zone_management": false, 00:14:20.867 "zone_append": false, 00:14:20.867 "compare": false, 00:14:20.867 "compare_and_write": false, 00:14:20.867 "abort": true, 00:14:20.867 "seek_hole": false, 00:14:20.867 "seek_data": false, 00:14:20.867 "copy": true, 00:14:20.867 "nvme_iov_md": false 00:14:20.867 }, 00:14:20.867 "memory_domains": [ 00:14:20.867 { 00:14:20.867 "dma_device_id": "system", 00:14:20.867 "dma_device_type": 1 00:14:20.867 }, 00:14:20.867 { 00:14:20.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.867 "dma_device_type": 2 00:14:20.867 } 00:14:20.867 ], 00:14:20.867 "driver_specific": { 00:14:20.867 "passthru": { 00:14:20.867 "name": "pt2", 00:14:20.867 "base_bdev_name": "malloc2" 00:14:20.867 } 00:14:20.867 } 00:14:20.867 }' 00:14:20.867 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.867 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.867 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:20.867 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.125 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.125 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:21.126 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:14:21.385 [2024-07-25 06:30:34.877494] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' a7d546f9-fc11-48ba-a8e0-0070e184832e '!=' a7d546f9-fc11-48ba-a8e0-0070e184832e ']' 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # 
return 1 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1098156 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1098156 ']' 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1098156 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:21.385 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1098156 00:14:21.644 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:21.644 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:21.644 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1098156' 00:14:21.644 killing process with pid 1098156 00:14:21.644 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1098156 00:14:21.644 [2024-07-25 06:30:34.960548] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:21.644 [2024-07-25 06:30:34.960599] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:21.644 [2024-07-25 06:30:34.960639] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:21.644 [2024-07-25 06:30:34.960649] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225f700 name raid_bdev1, state offline 00:14:21.644 06:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1098156 00:14:21.644 [2024-07-25 06:30:34.976352] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:21.644 06:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:14:21.644 00:14:21.644 real 0m10.012s 00:14:21.644 user 0m17.818s 00:14:21.644 sys 0m1.893s 00:14:21.644 06:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:21.644 06:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.644 ************************************ 00:14:21.644 END TEST raid_superblock_test 00:14:21.644 ************************************ 00:14:21.644 06:30:35 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:14:21.645 06:30:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:21.645 06:30:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:21.645 06:30:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:21.904 ************************************ 00:14:21.904 START TEST raid_read_error_test 00:14:21.904 ************************************ 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 read 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:21.904 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.maRap0OdGM 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1100197 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1100197 /var/tmp/spdk-raid.sock 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1100197 ']' 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:21.905 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:21.905 06:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.905 [2024-07-25 06:30:35.312422] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:14:21.905 [2024-07-25 06:30:35.312480] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100197 ] 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:21.905 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:21.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:21.905 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:21.905 [2024-07-25 06:30:35.447676] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:22.165 [2024-07-25 06:30:35.492014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:22.165 [2024-07-25 06:30:35.549167] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:22.165 [2024-07-25 06:30:35.549201] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:22.732 06:30:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:22.732 06:30:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:22.732 06:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:22.732 06:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:22.992 BaseBdev1_malloc 00:14:22.992 06:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:23.251 true 00:14:23.251 06:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:23.251 [2024-07-25 06:30:36.803925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:23.251 [2024-07-25 06:30:36.803967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:23.251 [2024-07-25 06:30:36.803987] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a18a60 00:14:23.251 [2024-07-25 06:30:36.803998] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:23.251 [2024-07-25 06:30:36.805468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:23.251 [2024-07-25 06:30:36.805497] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:23.510 BaseBdev1 00:14:23.510 06:30:36 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:23.510 06:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:23.510 BaseBdev2_malloc 00:14:23.510 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:23.768 true 00:14:23.769 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:24.028 [2024-07-25 06:30:37.478031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:24.028 [2024-07-25 06:30:37.478070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.028 [2024-07-25 06:30:37.478091] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a1ddc0 00:14:24.028 [2024-07-25 06:30:37.478103] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.028 [2024-07-25 06:30:37.479466] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.028 [2024-07-25 06:30:37.479493] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:24.028 BaseBdev2 00:14:24.028 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:24.287 [2024-07-25 06:30:37.702648] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:24.287 [2024-07-25 06:30:37.703822] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:24.287 [2024-07-25 06:30:37.703986] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a1ee90 00:14:24.287 [2024-07-25 06:30:37.703998] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:24.287 [2024-07-25 06:30:37.704177] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a1aa50 00:14:24.287 [2024-07-25 06:30:37.704309] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a1ee90 00:14:24.287 [2024-07-25 06:30:37.704319] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a1ee90 00:14:24.287 [2024-07-25 06:30:37.704411] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.287 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.546 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.546 "name": "raid_bdev1", 00:14:24.546 "uuid": "a31a14cb-74b3-45f5-871e-962d24f8dec5", 00:14:24.546 "strip_size_kb": 64, 00:14:24.546 "state": "online", 00:14:24.546 "raid_level": "raid0", 00:14:24.546 "superblock": true, 00:14:24.546 "num_base_bdevs": 2, 00:14:24.546 "num_base_bdevs_discovered": 2, 00:14:24.546 "num_base_bdevs_operational": 2, 00:14:24.546 "base_bdevs_list": [ 00:14:24.546 { 00:14:24.546 "name": "BaseBdev1", 00:14:24.546 "uuid": "0b018d09-ebd9-5867-b652-36c27fa462ad", 00:14:24.546 "is_configured": true, 00:14:24.546 "data_offset": 2048, 00:14:24.546 "data_size": 63488 00:14:24.546 }, 00:14:24.546 { 00:14:24.546 "name": "BaseBdev2", 00:14:24.546 "uuid": "569721f9-0669-5078-9ffe-2fd72c6f4e9f", 00:14:24.546 "is_configured": true, 00:14:24.546 "data_offset": 2048, 00:14:24.546 "data_size": 63488 00:14:24.546 } 00:14:24.546 ] 00:14:24.546 }' 00:14:24.546 06:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.546 06:30:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.114 06:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:14:25.114 06:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:25.114 [2024-07-25 06:30:38.617291] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a1e990 00:14:26.052 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.310 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:26.571 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.571 "name": "raid_bdev1", 00:14:26.571 "uuid": "a31a14cb-74b3-45f5-871e-962d24f8dec5", 00:14:26.571 "strip_size_kb": 64, 00:14:26.571 "state": "online", 00:14:26.571 "raid_level": "raid0", 00:14:26.571 "superblock": true, 00:14:26.571 "num_base_bdevs": 2, 00:14:26.571 "num_base_bdevs_discovered": 2, 00:14:26.571 "num_base_bdevs_operational": 2, 00:14:26.571 "base_bdevs_list": [ 00:14:26.571 { 00:14:26.571 "name": "BaseBdev1", 00:14:26.571 "uuid": "0b018d09-ebd9-5867-b652-36c27fa462ad", 00:14:26.571 "is_configured": true, 00:14:26.571 "data_offset": 2048, 00:14:26.571 "data_size": 63488 00:14:26.571 }, 00:14:26.571 { 00:14:26.571 "name": "BaseBdev2", 00:14:26.572 "uuid": "569721f9-0669-5078-9ffe-2fd72c6f4e9f", 00:14:26.572 "is_configured": true, 00:14:26.572 "data_offset": 2048, 00:14:26.572 "data_size": 63488 00:14:26.572 } 00:14:26.572 ] 00:14:26.572 }' 00:14:26.572 06:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.572 06:30:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.199 06:30:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:27.460 [2024-07-25 06:30:40.766718] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:27.460 [2024-07-25 06:30:40.766748] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:27.460 [2024-07-25 06:30:40.769652] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:27.460 [2024-07-25 06:30:40.769692] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:27.460 [2024-07-25 06:30:40.769717] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:27.460 [2024-07-25 06:30:40.769732] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a1ee90 name raid_bdev1, state offline 00:14:27.460 0 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1100197 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1100197 ']' 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1100197 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1100197 00:14:27.460 06:30:40 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1100197' 00:14:27.460 killing process with pid 1100197 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1100197 00:14:27.460 [2024-07-25 06:30:40.844227] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:27.460 06:30:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1100197 00:14:27.460 [2024-07-25 06:30:40.853945] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.maRap0OdGM 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:14:27.721 00:14:27.721 real 0m5.808s 00:14:27.721 user 0m9.006s 00:14:27.721 sys 0m1.036s 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:27.721 06:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.721 ************************************ 00:14:27.721 END TEST raid_read_error_test 00:14:27.721 ************************************ 00:14:27.721 06:30:41 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:14:27.721 06:30:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:27.721 06:30:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:27.721 06:30:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:27.721 ************************************ 00:14:27.721 START TEST raid_write_error_test 00:14:27.721 ************************************ 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 write 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:27.721 
06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.7ogbpAV3Ys 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1101203 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1101203 /var/tmp/spdk-raid.sock 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1101203 ']' 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:27.721 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:27.721 06:30:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.721 [2024-07-25 06:30:41.207971] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
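For readability, a condensed sketch of the bdevperf launch that the write-error trace above performs (binary path, RPC socket, and flags are copied from the log; the readiness loop is an illustrative stand-in for the script's waitforlisten helper and is not the test's own code):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # per-test bdevperf log created with mktemp -p /raidtest, as in the trace
    bdevperf_log=$(mktemp -p /raidtest)
    # launch bdevperf against a dedicated RPC socket, tracing the raid bdev module
    "$SPDK/build/examples/bdevperf" -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" &
    raid_pid=$!
    # wait until the UNIX-domain RPC socket answers before configuring any bdevs
    until "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done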
00:14:27.721 [2024-07-25 06:30:41.208031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1101203 ] 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:27.980 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:27.980 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:27.980 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:27.980 [2024-07-25 06:30:41.345424] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:27.980 [2024-07-25 06:30:41.390217] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:27.980 [2024-07-25 06:30:41.450399] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:27.980 [2024-07-25 06:30:41.450433] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:28.917 06:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:28.917 06:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:28.917 06:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:28.917 06:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:28.917 BaseBdev1_malloc 00:14:28.917 06:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:29.177 true 00:14:29.177 06:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:29.436 [2024-07-25 06:30:42.769866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:29.436 [2024-07-25 06:30:42.769907] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:29.436 [2024-07-25 06:30:42.769926] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf3a60 00:14:29.436 [2024-07-25 06:30:42.769940] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:29.436 [2024-07-25 06:30:42.771506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:29.436 [2024-07-25 06:30:42.771535] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:29.436 BaseBdev1 00:14:29.436 06:30:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:29.436 06:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:29.695 BaseBdev2_malloc 00:14:29.695 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:29.695 true 00:14:29.695 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:29.954 [2024-07-25 06:30:43.455981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:29.955 [2024-07-25 06:30:43.456019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:29.955 [2024-07-25 06:30:43.456038] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf8dc0 00:14:29.955 [2024-07-25 06:30:43.456049] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:29.955 [2024-07-25 06:30:43.457335] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:29.955 [2024-07-25 06:30:43.457362] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:29.955 BaseBdev2 00:14:29.955 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:30.213 [2024-07-25 06:30:43.692627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:30.213 [2024-07-25 06:30:43.693777] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:30.213 [2024-07-25 06:30:43.693943] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xdf9e90 00:14:30.213 [2024-07-25 06:30:43.693955] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:30.214 [2024-07-25 06:30:43.694128] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdf5a50 00:14:30.214 [2024-07-25 06:30:43.694268] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdf9e90 00:14:30.214 [2024-07-25 06:30:43.694277] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdf9e90 00:14:30.214 [2024-07-25 06:30:43.694369] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.214 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.473 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.473 "name": "raid_bdev1", 00:14:30.473 "uuid": "2c20fa3f-601c-446a-8f5d-86802c747d8e", 00:14:30.473 "strip_size_kb": 64, 00:14:30.473 "state": "online", 00:14:30.473 "raid_level": "raid0", 00:14:30.473 "superblock": true, 00:14:30.473 "num_base_bdevs": 2, 00:14:30.473 "num_base_bdevs_discovered": 2, 00:14:30.473 "num_base_bdevs_operational": 2, 00:14:30.473 "base_bdevs_list": [ 00:14:30.473 { 00:14:30.473 "name": "BaseBdev1", 00:14:30.473 "uuid": "a4af0d1f-d860-54c5-b986-3f130653e8de", 00:14:30.473 "is_configured": true, 00:14:30.473 "data_offset": 2048, 00:14:30.473 "data_size": 63488 00:14:30.473 }, 00:14:30.473 { 00:14:30.473 "name": "BaseBdev2", 00:14:30.473 "uuid": "a021ee06-ba4b-50c6-98d2-c76f9de449a8", 00:14:30.473 "is_configured": true, 00:14:30.473 "data_offset": 2048, 00:14:30.473 "data_size": 63488 00:14:30.473 } 00:14:30.473 ] 00:14:30.473 }' 00:14:30.473 06:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.473 06:30:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.041 06:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:14:31.041 06:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:31.300 [2024-07-25 06:30:44.599251] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdf9990 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:32.239 06:30:45 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.239 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:32.498 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.498 "name": "raid_bdev1", 00:14:32.498 "uuid": "2c20fa3f-601c-446a-8f5d-86802c747d8e", 00:14:32.498 "strip_size_kb": 64, 00:14:32.498 "state": "online", 00:14:32.498 "raid_level": "raid0", 00:14:32.498 "superblock": true, 00:14:32.498 "num_base_bdevs": 2, 00:14:32.498 "num_base_bdevs_discovered": 2, 00:14:32.498 "num_base_bdevs_operational": 2, 00:14:32.498 "base_bdevs_list": [ 00:14:32.498 { 00:14:32.498 "name": "BaseBdev1", 00:14:32.498 "uuid": "a4af0d1f-d860-54c5-b986-3f130653e8de", 00:14:32.498 "is_configured": true, 00:14:32.498 "data_offset": 2048, 00:14:32.498 "data_size": 63488 00:14:32.498 }, 00:14:32.498 { 00:14:32.498 "name": "BaseBdev2", 00:14:32.498 "uuid": "a021ee06-ba4b-50c6-98d2-c76f9de449a8", 00:14:32.498 "is_configured": true, 00:14:32.498 "data_offset": 2048, 00:14:32.498 "data_size": 63488 00:14:32.498 } 00:14:32.498 ] 00:14:32.498 }' 00:14:32.498 06:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.498 06:30:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.067 06:30:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:33.326 [2024-07-25 06:30:46.733292] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:33.326 [2024-07-25 06:30:46.733328] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:33.326 [2024-07-25 06:30:46.736238] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:33.326 [2024-07-25 06:30:46.736266] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:33.326 [2024-07-25 06:30:46.736290] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:33.326 [2024-07-25 06:30:46.736300] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdf9e90 name raid_bdev1, state offline 00:14:33.326 0 00:14:33.326 06:30:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1101203 00:14:33.326 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1101203 ']' 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1101203 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1101203 
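As a reference for the teardown the trace above walks through, a minimal sketch of the write-error injection check and cleanup (bdev names, RPC socket, awk column, log path, and pid are taken from the log; the rpc helper function and the kill/wait pair are illustrative stand-ins for the script's killprocess helper):

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    bdevperf_log=/raidtest/tmp.7ogbpAV3Ys   # per-run mktemp name reported in the trace
    raid_pid=1101203                        # bdevperf pid reported in the trace
    # make writes to the first base bdev fail, then re-read the array state
    rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    # delete the array and stop bdevperf
    rpc bdev_raid_delete raid_bdev1
    kill "$raid_pid"; wait "$raid_pid"
    # raid0 has no redundancy, so bdevperf must have recorded a non-zero failure rate (column 6)
    fail_per_s=$(grep raid_bdev1 "$bdevperf_log" | grep -v Job | awk '{print $6}')
    [[ "$fail_per_s" != "0.00" ]]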
00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1101203' 00:14:33.327 killing process with pid 1101203 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1101203 00:14:33.327 [2024-07-25 06:30:46.811578] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:33.327 06:30:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1101203 00:14:33.327 [2024-07-25 06:30:46.821029] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:33.586 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:14:33.586 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.7ogbpAV3Ys 00:14:33.586 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:14:33.586 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:14:33.587 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:14:33.587 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:33.587 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:33.587 06:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:14:33.587 00:14:33.587 real 0m5.879s 00:14:33.587 user 0m9.167s 00:14:33.587 sys 0m1.025s 00:14:33.587 06:30:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:33.587 06:30:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.587 ************************************ 00:14:33.587 END TEST raid_write_error_test 00:14:33.587 ************************************ 00:14:33.587 06:30:47 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:14:33.587 06:30:47 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:14:33.587 06:30:47 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:33.587 06:30:47 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:33.587 06:30:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:33.587 ************************************ 00:14:33.587 START TEST raid_state_function_test 00:14:33.587 ************************************ 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 false 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.587 06:30:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1102283 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1102283' 00:14:33.587 Process raid pid: 1102283 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1102283 /var/tmp/spdk-raid.sock 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1102283 ']' 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:33.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:33.587 06:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.847 [2024-07-25 06:30:47.165314] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
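The state-function test drives the bare bdev_svc test application instead of bdevperf, since it only exercises RPC-driven state transitions rather than I/O; a condensed sketch of the launch shown above (binary path and flags copied from the trace):

    # start the minimal bdev service on the shared raid RPC socket with raid tracing enabled
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    echo "Process raid pid: $raid_pid"
    # followed by the same RPC-socket readiness wait as in the bdevperf sketch above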
00:14:33.847 [2024-07-25 06:30:47.165371] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:33.847 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:33.847 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:33.847 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:33.847 [2024-07-25 06:30:47.292329] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:33.847 [2024-07-25 06:30:47.337113] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:33.847 [2024-07-25 06:30:47.399979] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:33.847 [2024-07-25 06:30:47.400012] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:34.785 [2024-07-25 06:30:48.260284] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:34.785 [2024-07-25 06:30:48.260319] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:34.785 [2024-07-25 06:30:48.260329] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:34.785 [2024-07-25 06:30:48.260340] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.785 06:30:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.785 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.045 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.045 "name": "Existed_Raid", 00:14:35.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.045 "strip_size_kb": 64, 00:14:35.045 "state": "configuring", 00:14:35.045 "raid_level": "concat", 00:14:35.045 "superblock": false, 00:14:35.045 "num_base_bdevs": 2, 00:14:35.045 "num_base_bdevs_discovered": 0, 00:14:35.045 "num_base_bdevs_operational": 2, 00:14:35.045 "base_bdevs_list": [ 00:14:35.045 { 00:14:35.045 "name": "BaseBdev1", 00:14:35.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.045 "is_configured": false, 00:14:35.045 "data_offset": 0, 00:14:35.045 "data_size": 0 00:14:35.045 }, 00:14:35.045 { 00:14:35.045 "name": "BaseBdev2", 00:14:35.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.045 "is_configured": false, 00:14:35.045 "data_offset": 0, 00:14:35.045 "data_size": 0 00:14:35.045 } 00:14:35.045 ] 00:14:35.045 }' 00:14:35.045 06:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.045 06:30:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.612 06:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:35.871 [2024-07-25 06:30:49.278861] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:35.871 [2024-07-25 06:30:49.278885] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e4470 name Existed_Raid, state configuring 00:14:35.871 06:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:36.131 [2024-07-25 06:30:49.507461] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:36.131 [2024-07-25 06:30:49.507487] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:36.131 [2024-07-25 06:30:49.507496] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:36.131 [2024-07-25 06:30:49.507506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:36.131 06:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:36.390 [2024-07-25 06:30:49.741572] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:36.390 BaseBdev1 00:14:36.390 06:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:36.390 
06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:36.390 06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:36.390 06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:36.390 06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:36.390 06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:36.390 06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:36.650 06:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:36.650 [ 00:14:36.650 { 00:14:36.650 "name": "BaseBdev1", 00:14:36.650 "aliases": [ 00:14:36.650 "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f" 00:14:36.650 ], 00:14:36.650 "product_name": "Malloc disk", 00:14:36.650 "block_size": 512, 00:14:36.650 "num_blocks": 65536, 00:14:36.650 "uuid": "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f", 00:14:36.650 "assigned_rate_limits": { 00:14:36.650 "rw_ios_per_sec": 0, 00:14:36.650 "rw_mbytes_per_sec": 0, 00:14:36.650 "r_mbytes_per_sec": 0, 00:14:36.650 "w_mbytes_per_sec": 0 00:14:36.650 }, 00:14:36.650 "claimed": true, 00:14:36.650 "claim_type": "exclusive_write", 00:14:36.650 "zoned": false, 00:14:36.650 "supported_io_types": { 00:14:36.650 "read": true, 00:14:36.650 "write": true, 00:14:36.650 "unmap": true, 00:14:36.650 "flush": true, 00:14:36.650 "reset": true, 00:14:36.650 "nvme_admin": false, 00:14:36.650 "nvme_io": false, 00:14:36.650 "nvme_io_md": false, 00:14:36.650 "write_zeroes": true, 00:14:36.650 "zcopy": true, 00:14:36.650 "get_zone_info": false, 00:14:36.650 "zone_management": false, 00:14:36.650 "zone_append": false, 00:14:36.650 "compare": false, 00:14:36.650 "compare_and_write": false, 00:14:36.650 "abort": true, 00:14:36.650 "seek_hole": false, 00:14:36.650 "seek_data": false, 00:14:36.650 "copy": true, 00:14:36.650 "nvme_iov_md": false 00:14:36.650 }, 00:14:36.650 "memory_domains": [ 00:14:36.650 { 00:14:36.650 "dma_device_id": "system", 00:14:36.650 "dma_device_type": 1 00:14:36.650 }, 00:14:36.650 { 00:14:36.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.650 "dma_device_type": 2 00:14:36.650 } 00:14:36.650 ], 00:14:36.650 "driver_specific": {} 00:14:36.650 } 00:14:36.650 ] 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.909 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.909 "name": "Existed_Raid", 00:14:36.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.909 "strip_size_kb": 64, 00:14:36.909 "state": "configuring", 00:14:36.909 "raid_level": "concat", 00:14:36.909 "superblock": false, 00:14:36.909 "num_base_bdevs": 2, 00:14:36.909 "num_base_bdevs_discovered": 1, 00:14:36.909 "num_base_bdevs_operational": 2, 00:14:36.909 "base_bdevs_list": [ 00:14:36.910 { 00:14:36.910 "name": "BaseBdev1", 00:14:36.910 "uuid": "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f", 00:14:36.910 "is_configured": true, 00:14:36.910 "data_offset": 0, 00:14:36.910 "data_size": 65536 00:14:36.910 }, 00:14:36.910 { 00:14:36.910 "name": "BaseBdev2", 00:14:36.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.910 "is_configured": false, 00:14:36.910 "data_offset": 0, 00:14:36.910 "data_size": 0 00:14:36.910 } 00:14:36.910 ] 00:14:36.910 }' 00:14:36.910 06:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.910 06:30:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.478 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:37.738 [2024-07-25 06:30:51.241680] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:37.738 [2024-07-25 06:30:51.241713] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e3ce0 name Existed_Raid, state configuring 00:14:37.738 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:37.997 [2024-07-25 06:30:51.470309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:37.997 [2024-07-25 06:30:51.471677] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:37.997 [2024-07-25 06:30:51.471708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.997 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.256 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.256 "name": "Existed_Raid", 00:14:38.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.256 "strip_size_kb": 64, 00:14:38.256 "state": "configuring", 00:14:38.256 "raid_level": "concat", 00:14:38.256 "superblock": false, 00:14:38.256 "num_base_bdevs": 2, 00:14:38.256 "num_base_bdevs_discovered": 1, 00:14:38.256 "num_base_bdevs_operational": 2, 00:14:38.256 "base_bdevs_list": [ 00:14:38.256 { 00:14:38.256 "name": "BaseBdev1", 00:14:38.256 "uuid": "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f", 00:14:38.256 "is_configured": true, 00:14:38.256 "data_offset": 0, 00:14:38.256 "data_size": 65536 00:14:38.256 }, 00:14:38.256 { 00:14:38.256 "name": "BaseBdev2", 00:14:38.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.257 "is_configured": false, 00:14:38.257 "data_offset": 0, 00:14:38.257 "data_size": 0 00:14:38.257 } 00:14:38.257 ] 00:14:38.257 }' 00:14:38.257 06:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.257 06:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.825 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:39.084 [2024-07-25 06:30:52.488028] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:39.084 [2024-07-25 06:30:52.488057] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a97120 00:14:39.084 [2024-07-25 06:30:52.488065] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:14:39.084 [2024-07-25 06:30:52.488244] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8e050 00:14:39.084 [2024-07-25 06:30:52.488356] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a97120 00:14:39.084 [2024-07-25 06:30:52.488365] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a97120 00:14:39.084 [2024-07-25 06:30:52.488510] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:39.084 BaseBdev2 00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 
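A minimal sketch of the configuring-to-online sequence the trace above exercises (RPC method names, sizes, and the base jq filter are taken from the log; the rpc helper function and the trailing ".state" extraction are added conveniences, not the script's own code):

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    # create the concat array before its base bdevs exist: it must sit in the "configuring" state
    rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # -> configuring
    # add the two 32 MB, 512-byte-block malloc base bdevs; the raid claims each as it appears
    rpc bdev_malloc_create 32 512 -b BaseBdev1
    rpc bdev_malloc_create 32 512 -b BaseBdev2
    # with both base bdevs discovered the array transitions to "online"
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # -> online

The trace itself additionally deletes and recreates Existed_Raid between steps to re-verify the configuring state after each base bdev is added; the sketch condenses that into a single pass.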
00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:39.084 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.343 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:39.603 [ 00:14:39.603 { 00:14:39.603 "name": "BaseBdev2", 00:14:39.603 "aliases": [ 00:14:39.603 "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5" 00:14:39.603 ], 00:14:39.603 "product_name": "Malloc disk", 00:14:39.603 "block_size": 512, 00:14:39.603 "num_blocks": 65536, 00:14:39.603 "uuid": "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5", 00:14:39.603 "assigned_rate_limits": { 00:14:39.603 "rw_ios_per_sec": 0, 00:14:39.603 "rw_mbytes_per_sec": 0, 00:14:39.603 "r_mbytes_per_sec": 0, 00:14:39.603 "w_mbytes_per_sec": 0 00:14:39.603 }, 00:14:39.603 "claimed": true, 00:14:39.603 "claim_type": "exclusive_write", 00:14:39.603 "zoned": false, 00:14:39.603 "supported_io_types": { 00:14:39.603 "read": true, 00:14:39.603 "write": true, 00:14:39.603 "unmap": true, 00:14:39.603 "flush": true, 00:14:39.603 "reset": true, 00:14:39.603 "nvme_admin": false, 00:14:39.603 "nvme_io": false, 00:14:39.603 "nvme_io_md": false, 00:14:39.603 "write_zeroes": true, 00:14:39.603 "zcopy": true, 00:14:39.603 "get_zone_info": false, 00:14:39.603 "zone_management": false, 00:14:39.603 "zone_append": false, 00:14:39.603 "compare": false, 00:14:39.603 "compare_and_write": false, 00:14:39.603 "abort": true, 00:14:39.603 "seek_hole": false, 00:14:39.603 "seek_data": false, 00:14:39.603 "copy": true, 00:14:39.603 "nvme_iov_md": false 00:14:39.603 }, 00:14:39.603 "memory_domains": [ 00:14:39.603 { 00:14:39.603 "dma_device_id": "system", 00:14:39.603 "dma_device_type": 1 00:14:39.603 }, 00:14:39.603 { 00:14:39.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.603 "dma_device_type": 2 00:14:39.603 } 00:14:39.603 ], 00:14:39.603 "driver_specific": {} 00:14:39.603 } 00:14:39.603 ] 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.603 06:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.862 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.862 "name": "Existed_Raid", 00:14:39.862 "uuid": "3db7344c-f56a-4b57-b1d6-1d87ae28355a", 00:14:39.862 "strip_size_kb": 64, 00:14:39.862 "state": "online", 00:14:39.862 "raid_level": "concat", 00:14:39.862 "superblock": false, 00:14:39.862 "num_base_bdevs": 2, 00:14:39.862 "num_base_bdevs_discovered": 2, 00:14:39.862 "num_base_bdevs_operational": 2, 00:14:39.862 "base_bdevs_list": [ 00:14:39.862 { 00:14:39.862 "name": "BaseBdev1", 00:14:39.862 "uuid": "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f", 00:14:39.862 "is_configured": true, 00:14:39.862 "data_offset": 0, 00:14:39.862 "data_size": 65536 00:14:39.862 }, 00:14:39.862 { 00:14:39.862 "name": "BaseBdev2", 00:14:39.862 "uuid": "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5", 00:14:39.862 "is_configured": true, 00:14:39.862 "data_offset": 0, 00:14:39.862 "data_size": 65536 00:14:39.862 } 00:14:39.862 ] 00:14:39.862 }' 00:14:39.862 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.862 06:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.457 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:40.457 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:40.457 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:40.457 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:40.457 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:40.458 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:40.458 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.458 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:40.458 [2024-07-25 06:30:53.968177] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.458 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:40.458 "name": "Existed_Raid", 00:14:40.458 "aliases": [ 00:14:40.458 "3db7344c-f56a-4b57-b1d6-1d87ae28355a" 00:14:40.458 ], 00:14:40.458 "product_name": "Raid Volume", 00:14:40.458 "block_size": 512, 00:14:40.458 "num_blocks": 131072, 00:14:40.458 "uuid": 
"3db7344c-f56a-4b57-b1d6-1d87ae28355a", 00:14:40.458 "assigned_rate_limits": { 00:14:40.458 "rw_ios_per_sec": 0, 00:14:40.458 "rw_mbytes_per_sec": 0, 00:14:40.458 "r_mbytes_per_sec": 0, 00:14:40.458 "w_mbytes_per_sec": 0 00:14:40.458 }, 00:14:40.458 "claimed": false, 00:14:40.458 "zoned": false, 00:14:40.458 "supported_io_types": { 00:14:40.458 "read": true, 00:14:40.458 "write": true, 00:14:40.458 "unmap": true, 00:14:40.458 "flush": true, 00:14:40.458 "reset": true, 00:14:40.458 "nvme_admin": false, 00:14:40.458 "nvme_io": false, 00:14:40.458 "nvme_io_md": false, 00:14:40.458 "write_zeroes": true, 00:14:40.458 "zcopy": false, 00:14:40.458 "get_zone_info": false, 00:14:40.458 "zone_management": false, 00:14:40.458 "zone_append": false, 00:14:40.458 "compare": false, 00:14:40.458 "compare_and_write": false, 00:14:40.458 "abort": false, 00:14:40.458 "seek_hole": false, 00:14:40.458 "seek_data": false, 00:14:40.458 "copy": false, 00:14:40.458 "nvme_iov_md": false 00:14:40.458 }, 00:14:40.458 "memory_domains": [ 00:14:40.458 { 00:14:40.458 "dma_device_id": "system", 00:14:40.458 "dma_device_type": 1 00:14:40.458 }, 00:14:40.458 { 00:14:40.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.458 "dma_device_type": 2 00:14:40.458 }, 00:14:40.458 { 00:14:40.458 "dma_device_id": "system", 00:14:40.458 "dma_device_type": 1 00:14:40.458 }, 00:14:40.458 { 00:14:40.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.458 "dma_device_type": 2 00:14:40.458 } 00:14:40.458 ], 00:14:40.458 "driver_specific": { 00:14:40.458 "raid": { 00:14:40.458 "uuid": "3db7344c-f56a-4b57-b1d6-1d87ae28355a", 00:14:40.458 "strip_size_kb": 64, 00:14:40.458 "state": "online", 00:14:40.458 "raid_level": "concat", 00:14:40.458 "superblock": false, 00:14:40.458 "num_base_bdevs": 2, 00:14:40.458 "num_base_bdevs_discovered": 2, 00:14:40.458 "num_base_bdevs_operational": 2, 00:14:40.458 "base_bdevs_list": [ 00:14:40.458 { 00:14:40.458 "name": "BaseBdev1", 00:14:40.458 "uuid": "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f", 00:14:40.458 "is_configured": true, 00:14:40.458 "data_offset": 0, 00:14:40.458 "data_size": 65536 00:14:40.458 }, 00:14:40.458 { 00:14:40.458 "name": "BaseBdev2", 00:14:40.458 "uuid": "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5", 00:14:40.458 "is_configured": true, 00:14:40.458 "data_offset": 0, 00:14:40.458 "data_size": 65536 00:14:40.458 } 00:14:40.458 ] 00:14:40.458 } 00:14:40.458 } 00:14:40.458 }' 00:14:40.458 06:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:40.717 BaseBdev2' 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.717 "name": "BaseBdev1", 00:14:40.717 "aliases": [ 00:14:40.717 "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f" 00:14:40.717 ], 00:14:40.717 "product_name": "Malloc disk", 00:14:40.717 "block_size": 512, 00:14:40.717 "num_blocks": 65536, 00:14:40.717 "uuid": "4d59f9a6-7fb4-4b13-b80f-a32e967ee96f", 
00:14:40.717 "assigned_rate_limits": { 00:14:40.717 "rw_ios_per_sec": 0, 00:14:40.717 "rw_mbytes_per_sec": 0, 00:14:40.717 "r_mbytes_per_sec": 0, 00:14:40.717 "w_mbytes_per_sec": 0 00:14:40.717 }, 00:14:40.717 "claimed": true, 00:14:40.717 "claim_type": "exclusive_write", 00:14:40.717 "zoned": false, 00:14:40.717 "supported_io_types": { 00:14:40.717 "read": true, 00:14:40.717 "write": true, 00:14:40.717 "unmap": true, 00:14:40.717 "flush": true, 00:14:40.717 "reset": true, 00:14:40.717 "nvme_admin": false, 00:14:40.717 "nvme_io": false, 00:14:40.717 "nvme_io_md": false, 00:14:40.717 "write_zeroes": true, 00:14:40.717 "zcopy": true, 00:14:40.717 "get_zone_info": false, 00:14:40.717 "zone_management": false, 00:14:40.717 "zone_append": false, 00:14:40.717 "compare": false, 00:14:40.717 "compare_and_write": false, 00:14:40.717 "abort": true, 00:14:40.717 "seek_hole": false, 00:14:40.717 "seek_data": false, 00:14:40.717 "copy": true, 00:14:40.717 "nvme_iov_md": false 00:14:40.717 }, 00:14:40.717 "memory_domains": [ 00:14:40.717 { 00:14:40.717 "dma_device_id": "system", 00:14:40.717 "dma_device_type": 1 00:14:40.717 }, 00:14:40.717 { 00:14:40.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.717 "dma_device_type": 2 00:14:40.717 } 00:14:40.717 ], 00:14:40.717 "driver_specific": {} 00:14:40.717 }' 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.717 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.976 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:41.236 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.236 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.236 "name": "BaseBdev2", 00:14:41.236 "aliases": [ 00:14:41.236 "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5" 00:14:41.236 ], 00:14:41.236 "product_name": "Malloc disk", 00:14:41.236 "block_size": 512, 00:14:41.236 "num_blocks": 65536, 00:14:41.236 "uuid": "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5", 00:14:41.236 "assigned_rate_limits": { 00:14:41.236 "rw_ios_per_sec": 0, 00:14:41.236 "rw_mbytes_per_sec": 0, 00:14:41.236 "r_mbytes_per_sec": 0, 00:14:41.236 "w_mbytes_per_sec": 0 
00:14:41.236 }, 00:14:41.236 "claimed": true, 00:14:41.236 "claim_type": "exclusive_write", 00:14:41.236 "zoned": false, 00:14:41.236 "supported_io_types": { 00:14:41.236 "read": true, 00:14:41.236 "write": true, 00:14:41.236 "unmap": true, 00:14:41.236 "flush": true, 00:14:41.236 "reset": true, 00:14:41.236 "nvme_admin": false, 00:14:41.236 "nvme_io": false, 00:14:41.236 "nvme_io_md": false, 00:14:41.236 "write_zeroes": true, 00:14:41.236 "zcopy": true, 00:14:41.236 "get_zone_info": false, 00:14:41.236 "zone_management": false, 00:14:41.236 "zone_append": false, 00:14:41.236 "compare": false, 00:14:41.236 "compare_and_write": false, 00:14:41.236 "abort": true, 00:14:41.236 "seek_hole": false, 00:14:41.236 "seek_data": false, 00:14:41.236 "copy": true, 00:14:41.236 "nvme_iov_md": false 00:14:41.236 }, 00:14:41.236 "memory_domains": [ 00:14:41.236 { 00:14:41.236 "dma_device_id": "system", 00:14:41.236 "dma_device_type": 1 00:14:41.237 }, 00:14:41.237 { 00:14:41.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.237 "dma_device_type": 2 00:14:41.237 } 00:14:41.237 ], 00:14:41.237 "driver_specific": {} 00:14:41.237 }' 00:14:41.237 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.237 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.496 06:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.496 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:41.754 [2024-07-25 06:30:55.283439] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:41.754 [2024-07-25 06:30:55.283463] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:41.754 [2024-07-25 06:30:55.283500] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # 
verify_raid_bdev_state Existed_Raid offline concat 64 1 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.754 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.014 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.014 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.014 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.014 "name": "Existed_Raid", 00:14:42.014 "uuid": "3db7344c-f56a-4b57-b1d6-1d87ae28355a", 00:14:42.014 "strip_size_kb": 64, 00:14:42.014 "state": "offline", 00:14:42.014 "raid_level": "concat", 00:14:42.014 "superblock": false, 00:14:42.014 "num_base_bdevs": 2, 00:14:42.014 "num_base_bdevs_discovered": 1, 00:14:42.014 "num_base_bdevs_operational": 1, 00:14:42.014 "base_bdevs_list": [ 00:14:42.014 { 00:14:42.014 "name": null, 00:14:42.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.014 "is_configured": false, 00:14:42.014 "data_offset": 0, 00:14:42.014 "data_size": 65536 00:14:42.014 }, 00:14:42.014 { 00:14:42.014 "name": "BaseBdev2", 00:14:42.014 "uuid": "9d612a62-be9e-4e33-9d5d-cb47d6a54bc5", 00:14:42.014 "is_configured": true, 00:14:42.014 "data_offset": 0, 00:14:42.014 "data_size": 65536 00:14:42.014 } 00:14:42.014 ] 00:14:42.014 }' 00:14:42.014 06:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.014 06:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.582 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:42.582 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:42.582 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.582 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:42.841 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:42.841 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:42.841 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:43.100 
[2024-07-25 06:30:56.523780] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:43.100 [2024-07-25 06:30:56.523828] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a97120 name Existed_Raid, state offline 00:14:43.100 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.100 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.100 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.100 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1102283 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1102283 ']' 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1102283 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1102283 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1102283' 00:14:43.360 killing process with pid 1102283 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1102283 00:14:43.360 [2024-07-25 06:30:56.838689] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:43.360 06:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1102283 00:14:43.360 [2024-07-25 06:30:56.839553] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:43.620 00:14:43.620 real 0m9.917s 00:14:43.620 user 0m17.590s 00:14:43.620 sys 0m1.893s 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.620 ************************************ 00:14:43.620 END TEST raid_state_function_test 00:14:43.620 ************************************ 00:14:43.620 06:30:57 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:14:43.620 06:30:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:43.620 06:30:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:43.620 06:30:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:43.620 
************************************ 00:14:43.620 START TEST raid_state_function_test_sb 00:14:43.620 ************************************ 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 true 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1104322 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1104322' 00:14:43.620 Process raid pid: 1104322 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1104322 /var/tmp/spdk-raid.sock 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@831 -- # '[' -z 1104322 ']' 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:43.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:43.620 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.620 [2024-07-25 06:30:57.166501] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:14:43.620 [2024-07-25 06:30:57.166559] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.4 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:14:43.880 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:43.880 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:43.880 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:43.880 [2024-07-25 06:30:57.293006] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:43.881 [2024-07-25 06:30:57.335957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:43.881 [2024-07-25 06:30:57.402497] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:43.881 [2024-07-25 06:30:57.402533] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:44.140 [2024-07-25 06:30:57.666580] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:44.140 [2024-07-25 06:30:57.666613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:44.140 [2024-07-25 06:30:57.666623] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:44.140 [2024-07-25 06:30:57.666633] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.140 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.399 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.399 "name": "Existed_Raid", 00:14:44.399 "uuid": "5e5acf62-78e7-4a5a-b5d3-8eefeb3912f7", 00:14:44.399 "strip_size_kb": 64, 00:14:44.399 "state": "configuring", 00:14:44.399 "raid_level": "concat", 00:14:44.399 "superblock": true, 00:14:44.399 "num_base_bdevs": 2, 00:14:44.399 "num_base_bdevs_discovered": 0, 00:14:44.399 "num_base_bdevs_operational": 2, 00:14:44.399 "base_bdevs_list": [ 00:14:44.399 { 00:14:44.399 "name": "BaseBdev1", 00:14:44.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.399 "is_configured": false, 00:14:44.399 "data_offset": 0, 00:14:44.399 "data_size": 0 00:14:44.399 }, 00:14:44.399 { 00:14:44.399 "name": "BaseBdev2", 00:14:44.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.399 "is_configured": false, 00:14:44.399 "data_offset": 0, 00:14:44.399 "data_size": 0 00:14:44.399 } 00:14:44.399 ] 00:14:44.399 }' 00:14:44.399 06:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.399 06:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.966 06:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:45.224 [2024-07-25 06:30:58.705199] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:45.224 [2024-07-25 06:30:58.705222] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24d3470 name Existed_Raid, state configuring 00:14:45.224 06:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:45.482 [2024-07-25 
06:30:58.933807] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:45.482 [2024-07-25 06:30:58.933831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:45.482 [2024-07-25 06:30:58.933840] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:45.482 [2024-07-25 06:30:58.933850] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:45.482 06:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:45.741 [2024-07-25 06:30:59.171901] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:45.741 BaseBdev1 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:45.741 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.999 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:46.258 [ 00:14:46.258 { 00:14:46.258 "name": "BaseBdev1", 00:14:46.258 "aliases": [ 00:14:46.258 "60f6c634-6a22-4893-8c4f-1ee23ee63988" 00:14:46.258 ], 00:14:46.258 "product_name": "Malloc disk", 00:14:46.258 "block_size": 512, 00:14:46.258 "num_blocks": 65536, 00:14:46.258 "uuid": "60f6c634-6a22-4893-8c4f-1ee23ee63988", 00:14:46.258 "assigned_rate_limits": { 00:14:46.258 "rw_ios_per_sec": 0, 00:14:46.258 "rw_mbytes_per_sec": 0, 00:14:46.258 "r_mbytes_per_sec": 0, 00:14:46.258 "w_mbytes_per_sec": 0 00:14:46.258 }, 00:14:46.258 "claimed": true, 00:14:46.258 "claim_type": "exclusive_write", 00:14:46.258 "zoned": false, 00:14:46.258 "supported_io_types": { 00:14:46.258 "read": true, 00:14:46.258 "write": true, 00:14:46.258 "unmap": true, 00:14:46.258 "flush": true, 00:14:46.258 "reset": true, 00:14:46.258 "nvme_admin": false, 00:14:46.258 "nvme_io": false, 00:14:46.258 "nvme_io_md": false, 00:14:46.258 "write_zeroes": true, 00:14:46.258 "zcopy": true, 00:14:46.258 "get_zone_info": false, 00:14:46.258 "zone_management": false, 00:14:46.258 "zone_append": false, 00:14:46.258 "compare": false, 00:14:46.258 "compare_and_write": false, 00:14:46.258 "abort": true, 00:14:46.258 "seek_hole": false, 00:14:46.258 "seek_data": false, 00:14:46.258 "copy": true, 00:14:46.258 "nvme_iov_md": false 00:14:46.258 }, 00:14:46.258 "memory_domains": [ 00:14:46.258 { 00:14:46.258 "dma_device_id": "system", 00:14:46.258 "dma_device_type": 1 00:14:46.258 }, 00:14:46.258 { 00:14:46.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.258 "dma_device_type": 2 
00:14:46.258 } 00:14:46.258 ], 00:14:46.258 "driver_specific": {} 00:14:46.258 } 00:14:46.258 ] 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.258 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.517 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.517 "name": "Existed_Raid", 00:14:46.517 "uuid": "c2046f31-fd85-42d4-8642-91b2f675d580", 00:14:46.517 "strip_size_kb": 64, 00:14:46.517 "state": "configuring", 00:14:46.517 "raid_level": "concat", 00:14:46.517 "superblock": true, 00:14:46.517 "num_base_bdevs": 2, 00:14:46.517 "num_base_bdevs_discovered": 1, 00:14:46.517 "num_base_bdevs_operational": 2, 00:14:46.517 "base_bdevs_list": [ 00:14:46.517 { 00:14:46.517 "name": "BaseBdev1", 00:14:46.517 "uuid": "60f6c634-6a22-4893-8c4f-1ee23ee63988", 00:14:46.517 "is_configured": true, 00:14:46.517 "data_offset": 2048, 00:14:46.517 "data_size": 63488 00:14:46.517 }, 00:14:46.517 { 00:14:46.517 "name": "BaseBdev2", 00:14:46.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.517 "is_configured": false, 00:14:46.517 "data_offset": 0, 00:14:46.517 "data_size": 0 00:14:46.517 } 00:14:46.517 ] 00:14:46.517 }' 00:14:46.517 06:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.517 06:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.083 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:47.342 [2024-07-25 06:31:00.639760] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:47.342 [2024-07-25 06:31:00.639797] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24d2ce0 name Existed_Raid, state configuring 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:47.342 [2024-07-25 06:31:00.856370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:47.342 [2024-07-25 06:31:00.857773] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:47.342 [2024-07-25 06:31:00.857805] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.342 06:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.600 06:31:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.601 "name": "Existed_Raid", 00:14:47.601 "uuid": "c2623947-f8c8-4bd6-9fe5-c4be5c524f94", 00:14:47.601 "strip_size_kb": 64, 00:14:47.601 "state": "configuring", 00:14:47.601 "raid_level": "concat", 00:14:47.601 "superblock": true, 00:14:47.601 "num_base_bdevs": 2, 00:14:47.601 "num_base_bdevs_discovered": 1, 00:14:47.601 "num_base_bdevs_operational": 2, 00:14:47.601 "base_bdevs_list": [ 00:14:47.601 { 00:14:47.601 "name": "BaseBdev1", 00:14:47.601 "uuid": "60f6c634-6a22-4893-8c4f-1ee23ee63988", 00:14:47.601 "is_configured": true, 00:14:47.601 "data_offset": 2048, 00:14:47.601 "data_size": 63488 00:14:47.601 }, 00:14:47.601 { 00:14:47.601 "name": "BaseBdev2", 00:14:47.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.601 "is_configured": false, 00:14:47.601 "data_offset": 0, 00:14:47.601 "data_size": 0 00:14:47.601 } 00:14:47.601 ] 00:14:47.601 }' 00:14:47.601 06:31:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.601 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.167 06:31:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:48.424 [2024-07-25 06:31:01.898229] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:48.424 [2024-07-25 06:31:01.898356] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2686120 00:14:48.424 [2024-07-25 06:31:01.898368] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:48.424 [2024-07-25 06:31:01.898527] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24d4050 00:14:48.424 [2024-07-25 06:31:01.898634] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2686120 00:14:48.424 [2024-07-25 06:31:01.898643] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2686120 00:14:48.424 [2024-07-25 06:31:01.898727] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:48.424 BaseBdev2 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:48.424 06:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.680 06:31:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:48.937 [ 00:14:48.937 { 00:14:48.937 "name": "BaseBdev2", 00:14:48.937 "aliases": [ 00:14:48.937 "e062bb23-67b2-449b-b107-50581b123092" 00:14:48.937 ], 00:14:48.937 "product_name": "Malloc disk", 00:14:48.937 "block_size": 512, 00:14:48.937 "num_blocks": 65536, 00:14:48.937 "uuid": "e062bb23-67b2-449b-b107-50581b123092", 00:14:48.937 "assigned_rate_limits": { 00:14:48.937 "rw_ios_per_sec": 0, 00:14:48.937 "rw_mbytes_per_sec": 0, 00:14:48.937 "r_mbytes_per_sec": 0, 00:14:48.937 "w_mbytes_per_sec": 0 00:14:48.937 }, 00:14:48.937 "claimed": true, 00:14:48.937 "claim_type": "exclusive_write", 00:14:48.937 "zoned": false, 00:14:48.937 "supported_io_types": { 00:14:48.937 "read": true, 00:14:48.937 "write": true, 00:14:48.937 "unmap": true, 00:14:48.937 "flush": true, 00:14:48.937 "reset": true, 00:14:48.937 "nvme_admin": false, 00:14:48.937 "nvme_io": false, 00:14:48.937 "nvme_io_md": false, 00:14:48.937 "write_zeroes": true, 00:14:48.937 "zcopy": true, 00:14:48.937 "get_zone_info": false, 00:14:48.937 "zone_management": false, 00:14:48.937 "zone_append": false, 00:14:48.937 "compare": false, 00:14:48.937 "compare_and_write": false, 00:14:48.937 "abort": true, 00:14:48.937 "seek_hole": false, 00:14:48.937 "seek_data": false, 00:14:48.937 "copy": true, 00:14:48.937 "nvme_iov_md": false 00:14:48.937 }, 
00:14:48.937 "memory_domains": [ 00:14:48.937 { 00:14:48.937 "dma_device_id": "system", 00:14:48.937 "dma_device_type": 1 00:14:48.937 }, 00:14:48.937 { 00:14:48.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.937 "dma_device_type": 2 00:14:48.937 } 00:14:48.937 ], 00:14:48.937 "driver_specific": {} 00:14:48.937 } 00:14:48.937 ] 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.937 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.195 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.195 "name": "Existed_Raid", 00:14:49.195 "uuid": "c2623947-f8c8-4bd6-9fe5-c4be5c524f94", 00:14:49.195 "strip_size_kb": 64, 00:14:49.195 "state": "online", 00:14:49.195 "raid_level": "concat", 00:14:49.195 "superblock": true, 00:14:49.195 "num_base_bdevs": 2, 00:14:49.195 "num_base_bdevs_discovered": 2, 00:14:49.195 "num_base_bdevs_operational": 2, 00:14:49.195 "base_bdevs_list": [ 00:14:49.195 { 00:14:49.195 "name": "BaseBdev1", 00:14:49.195 "uuid": "60f6c634-6a22-4893-8c4f-1ee23ee63988", 00:14:49.195 "is_configured": true, 00:14:49.195 "data_offset": 2048, 00:14:49.195 "data_size": 63488 00:14:49.195 }, 00:14:49.195 { 00:14:49.195 "name": "BaseBdev2", 00:14:49.195 "uuid": "e062bb23-67b2-449b-b107-50581b123092", 00:14:49.195 "is_configured": true, 00:14:49.195 "data_offset": 2048, 00:14:49.195 "data_size": 63488 00:14:49.195 } 00:14:49.195 ] 00:14:49.195 }' 00:14:49.195 06:31:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.195 06:31:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:49.758 06:31:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:49.758 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:50.015 [2024-07-25 06:31:03.362334] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:50.015 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:50.015 "name": "Existed_Raid", 00:14:50.015 "aliases": [ 00:14:50.015 "c2623947-f8c8-4bd6-9fe5-c4be5c524f94" 00:14:50.015 ], 00:14:50.015 "product_name": "Raid Volume", 00:14:50.015 "block_size": 512, 00:14:50.015 "num_blocks": 126976, 00:14:50.015 "uuid": "c2623947-f8c8-4bd6-9fe5-c4be5c524f94", 00:14:50.015 "assigned_rate_limits": { 00:14:50.015 "rw_ios_per_sec": 0, 00:14:50.015 "rw_mbytes_per_sec": 0, 00:14:50.015 "r_mbytes_per_sec": 0, 00:14:50.015 "w_mbytes_per_sec": 0 00:14:50.015 }, 00:14:50.015 "claimed": false, 00:14:50.015 "zoned": false, 00:14:50.015 "supported_io_types": { 00:14:50.015 "read": true, 00:14:50.015 "write": true, 00:14:50.015 "unmap": true, 00:14:50.015 "flush": true, 00:14:50.015 "reset": true, 00:14:50.015 "nvme_admin": false, 00:14:50.015 "nvme_io": false, 00:14:50.015 "nvme_io_md": false, 00:14:50.015 "write_zeroes": true, 00:14:50.015 "zcopy": false, 00:14:50.015 "get_zone_info": false, 00:14:50.015 "zone_management": false, 00:14:50.015 "zone_append": false, 00:14:50.015 "compare": false, 00:14:50.015 "compare_and_write": false, 00:14:50.015 "abort": false, 00:14:50.015 "seek_hole": false, 00:14:50.015 "seek_data": false, 00:14:50.015 "copy": false, 00:14:50.015 "nvme_iov_md": false 00:14:50.015 }, 00:14:50.015 "memory_domains": [ 00:14:50.015 { 00:14:50.015 "dma_device_id": "system", 00:14:50.015 "dma_device_type": 1 00:14:50.015 }, 00:14:50.015 { 00:14:50.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.015 "dma_device_type": 2 00:14:50.015 }, 00:14:50.015 { 00:14:50.015 "dma_device_id": "system", 00:14:50.015 "dma_device_type": 1 00:14:50.015 }, 00:14:50.015 { 00:14:50.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.015 "dma_device_type": 2 00:14:50.015 } 00:14:50.015 ], 00:14:50.015 "driver_specific": { 00:14:50.015 "raid": { 00:14:50.015 "uuid": "c2623947-f8c8-4bd6-9fe5-c4be5c524f94", 00:14:50.015 "strip_size_kb": 64, 00:14:50.015 "state": "online", 00:14:50.015 "raid_level": "concat", 00:14:50.015 "superblock": true, 00:14:50.015 "num_base_bdevs": 2, 00:14:50.015 "num_base_bdevs_discovered": 2, 00:14:50.015 "num_base_bdevs_operational": 2, 00:14:50.015 "base_bdevs_list": [ 00:14:50.015 { 00:14:50.015 "name": "BaseBdev1", 00:14:50.015 "uuid": "60f6c634-6a22-4893-8c4f-1ee23ee63988", 00:14:50.015 "is_configured": true, 00:14:50.015 "data_offset": 2048, 00:14:50.015 "data_size": 63488 00:14:50.015 }, 00:14:50.015 { 00:14:50.015 "name": "BaseBdev2", 00:14:50.015 "uuid": 
"e062bb23-67b2-449b-b107-50581b123092", 00:14:50.015 "is_configured": true, 00:14:50.015 "data_offset": 2048, 00:14:50.015 "data_size": 63488 00:14:50.015 } 00:14:50.015 ] 00:14:50.015 } 00:14:50.015 } 00:14:50.015 }' 00:14:50.015 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:50.015 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:50.015 BaseBdev2' 00:14:50.015 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:50.015 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:50.015 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:50.272 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:50.272 "name": "BaseBdev1", 00:14:50.272 "aliases": [ 00:14:50.272 "60f6c634-6a22-4893-8c4f-1ee23ee63988" 00:14:50.272 ], 00:14:50.272 "product_name": "Malloc disk", 00:14:50.272 "block_size": 512, 00:14:50.272 "num_blocks": 65536, 00:14:50.272 "uuid": "60f6c634-6a22-4893-8c4f-1ee23ee63988", 00:14:50.272 "assigned_rate_limits": { 00:14:50.272 "rw_ios_per_sec": 0, 00:14:50.272 "rw_mbytes_per_sec": 0, 00:14:50.272 "r_mbytes_per_sec": 0, 00:14:50.272 "w_mbytes_per_sec": 0 00:14:50.272 }, 00:14:50.272 "claimed": true, 00:14:50.272 "claim_type": "exclusive_write", 00:14:50.273 "zoned": false, 00:14:50.273 "supported_io_types": { 00:14:50.273 "read": true, 00:14:50.273 "write": true, 00:14:50.273 "unmap": true, 00:14:50.273 "flush": true, 00:14:50.273 "reset": true, 00:14:50.273 "nvme_admin": false, 00:14:50.273 "nvme_io": false, 00:14:50.273 "nvme_io_md": false, 00:14:50.273 "write_zeroes": true, 00:14:50.273 "zcopy": true, 00:14:50.273 "get_zone_info": false, 00:14:50.273 "zone_management": false, 00:14:50.273 "zone_append": false, 00:14:50.273 "compare": false, 00:14:50.273 "compare_and_write": false, 00:14:50.273 "abort": true, 00:14:50.273 "seek_hole": false, 00:14:50.273 "seek_data": false, 00:14:50.273 "copy": true, 00:14:50.273 "nvme_iov_md": false 00:14:50.273 }, 00:14:50.273 "memory_domains": [ 00:14:50.273 { 00:14:50.273 "dma_device_id": "system", 00:14:50.273 "dma_device_type": 1 00:14:50.273 }, 00:14:50.273 { 00:14:50.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.273 "dma_device_type": 2 00:14:50.273 } 00:14:50.273 ], 00:14:50.273 "driver_specific": {} 00:14:50.273 }' 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:50.273 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:50.530 06:31:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:50.530 06:31:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:50.788 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:50.788 "name": "BaseBdev2", 00:14:50.788 "aliases": [ 00:14:50.788 "e062bb23-67b2-449b-b107-50581b123092" 00:14:50.788 ], 00:14:50.788 "product_name": "Malloc disk", 00:14:50.788 "block_size": 512, 00:14:50.788 "num_blocks": 65536, 00:14:50.788 "uuid": "e062bb23-67b2-449b-b107-50581b123092", 00:14:50.788 "assigned_rate_limits": { 00:14:50.788 "rw_ios_per_sec": 0, 00:14:50.788 "rw_mbytes_per_sec": 0, 00:14:50.788 "r_mbytes_per_sec": 0, 00:14:50.788 "w_mbytes_per_sec": 0 00:14:50.788 }, 00:14:50.788 "claimed": true, 00:14:50.788 "claim_type": "exclusive_write", 00:14:50.788 "zoned": false, 00:14:50.788 "supported_io_types": { 00:14:50.788 "read": true, 00:14:50.788 "write": true, 00:14:50.788 "unmap": true, 00:14:50.788 "flush": true, 00:14:50.788 "reset": true, 00:14:50.788 "nvme_admin": false, 00:14:50.788 "nvme_io": false, 00:14:50.788 "nvme_io_md": false, 00:14:50.788 "write_zeroes": true, 00:14:50.788 "zcopy": true, 00:14:50.788 "get_zone_info": false, 00:14:50.788 "zone_management": false, 00:14:50.788 "zone_append": false, 00:14:50.788 "compare": false, 00:14:50.788 "compare_and_write": false, 00:14:50.788 "abort": true, 00:14:50.788 "seek_hole": false, 00:14:50.788 "seek_data": false, 00:14:50.788 "copy": true, 00:14:50.788 "nvme_iov_md": false 00:14:50.788 }, 00:14:50.788 "memory_domains": [ 00:14:50.788 { 00:14:50.788 "dma_device_id": "system", 00:14:50.788 "dma_device_type": 1 00:14:50.788 }, 00:14:50.788 { 00:14:50.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.788 "dma_device_type": 2 00:14:50.788 } 00:14:50.788 ], 00:14:50.788 "driver_specific": {} 00:14:50.788 }' 00:14:50.788 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.788 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.788 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:50.788 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.788 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.048 06:31:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.048 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:51.308 [2024-07-25 06:31:04.729825] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:51.308 [2024-07-25 06:31:04.729848] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:51.308 [2024-07-25 06:31:04.729885] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.308 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.564 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.564 "name": "Existed_Raid", 00:14:51.564 "uuid": "c2623947-f8c8-4bd6-9fe5-c4be5c524f94", 00:14:51.564 "strip_size_kb": 64, 00:14:51.564 "state": "offline", 00:14:51.564 "raid_level": "concat", 00:14:51.564 "superblock": true, 00:14:51.564 "num_base_bdevs": 2, 00:14:51.564 "num_base_bdevs_discovered": 1, 00:14:51.564 "num_base_bdevs_operational": 1, 00:14:51.564 "base_bdevs_list": [ 00:14:51.564 { 00:14:51.564 "name": null, 00:14:51.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.564 "is_configured": 
false, 00:14:51.564 "data_offset": 2048, 00:14:51.564 "data_size": 63488 00:14:51.564 }, 00:14:51.564 { 00:14:51.564 "name": "BaseBdev2", 00:14:51.564 "uuid": "e062bb23-67b2-449b-b107-50581b123092", 00:14:51.564 "is_configured": true, 00:14:51.564 "data_offset": 2048, 00:14:51.564 "data_size": 63488 00:14:51.564 } 00:14:51.564 ] 00:14:51.564 }' 00:14:51.564 06:31:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.564 06:31:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:52.128 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:52.128 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:52.128 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.128 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:52.385 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:52.385 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:52.385 06:31:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:52.643 [2024-07-25 06:31:06.006208] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:52.643 [2024-07-25 06:31:06.006252] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2686120 name Existed_Raid, state offline 00:14:52.643 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:52.643 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:52.643 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:52.643 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1104322 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1104322 ']' 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1104322 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1104322 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- 
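The sequence above is the core assertion of this test case: concat carries no redundancy (has_redundancy returns 1 for it), so deleting BaseBdev1 is expected to push Existed_Raid from online to offline rather than leave it degraded, and the re-queried info confirms it with num_base_bdevs_discovered dropping to 1 and the missing slot reported with the all-zero uuid. A short sketch of that step, using only the RPC calls in the trace (rpc_py as in the sketches above):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    $rpc_py bdev_malloc_delete BaseBdev1    # remove one leg of the two-disk concat
    state=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state')
    [ "$state" = offline ] || exit 1        # no redundancy, so the whole array goes offline
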
common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1104322' 00:14:52.936 killing process with pid 1104322 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1104322 00:14:52.936 [2024-07-25 06:31:06.325081] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:52.936 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1104322 00:14:52.936 [2024-07-25 06:31:06.325933] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:53.198 06:31:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:53.198 00:14:53.198 real 0m9.402s 00:14:53.198 user 0m17.009s 00:14:53.198 sys 0m1.846s 00:14:53.198 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:53.198 06:31:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:53.198 ************************************ 00:14:53.198 END TEST raid_state_function_test_sb 00:14:53.198 ************************************ 00:14:53.198 06:31:06 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:14:53.198 06:31:06 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:14:53.198 06:31:06 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:53.198 06:31:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:53.198 ************************************ 00:14:53.198 START TEST raid_superblock_test 00:14:53.198 ************************************ 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 2 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
strip_size_create_arg='-z 64' 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1106166 00:14:53.198 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1106166 /var/tmp/spdk-raid.sock 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1106166 ']' 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:53.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:53.199 06:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.199 [2024-07-25 06:31:06.647984] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:14:53.199 [2024-07-25 06:31:06.648039] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106166 ] 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.0 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.1 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.2 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.3 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.4 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.5 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.6 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:01.7 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.0 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.1 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.2 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.3 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.4 cannot be 
used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.5 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.6 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3d:02.7 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.0 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.1 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.2 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.3 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.4 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.5 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.6 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:01.7 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.0 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.1 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.2 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.3 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.4 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.5 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.6 cannot be used 00:14:53.199 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:14:53.199 EAL: Requested device 0000:3f:02.7 cannot be used 00:14:53.458 [2024-07-25 06:31:06.784317] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:53.458 [2024-07-25 06:31:06.827569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:53.458 [2024-07-25 06:31:06.886435] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:53.458 [2024-07-25 06:31:06.886472] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:54.023 06:31:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:54.023 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:54.281 malloc1 00:14:54.281 06:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:54.539 [2024-07-25 06:31:07.993856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:54.539 [2024-07-25 06:31:07.993906] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.539 [2024-07-25 06:31:07.993928] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a1d70 00:14:54.539 [2024-07-25 06:31:07.993939] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.539 [2024-07-25 06:31:07.995502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.539 [2024-07-25 06:31:07.995533] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:54.539 pt1 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:54.539 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:54.798 malloc2 00:14:54.798 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:55.055 [2024-07-25 06:31:08.443491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:55.055 [2024-07-25 06:31:08.443536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.055 [2024-07-25 06:31:08.443555] 
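The fixture for raid_superblock_test is assembled here in three layers: a malloc bdev per leg, a passthru bdev (pt1, pt2) claimed on top of each, and then, in the bdev_raid_create call that follows just below, a concat volume built with -z 64 and -s so a superblock is written onto the base bdevs. A minimal reconstruction of that setup, assuming a bdev_svc app is already listening on /var/tmp/spdk-raid.sock as in the trace; the loop is a condensation of the per-bdev steps the script performs one at a time:

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for i in 1 2; do
        $rpc_py bdev_malloc_create 32 512 -b malloc$i
        $rpc_py bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
    done
    # -z 64: 64k strip size, -s: write a raid superblock to the base bdevs
    $rpc_py bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s
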
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff0790 00:14:55.055 [2024-07-25 06:31:08.443567] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.055 [2024-07-25 06:31:08.444957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.055 [2024-07-25 06:31:08.444984] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:55.055 pt2 00:14:55.055 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:55.055 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:55.055 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:14:55.313 [2024-07-25 06:31:08.668105] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:55.313 [2024-07-25 06:31:08.669250] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:55.313 [2024-07-25 06:31:08.669388] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21931c0 00:14:55.313 [2024-07-25 06:31:08.669400] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:55.313 [2024-07-25 06:31:08.669582] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fee6e0 00:14:55.313 [2024-07-25 06:31:08.669715] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21931c0 00:14:55.313 [2024-07-25 06:31:08.669724] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21931c0 00:14:55.313 [2024-07-25 06:31:08.669810] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.313 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:55.571 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.571 "name": "raid_bdev1", 00:14:55.571 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:14:55.571 "strip_size_kb": 
64, 00:14:55.571 "state": "online", 00:14:55.571 "raid_level": "concat", 00:14:55.571 "superblock": true, 00:14:55.571 "num_base_bdevs": 2, 00:14:55.571 "num_base_bdevs_discovered": 2, 00:14:55.571 "num_base_bdevs_operational": 2, 00:14:55.571 "base_bdevs_list": [ 00:14:55.571 { 00:14:55.571 "name": "pt1", 00:14:55.571 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:55.571 "is_configured": true, 00:14:55.571 "data_offset": 2048, 00:14:55.571 "data_size": 63488 00:14:55.571 }, 00:14:55.571 { 00:14:55.571 "name": "pt2", 00:14:55.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:55.571 "is_configured": true, 00:14:55.571 "data_offset": 2048, 00:14:55.571 "data_size": 63488 00:14:55.571 } 00:14:55.571 ] 00:14:55.571 }' 00:14:55.571 06:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.571 06:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:56.139 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:56.139 [2024-07-25 06:31:09.678979] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:56.398 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:56.398 "name": "raid_bdev1", 00:14:56.398 "aliases": [ 00:14:56.398 "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb" 00:14:56.398 ], 00:14:56.398 "product_name": "Raid Volume", 00:14:56.398 "block_size": 512, 00:14:56.398 "num_blocks": 126976, 00:14:56.398 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:14:56.398 "assigned_rate_limits": { 00:14:56.398 "rw_ios_per_sec": 0, 00:14:56.398 "rw_mbytes_per_sec": 0, 00:14:56.398 "r_mbytes_per_sec": 0, 00:14:56.398 "w_mbytes_per_sec": 0 00:14:56.398 }, 00:14:56.398 "claimed": false, 00:14:56.398 "zoned": false, 00:14:56.398 "supported_io_types": { 00:14:56.398 "read": true, 00:14:56.398 "write": true, 00:14:56.398 "unmap": true, 00:14:56.398 "flush": true, 00:14:56.398 "reset": true, 00:14:56.398 "nvme_admin": false, 00:14:56.398 "nvme_io": false, 00:14:56.398 "nvme_io_md": false, 00:14:56.398 "write_zeroes": true, 00:14:56.398 "zcopy": false, 00:14:56.398 "get_zone_info": false, 00:14:56.398 "zone_management": false, 00:14:56.398 "zone_append": false, 00:14:56.398 "compare": false, 00:14:56.398 "compare_and_write": false, 00:14:56.398 "abort": false, 00:14:56.398 "seek_hole": false, 00:14:56.398 "seek_data": false, 00:14:56.398 "copy": false, 00:14:56.398 "nvme_iov_md": false 00:14:56.398 }, 00:14:56.398 "memory_domains": [ 00:14:56.398 { 00:14:56.398 "dma_device_id": "system", 00:14:56.398 "dma_device_type": 1 00:14:56.398 }, 00:14:56.398 { 00:14:56.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:14:56.398 "dma_device_type": 2 00:14:56.398 }, 00:14:56.398 { 00:14:56.398 "dma_device_id": "system", 00:14:56.398 "dma_device_type": 1 00:14:56.398 }, 00:14:56.398 { 00:14:56.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.398 "dma_device_type": 2 00:14:56.398 } 00:14:56.398 ], 00:14:56.398 "driver_specific": { 00:14:56.398 "raid": { 00:14:56.398 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:14:56.398 "strip_size_kb": 64, 00:14:56.398 "state": "online", 00:14:56.398 "raid_level": "concat", 00:14:56.398 "superblock": true, 00:14:56.398 "num_base_bdevs": 2, 00:14:56.398 "num_base_bdevs_discovered": 2, 00:14:56.398 "num_base_bdevs_operational": 2, 00:14:56.399 "base_bdevs_list": [ 00:14:56.399 { 00:14:56.399 "name": "pt1", 00:14:56.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:56.399 "is_configured": true, 00:14:56.399 "data_offset": 2048, 00:14:56.399 "data_size": 63488 00:14:56.399 }, 00:14:56.399 { 00:14:56.399 "name": "pt2", 00:14:56.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:56.399 "is_configured": true, 00:14:56.399 "data_offset": 2048, 00:14:56.399 "data_size": 63488 00:14:56.399 } 00:14:56.399 ] 00:14:56.399 } 00:14:56.399 } 00:14:56.399 }' 00:14:56.399 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:56.399 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:56.399 pt2' 00:14:56.399 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.399 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:56.399 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.658 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.658 "name": "pt1", 00:14:56.658 "aliases": [ 00:14:56.658 "00000000-0000-0000-0000-000000000001" 00:14:56.658 ], 00:14:56.658 "product_name": "passthru", 00:14:56.658 "block_size": 512, 00:14:56.658 "num_blocks": 65536, 00:14:56.658 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:56.658 "assigned_rate_limits": { 00:14:56.658 "rw_ios_per_sec": 0, 00:14:56.658 "rw_mbytes_per_sec": 0, 00:14:56.658 "r_mbytes_per_sec": 0, 00:14:56.658 "w_mbytes_per_sec": 0 00:14:56.658 }, 00:14:56.658 "claimed": true, 00:14:56.658 "claim_type": "exclusive_write", 00:14:56.658 "zoned": false, 00:14:56.658 "supported_io_types": { 00:14:56.658 "read": true, 00:14:56.658 "write": true, 00:14:56.658 "unmap": true, 00:14:56.658 "flush": true, 00:14:56.658 "reset": true, 00:14:56.658 "nvme_admin": false, 00:14:56.658 "nvme_io": false, 00:14:56.658 "nvme_io_md": false, 00:14:56.658 "write_zeroes": true, 00:14:56.658 "zcopy": true, 00:14:56.658 "get_zone_info": false, 00:14:56.658 "zone_management": false, 00:14:56.658 "zone_append": false, 00:14:56.658 "compare": false, 00:14:56.658 "compare_and_write": false, 00:14:56.658 "abort": true, 00:14:56.658 "seek_hole": false, 00:14:56.658 "seek_data": false, 00:14:56.658 "copy": true, 00:14:56.658 "nvme_iov_md": false 00:14:56.658 }, 00:14:56.658 "memory_domains": [ 00:14:56.658 { 00:14:56.658 "dma_device_id": "system", 00:14:56.658 "dma_device_type": 1 00:14:56.658 }, 00:14:56.658 { 00:14:56.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.658 "dma_device_type": 2 00:14:56.658 } 00:14:56.658 ], 
00:14:56.658 "driver_specific": { 00:14:56.658 "passthru": { 00:14:56.658 "name": "pt1", 00:14:56.658 "base_bdev_name": "malloc1" 00:14:56.658 } 00:14:56.658 } 00:14:56.658 }' 00:14:56.658 06:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.658 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.918 "name": "pt2", 00:14:56.918 "aliases": [ 00:14:56.918 "00000000-0000-0000-0000-000000000002" 00:14:56.918 ], 00:14:56.918 "product_name": "passthru", 00:14:56.918 "block_size": 512, 00:14:56.918 "num_blocks": 65536, 00:14:56.918 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:56.918 "assigned_rate_limits": { 00:14:56.918 "rw_ios_per_sec": 0, 00:14:56.918 "rw_mbytes_per_sec": 0, 00:14:56.918 "r_mbytes_per_sec": 0, 00:14:56.918 "w_mbytes_per_sec": 0 00:14:56.918 }, 00:14:56.918 "claimed": true, 00:14:56.918 "claim_type": "exclusive_write", 00:14:56.918 "zoned": false, 00:14:56.918 "supported_io_types": { 00:14:56.918 "read": true, 00:14:56.918 "write": true, 00:14:56.918 "unmap": true, 00:14:56.918 "flush": true, 00:14:56.918 "reset": true, 00:14:56.918 "nvme_admin": false, 00:14:56.918 "nvme_io": false, 00:14:56.918 "nvme_io_md": false, 00:14:56.918 "write_zeroes": true, 00:14:56.918 "zcopy": true, 00:14:56.918 "get_zone_info": false, 00:14:56.918 "zone_management": false, 00:14:56.918 "zone_append": false, 00:14:56.918 "compare": false, 00:14:56.918 "compare_and_write": false, 00:14:56.918 "abort": true, 00:14:56.918 "seek_hole": false, 00:14:56.918 "seek_data": false, 00:14:56.918 "copy": true, 00:14:56.918 "nvme_iov_md": false 00:14:56.918 }, 00:14:56.918 "memory_domains": [ 00:14:56.918 { 00:14:56.918 "dma_device_id": "system", 00:14:56.918 "dma_device_type": 1 00:14:56.918 }, 00:14:56.918 { 00:14:56.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.918 "dma_device_type": 2 00:14:56.918 } 00:14:56.918 ], 00:14:56.918 "driver_specific": { 00:14:56.918 "passthru": { 00:14:56.918 "name": "pt2", 00:14:56.918 "base_bdev_name": "malloc2" 
00:14:56.918 } 00:14:56.918 } 00:14:56.918 }' 00:14:56.918 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.177 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.436 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.436 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:57.436 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:57.436 06:31:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:14:57.695 [2024-07-25 06:31:11.014664] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.695 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=5d97ea9c-a1a9-4c90-95ab-6821d4de83cb 00:14:57.695 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 5d97ea9c-a1a9-4c90-95ab-6821d4de83cb ']' 00:14:57.695 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:57.695 [2024-07-25 06:31:11.243024] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:57.695 [2024-07-25 06:31:11.243042] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:57.695 [2024-07-25 06:31:11.243094] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:57.695 [2024-07-25 06:31:11.243135] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:57.695 [2024-07-25 06:31:11.243152] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21931c0 name raid_bdev1, state offline 00:14:57.955 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.955 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:14:57.955 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:14:57.955 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:14:57.955 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:57.955 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:58.214 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:58.214 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:58.474 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:58.474 06:31:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:58.733 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:14:58.994 [2024-07-25 06:31:12.373955] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:58.994 [2024-07-25 06:31:12.375205] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:58.994 [2024-07-25 06:31:12.375258] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:58.994 [2024-07-25 06:31:12.375296] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:58.994 [2024-07-25 06:31:12.375313] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:58.994 [2024-07-25 
06:31:12.375323] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21945a0 name raid_bdev1, state configuring 00:14:58.994 request: 00:14:58.994 { 00:14:58.994 "name": "raid_bdev1", 00:14:58.994 "raid_level": "concat", 00:14:58.994 "base_bdevs": [ 00:14:58.994 "malloc1", 00:14:58.994 "malloc2" 00:14:58.994 ], 00:14:58.994 "strip_size_kb": 64, 00:14:58.994 "superblock": false, 00:14:58.994 "method": "bdev_raid_create", 00:14:58.994 "req_id": 1 00:14:58.994 } 00:14:58.994 Got JSON-RPC error response 00:14:58.994 response: 00:14:58.994 { 00:14:58.994 "code": -17, 00:14:58.994 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:58.994 } 00:14:58.994 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:14:58.994 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:58.994 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:58.994 06:31:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:58.994 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.994 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:14:59.253 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:14:59.253 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:14:59.253 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:59.513 [2024-07-25 06:31:12.823088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:59.513 [2024-07-25 06:31:12.823130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:59.514 [2024-07-25 06:31:12.823157] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2192f60 00:14:59.514 [2024-07-25 06:31:12.823169] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:59.514 [2024-07-25 06:31:12.824615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:59.514 [2024-07-25 06:31:12.824644] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:59.514 [2024-07-25 06:31:12.824706] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:59.514 [2024-07-25 06:31:12.824731] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:59.514 pt1 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test 
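The JSON-RPC error above is the expected outcome of the negative check, not a failure of the run: the superblock written through pt1/pt2 still sits on the backing malloc bdevs, so building a fresh array directly on malloc1 and malloc2 is rejected during examine ("Superblock of a different raid bdev found") and the RPC returns -17, File exists. The NOT wrapper in the trace only asserts a non-zero exit; re-registering the passthru bdevs afterwards lets the examine path read the superblock back and reassemble raid_bdev1, first as configuring with a single leg and then online once pt2 reappears. A condensed sketch of both halves, again limited to commands visible in the trace (rpc_py as before):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    if $rpc_py bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1; then
        echo "unexpected success: foreign superblock on malloc1/malloc2 should block this" >&2
        exit 1
    fi
    # re-creating the passthru bdevs lets examine find the superblock and rebuild raid_bdev1
    for i in 1 2; do
        $rpc_py bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
    done
    $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # expect: online
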
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.514 06:31:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:59.514 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.514 "name": "raid_bdev1", 00:14:59.514 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:14:59.514 "strip_size_kb": 64, 00:14:59.514 "state": "configuring", 00:14:59.514 "raid_level": "concat", 00:14:59.514 "superblock": true, 00:14:59.514 "num_base_bdevs": 2, 00:14:59.514 "num_base_bdevs_discovered": 1, 00:14:59.514 "num_base_bdevs_operational": 2, 00:14:59.514 "base_bdevs_list": [ 00:14:59.514 { 00:14:59.514 "name": "pt1", 00:14:59.514 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:59.514 "is_configured": true, 00:14:59.514 "data_offset": 2048, 00:14:59.514 "data_size": 63488 00:14:59.514 }, 00:14:59.514 { 00:14:59.514 "name": null, 00:14:59.514 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:59.514 "is_configured": false, 00:14:59.514 "data_offset": 2048, 00:14:59.514 "data_size": 63488 00:14:59.514 } 00:14:59.514 ] 00:14:59.514 }' 00:14:59.514 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.514 06:31:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.082 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:15:00.082 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:15:00.082 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:00.082 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:00.341 [2024-07-25 06:31:13.797822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:00.341 [2024-07-25 06:31:13.797871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:00.341 [2024-07-25 06:31:13.797892] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fee7f0 00:15:00.341 [2024-07-25 06:31:13.797903] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.341 [2024-07-25 06:31:13.798240] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.341 [2024-07-25 06:31:13.798259] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:00.341 [2024-07-25 06:31:13.798318] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:00.341 [2024-07-25 06:31:13.798336] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:00.341 [2024-07-25 06:31:13.798428] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2197700 00:15:00.341 [2024-07-25 06:31:13.798438] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:00.341 [2024-07-25 06:31:13.798600] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2198910 00:15:00.341 [2024-07-25 06:31:13.798720] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2197700 00:15:00.341 [2024-07-25 06:31:13.798730] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2197700 00:15:00.341 [2024-07-25 06:31:13.798820] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:00.341 pt2 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.341 06:31:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:00.600 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.600 "name": "raid_bdev1", 00:15:00.600 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:15:00.600 "strip_size_kb": 64, 00:15:00.600 "state": "online", 00:15:00.600 "raid_level": "concat", 00:15:00.600 "superblock": true, 00:15:00.600 "num_base_bdevs": 2, 00:15:00.600 "num_base_bdevs_discovered": 2, 00:15:00.600 "num_base_bdevs_operational": 2, 00:15:00.600 "base_bdevs_list": [ 00:15:00.600 { 00:15:00.600 "name": "pt1", 00:15:00.600 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:00.600 "is_configured": true, 00:15:00.600 "data_offset": 2048, 00:15:00.600 "data_size": 63488 00:15:00.600 }, 00:15:00.600 { 00:15:00.600 "name": "pt2", 00:15:00.600 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:00.600 "is_configured": true, 00:15:00.600 "data_offset": 2048, 00:15:00.600 "data_size": 63488 00:15:00.600 } 00:15:00.600 ] 00:15:00.600 }' 00:15:00.600 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.600 06:31:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:15:01.168 06:31:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:01.168 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:01.427 [2024-07-25 06:31:14.820810] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:01.427 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:01.427 "name": "raid_bdev1", 00:15:01.427 "aliases": [ 00:15:01.427 "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb" 00:15:01.427 ], 00:15:01.427 "product_name": "Raid Volume", 00:15:01.427 "block_size": 512, 00:15:01.427 "num_blocks": 126976, 00:15:01.427 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:15:01.427 "assigned_rate_limits": { 00:15:01.427 "rw_ios_per_sec": 0, 00:15:01.427 "rw_mbytes_per_sec": 0, 00:15:01.427 "r_mbytes_per_sec": 0, 00:15:01.427 "w_mbytes_per_sec": 0 00:15:01.427 }, 00:15:01.427 "claimed": false, 00:15:01.427 "zoned": false, 00:15:01.427 "supported_io_types": { 00:15:01.427 "read": true, 00:15:01.427 "write": true, 00:15:01.427 "unmap": true, 00:15:01.427 "flush": true, 00:15:01.427 "reset": true, 00:15:01.427 "nvme_admin": false, 00:15:01.427 "nvme_io": false, 00:15:01.427 "nvme_io_md": false, 00:15:01.427 "write_zeroes": true, 00:15:01.427 "zcopy": false, 00:15:01.427 "get_zone_info": false, 00:15:01.427 "zone_management": false, 00:15:01.427 "zone_append": false, 00:15:01.427 "compare": false, 00:15:01.427 "compare_and_write": false, 00:15:01.427 "abort": false, 00:15:01.427 "seek_hole": false, 00:15:01.427 "seek_data": false, 00:15:01.427 "copy": false, 00:15:01.427 "nvme_iov_md": false 00:15:01.427 }, 00:15:01.427 "memory_domains": [ 00:15:01.427 { 00:15:01.427 "dma_device_id": "system", 00:15:01.427 "dma_device_type": 1 00:15:01.427 }, 00:15:01.427 { 00:15:01.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.427 "dma_device_type": 2 00:15:01.427 }, 00:15:01.427 { 00:15:01.427 "dma_device_id": "system", 00:15:01.427 "dma_device_type": 1 00:15:01.427 }, 00:15:01.427 { 00:15:01.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.427 "dma_device_type": 2 00:15:01.427 } 00:15:01.427 ], 00:15:01.427 "driver_specific": { 00:15:01.427 "raid": { 00:15:01.427 "uuid": "5d97ea9c-a1a9-4c90-95ab-6821d4de83cb", 00:15:01.427 "strip_size_kb": 64, 00:15:01.427 "state": "online", 00:15:01.427 "raid_level": "concat", 00:15:01.427 "superblock": true, 00:15:01.427 "num_base_bdevs": 2, 00:15:01.427 "num_base_bdevs_discovered": 2, 00:15:01.427 "num_base_bdevs_operational": 2, 00:15:01.427 "base_bdevs_list": [ 00:15:01.427 { 00:15:01.427 "name": "pt1", 00:15:01.427 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:01.427 "is_configured": true, 00:15:01.427 "data_offset": 2048, 00:15:01.427 "data_size": 63488 00:15:01.427 }, 00:15:01.427 { 00:15:01.427 "name": "pt2", 00:15:01.427 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:01.427 "is_configured": true, 00:15:01.427 
"data_offset": 2048, 00:15:01.427 "data_size": 63488 00:15:01.427 } 00:15:01.427 ] 00:15:01.427 } 00:15:01.427 } 00:15:01.427 }' 00:15:01.427 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:01.427 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:01.427 pt2' 00:15:01.427 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:01.427 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:01.427 06:31:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:01.685 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:01.685 "name": "pt1", 00:15:01.685 "aliases": [ 00:15:01.685 "00000000-0000-0000-0000-000000000001" 00:15:01.685 ], 00:15:01.685 "product_name": "passthru", 00:15:01.685 "block_size": 512, 00:15:01.685 "num_blocks": 65536, 00:15:01.685 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:01.685 "assigned_rate_limits": { 00:15:01.685 "rw_ios_per_sec": 0, 00:15:01.685 "rw_mbytes_per_sec": 0, 00:15:01.685 "r_mbytes_per_sec": 0, 00:15:01.685 "w_mbytes_per_sec": 0 00:15:01.685 }, 00:15:01.685 "claimed": true, 00:15:01.685 "claim_type": "exclusive_write", 00:15:01.685 "zoned": false, 00:15:01.685 "supported_io_types": { 00:15:01.685 "read": true, 00:15:01.685 "write": true, 00:15:01.685 "unmap": true, 00:15:01.685 "flush": true, 00:15:01.685 "reset": true, 00:15:01.685 "nvme_admin": false, 00:15:01.685 "nvme_io": false, 00:15:01.685 "nvme_io_md": false, 00:15:01.685 "write_zeroes": true, 00:15:01.685 "zcopy": true, 00:15:01.685 "get_zone_info": false, 00:15:01.685 "zone_management": false, 00:15:01.685 "zone_append": false, 00:15:01.685 "compare": false, 00:15:01.685 "compare_and_write": false, 00:15:01.685 "abort": true, 00:15:01.685 "seek_hole": false, 00:15:01.685 "seek_data": false, 00:15:01.685 "copy": true, 00:15:01.685 "nvme_iov_md": false 00:15:01.685 }, 00:15:01.685 "memory_domains": [ 00:15:01.685 { 00:15:01.685 "dma_device_id": "system", 00:15:01.685 "dma_device_type": 1 00:15:01.685 }, 00:15:01.685 { 00:15:01.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.685 "dma_device_type": 2 00:15:01.685 } 00:15:01.685 ], 00:15:01.685 "driver_specific": { 00:15:01.685 "passthru": { 00:15:01.685 "name": "pt1", 00:15:01.685 "base_bdev_name": "malloc1" 00:15:01.685 } 00:15:01.685 } 00:15:01.685 }' 00:15:01.685 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:01.685 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:01.685 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:01.685 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:01.943 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.202 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.202 "name": "pt2", 00:15:02.202 "aliases": [ 00:15:02.202 "00000000-0000-0000-0000-000000000002" 00:15:02.202 ], 00:15:02.202 "product_name": "passthru", 00:15:02.202 "block_size": 512, 00:15:02.202 "num_blocks": 65536, 00:15:02.202 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:02.202 "assigned_rate_limits": { 00:15:02.202 "rw_ios_per_sec": 0, 00:15:02.202 "rw_mbytes_per_sec": 0, 00:15:02.202 "r_mbytes_per_sec": 0, 00:15:02.202 "w_mbytes_per_sec": 0 00:15:02.202 }, 00:15:02.202 "claimed": true, 00:15:02.202 "claim_type": "exclusive_write", 00:15:02.202 "zoned": false, 00:15:02.202 "supported_io_types": { 00:15:02.202 "read": true, 00:15:02.202 "write": true, 00:15:02.202 "unmap": true, 00:15:02.202 "flush": true, 00:15:02.202 "reset": true, 00:15:02.202 "nvme_admin": false, 00:15:02.202 "nvme_io": false, 00:15:02.202 "nvme_io_md": false, 00:15:02.202 "write_zeroes": true, 00:15:02.202 "zcopy": true, 00:15:02.202 "get_zone_info": false, 00:15:02.202 "zone_management": false, 00:15:02.202 "zone_append": false, 00:15:02.202 "compare": false, 00:15:02.202 "compare_and_write": false, 00:15:02.202 "abort": true, 00:15:02.202 "seek_hole": false, 00:15:02.202 "seek_data": false, 00:15:02.202 "copy": true, 00:15:02.202 "nvme_iov_md": false 00:15:02.202 }, 00:15:02.202 "memory_domains": [ 00:15:02.202 { 00:15:02.202 "dma_device_id": "system", 00:15:02.202 "dma_device_type": 1 00:15:02.202 }, 00:15:02.202 { 00:15:02.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.202 "dma_device_type": 2 00:15:02.202 } 00:15:02.202 ], 00:15:02.202 "driver_specific": { 00:15:02.202 "passthru": { 00:15:02.202 "name": "pt2", 00:15:02.202 "base_bdev_name": "malloc2" 00:15:02.202 } 00:15:02.202 } 00:15:02.202 }' 00:15:02.202 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.202 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.460 06:31:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.460 06:31:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.718 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.718 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:02.718 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:15:02.718 [2024-07-25 06:31:16.248576] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:02.718 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 5d97ea9c-a1a9-4c90-95ab-6821d4de83cb '!=' 5d97ea9c-a1a9-4c90-95ab-6821d4de83cb ']' 00:15:02.718 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1106166 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1106166 ']' 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1106166 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:15:02.719 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1106166 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1106166' 00:15:02.978 killing process with pid 1106166 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1106166 00:15:02.978 [2024-07-25 06:31:16.323806] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:02.978 [2024-07-25 06:31:16.323858] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:02.978 [2024-07-25 06:31:16.323899] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:02.978 [2024-07-25 06:31:16.323910] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2197700 name raid_bdev1, state offline 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1106166 00:15:02.978 [2024-07-25 06:31:16.339656] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:15:02.978 00:15:02.978 real 0m9.927s 00:15:02.978 user 0m17.748s 00:15:02.978 sys 0m1.864s 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:02.978 06:31:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.978 ************************************ 00:15:02.978 END TEST raid_superblock_test 00:15:02.978 ************************************ 00:15:03.238 06:31:16 bdev_raid -- bdev/bdev_raid.sh@950 -- # 
run_test raid_read_error_test raid_io_error_test concat 2 read 00:15:03.238 06:31:16 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:03.238 06:31:16 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:03.238 06:31:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:03.238 ************************************ 00:15:03.238 START TEST raid_read_error_test 00:15:03.238 ************************************ 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 read 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:03.238 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.KbUfM6HGVp 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1107992 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1107992 /var/tmp/spdk-raid.sock 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 
1107992 ']' 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:03.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:03.239 06:31:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.239 [2024-07-25 06:31:16.654919] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:03.239 [2024-07-25 06:31:16.654975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1107992 ] 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.0 
cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:03.239 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:03.239 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:03.239 [2024-07-25 06:31:16.792874] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:03.498 [2024-07-25 06:31:16.836947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:03.498 [2024-07-25 06:31:16.896342] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:03.498 [2024-07-25 06:31:16.896377] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:04.067 06:31:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:04.067 06:31:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:04.067 06:31:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:04.067 06:31:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:04.326 BaseBdev1_malloc 00:15:04.326 06:31:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:04.585 true 00:15:04.585 06:31:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:04.844 [2024-07-25 06:31:18.215471] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:04.844 [2024-07-25 06:31:18.215512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:04.844 [2024-07-25 06:31:18.215530] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252aa60 00:15:04.844 [2024-07-25 06:31:18.215542] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.844 [2024-07-25 06:31:18.216997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:04.844 [2024-07-25 06:31:18.217025] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:04.844 BaseBdev1 00:15:04.844 06:31:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:04.844 06:31:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:05.103 BaseBdev2_malloc 00:15:05.103 06:31:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:05.363 true 00:15:05.363 06:31:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:05.363 [2024-07-25 06:31:18.889366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:05.363 [2024-07-25 06:31:18.889405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:05.363 [2024-07-25 06:31:18.889427] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252fdc0 00:15:05.363 [2024-07-25 06:31:18.889439] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:05.363 [2024-07-25 06:31:18.890810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:05.363 [2024-07-25 06:31:18.890836] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:05.363 BaseBdev2 00:15:05.363 06:31:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:05.624 [2024-07-25 06:31:19.105963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:05.624 [2024-07-25 06:31:19.107065] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:05.624 [2024-07-25 06:31:19.107240] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2530e90 00:15:05.624 [2024-07-25 06:31:19.107253] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:05.624 [2024-07-25 06:31:19.107421] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252ca50 00:15:05.624 [2024-07-25 06:31:19.107550] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2530e90 00:15:05.624 [2024-07-25 06:31:19.107559] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2530e90 00:15:05.624 
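For reference, the array that raid_read_error_test has just brought online is built from the RPC calls visible in the trace above: one malloc bdev per member, wrapped first in an error-injection bdev and then in a passthru bdev, with the two passthru bdevs combined into a concat RAID. The sketch below only condenses that sequence; the RPC variable is shorthand introduced here, not part of the test script, while the sizes and names match the trace.

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # malloc backing store: a 32 MB bdev with 512-byte blocks per base bdev
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $RPC bdev_error_create BaseBdev1_malloc                        # error injector, exposed as EE_BaseBdev1_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2_malloc
    $RPC bdev_error_create BaseBdev2_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
    # concat RAID over the two passthru bdevs: 64 KiB strip (-z 64), superblock enabled (-s)
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s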
[2024-07-25 06:31:19.107650] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.624 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.913 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.913 "name": "raid_bdev1", 00:15:05.913 "uuid": "11575b35-8cf3-4373-af79-fa38ce39d93d", 00:15:05.913 "strip_size_kb": 64, 00:15:05.913 "state": "online", 00:15:05.913 "raid_level": "concat", 00:15:05.913 "superblock": true, 00:15:05.913 "num_base_bdevs": 2, 00:15:05.913 "num_base_bdevs_discovered": 2, 00:15:05.913 "num_base_bdevs_operational": 2, 00:15:05.913 "base_bdevs_list": [ 00:15:05.913 { 00:15:05.913 "name": "BaseBdev1", 00:15:05.913 "uuid": "de6767f8-4b04-5eab-a7e0-0c2c06b09b7c", 00:15:05.913 "is_configured": true, 00:15:05.913 "data_offset": 2048, 00:15:05.913 "data_size": 63488 00:15:05.913 }, 00:15:05.913 { 00:15:05.913 "name": "BaseBdev2", 00:15:05.913 "uuid": "90b40e3f-de39-5bab-97f6-5b8248cb7554", 00:15:05.913 "is_configured": true, 00:15:05.913 "data_offset": 2048, 00:15:05.913 "data_size": 63488 00:15:05.913 } 00:15:05.913 ] 00:15:05.913 }' 00:15:05.913 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.913 06:31:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.481 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:06.481 06:31:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:06.481 [2024-07-25 06:31:19.956438] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2530990 00:15:07.418 06:31:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat 
= \r\a\i\d\1 ]] 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.677 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:07.936 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.936 "name": "raid_bdev1", 00:15:07.936 "uuid": "11575b35-8cf3-4373-af79-fa38ce39d93d", 00:15:07.936 "strip_size_kb": 64, 00:15:07.936 "state": "online", 00:15:07.936 "raid_level": "concat", 00:15:07.936 "superblock": true, 00:15:07.936 "num_base_bdevs": 2, 00:15:07.936 "num_base_bdevs_discovered": 2, 00:15:07.936 "num_base_bdevs_operational": 2, 00:15:07.936 "base_bdevs_list": [ 00:15:07.936 { 00:15:07.936 "name": "BaseBdev1", 00:15:07.936 "uuid": "de6767f8-4b04-5eab-a7e0-0c2c06b09b7c", 00:15:07.936 "is_configured": true, 00:15:07.936 "data_offset": 2048, 00:15:07.936 "data_size": 63488 00:15:07.936 }, 00:15:07.936 { 00:15:07.936 "name": "BaseBdev2", 00:15:07.936 "uuid": "90b40e3f-de39-5bab-97f6-5b8248cb7554", 00:15:07.936 "is_configured": true, 00:15:07.936 "data_offset": 2048, 00:15:07.936 "data_size": 63488 00:15:07.936 } 00:15:07.936 ] 00:15:07.936 }' 00:15:07.936 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.936 06:31:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.504 06:31:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:08.763 [2024-07-25 06:31:22.106475] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:08.763 [2024-07-25 06:31:22.106505] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:08.763 [2024-07-25 06:31:22.109405] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:08.763 [2024-07-25 06:31:22.109432] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:08.763 [2024-07-25 06:31:22.109456] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:15:08.763 [2024-07-25 06:31:22.109466] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2530e90 name raid_bdev1, state offline 00:15:08.763 0 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1107992 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1107992 ']' 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1107992 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1107992 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1107992' 00:15:08.763 killing process with pid 1107992 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1107992 00:15:08.763 [2024-07-25 06:31:22.182951] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:08.763 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1107992 00:15:08.763 [2024-07-25 06:31:22.192643] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.KbUfM6HGVp 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:15:09.023 00:15:09.023 real 0m5.802s 00:15:09.023 user 0m8.991s 00:15:09.023 sys 0m1.034s 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:09.023 06:31:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.023 ************************************ 00:15:09.023 END TEST raid_read_error_test 00:15:09.023 ************************************ 00:15:09.023 06:31:22 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:15:09.023 06:31:22 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:09.023 06:31:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:09.023 06:31:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:09.023 ************************************ 00:15:09.023 START TEST raid_write_error_test 00:15:09.023 ************************************ 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 write 
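The write-mode run that starts here follows the same shape as the read-mode run above: build the malloc/error/passthru/concat stack, arm an error on the first member, drive I/O through bdevperf, then check that the concat array is still reported online and that the measured failure rate is non-zero. A condensed sketch of the inject-and-verify half, using only commands that appear in the trace (RPC and SPDK are shorthand variables introduced here; the read run injects "read failure" as shown, the write run below swaps in "write"):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # arm an error on the first base bdev via its error-injection wrapper
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
    # kick off the queued bdevperf job, then confirm the concat array is still online
    $SPDK/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    # tear the array down once the state has been checked
    $RPC bdev_raid_delete raid_bdev1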
00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.tzN0fWXZqq 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1109148 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1109148 /var/tmp/spdk-raid.sock 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1109148 ']' 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:09.023 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
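I/O for this test is generated by the standalone bdevperf application launched just above rather than by the bdev_svc target; its command line is in the trace (-r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid). Read as standard bdevperf options (this gloss is not stated in the log itself): a random read/write mix with a 50% read ratio, 128 KiB I/Os at queue depth 1, a 60-second cap, -z to hold the job until a perform_tests RPC arrives, and bdev_raid debug logging. The failure figure checked at the end of the run is pulled from the bdevperf log file created by mktemp above; a minimal sketch of that step, assembled from the grep/awk pieces shown later in the trace:

    # column 6 of the raid_bdev1 result row is treated by the script as failures per second;
    # the test only requires it to differ from 0.00
    fail_per_s=$(grep -v Job /raidtest/tmp.tzN0fWXZqq | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != 0.00 ]]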
00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:09.023 06:31:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.023 [2024-07-25 06:31:22.554045] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:09.023 [2024-07-25 06:31:22.554105] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1109148 ] 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested 
device 0000:3f:01.4 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:09.283 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:09.283 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:09.283 [2024-07-25 06:31:22.689128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:09.283 [2024-07-25 06:31:22.731640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:09.283 [2024-07-25 06:31:22.792433] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.283 [2024-07-25 06:31:22.792468] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:10.219 06:31:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:10.219 06:31:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:10.219 06:31:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:10.219 06:31:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:10.219 BaseBdev1_malloc 00:15:10.219 06:31:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:10.477 true 00:15:10.477 06:31:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:10.735 [2024-07-25 06:31:24.116155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:10.735 [2024-07-25 06:31:24.116198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:10.735 [2024-07-25 06:31:24.116216] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125fa60 00:15:10.735 [2024-07-25 06:31:24.116228] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:15:10.735 [2024-07-25 06:31:24.117591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:10.735 [2024-07-25 06:31:24.117618] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:10.735 BaseBdev1 00:15:10.735 06:31:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:10.735 06:31:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:10.993 BaseBdev2_malloc 00:15:10.993 06:31:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:11.251 true 00:15:11.251 06:31:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:11.251 [2024-07-25 06:31:24.794118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:11.251 [2024-07-25 06:31:24.794159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:11.251 [2024-07-25 06:31:24.794178] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1264dc0 00:15:11.251 [2024-07-25 06:31:24.794190] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:11.251 [2024-07-25 06:31:24.795434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:11.251 [2024-07-25 06:31:24.795460] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:11.251 BaseBdev2 00:15:11.510 06:31:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:11.510 [2024-07-25 06:31:25.062847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:11.510 [2024-07-25 06:31:25.063951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:11.510 [2024-07-25 06:31:25.064115] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1265e90 00:15:11.510 [2024-07-25 06:31:25.064128] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:11.510 [2024-07-25 06:31:25.064300] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1261a50 00:15:11.510 [2024-07-25 06:31:25.064435] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1265e90 00:15:11.510 [2024-07-25 06:31:25.064444] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1265e90 00:15:11.510 [2024-07-25 06:31:25.064533] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 
-- # local raid_level=concat 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.769 "name": "raid_bdev1", 00:15:11.769 "uuid": "a433b75d-e96b-4aa8-a762-cd24313a7a34", 00:15:11.769 "strip_size_kb": 64, 00:15:11.769 "state": "online", 00:15:11.769 "raid_level": "concat", 00:15:11.769 "superblock": true, 00:15:11.769 "num_base_bdevs": 2, 00:15:11.769 "num_base_bdevs_discovered": 2, 00:15:11.769 "num_base_bdevs_operational": 2, 00:15:11.769 "base_bdevs_list": [ 00:15:11.769 { 00:15:11.769 "name": "BaseBdev1", 00:15:11.769 "uuid": "9ec130b9-afaf-5c3d-adc6-bdb3328d05b5", 00:15:11.769 "is_configured": true, 00:15:11.769 "data_offset": 2048, 00:15:11.769 "data_size": 63488 00:15:11.769 }, 00:15:11.769 { 00:15:11.769 "name": "BaseBdev2", 00:15:11.769 "uuid": "c0fa449e-771c-5d69-be02-36f4f9a581c0", 00:15:11.769 "is_configured": true, 00:15:11.769 "data_offset": 2048, 00:15:11.769 "data_size": 63488 00:15:11.769 } 00:15:11.769 ] 00:15:11.769 }' 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.769 06:31:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.706 06:31:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:12.706 06:31:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:12.706 [2024-07-25 06:31:26.254249] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1265990 00:15:13.643 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:13.901 06:31:27 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.901 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:14.160 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.160 "name": "raid_bdev1", 00:15:14.160 "uuid": "a433b75d-e96b-4aa8-a762-cd24313a7a34", 00:15:14.160 "strip_size_kb": 64, 00:15:14.160 "state": "online", 00:15:14.160 "raid_level": "concat", 00:15:14.160 "superblock": true, 00:15:14.160 "num_base_bdevs": 2, 00:15:14.160 "num_base_bdevs_discovered": 2, 00:15:14.160 "num_base_bdevs_operational": 2, 00:15:14.160 "base_bdevs_list": [ 00:15:14.160 { 00:15:14.160 "name": "BaseBdev1", 00:15:14.160 "uuid": "9ec130b9-afaf-5c3d-adc6-bdb3328d05b5", 00:15:14.160 "is_configured": true, 00:15:14.160 "data_offset": 2048, 00:15:14.160 "data_size": 63488 00:15:14.160 }, 00:15:14.160 { 00:15:14.160 "name": "BaseBdev2", 00:15:14.160 "uuid": "c0fa449e-771c-5d69-be02-36f4f9a581c0", 00:15:14.160 "is_configured": true, 00:15:14.160 "data_offset": 2048, 00:15:14.160 "data_size": 63488 00:15:14.160 } 00:15:14.160 ] 00:15:14.160 }' 00:15:14.160 06:31:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.160 06:31:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.096 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:15.355 [2024-07-25 06:31:28.654435] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:15.355 [2024-07-25 06:31:28.654464] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:15.355 [2024-07-25 06:31:28.657379] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:15.355 [2024-07-25 06:31:28.657409] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:15.355 [2024-07-25 06:31:28.657432] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:15.355 [2024-07-25 06:31:28.657442] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1265e90 name raid_bdev1, state offline 00:15:15.355 0 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1109148 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1109148 ']' 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1109148 
00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1109148 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1109148' 00:15:15.355 killing process with pid 1109148 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1109148 00:15:15.355 [2024-07-25 06:31:28.732787] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:15.355 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1109148 00:15:15.355 [2024-07-25 06:31:28.742803] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.tzN0fWXZqq 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.42 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.42 != \0\.\0\0 ]] 00:15:15.615 00:15:15.615 real 0m6.461s 00:15:15.615 user 0m10.136s 00:15:15.615 sys 0m1.173s 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:15.615 06:31:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.616 ************************************ 00:15:15.616 END TEST raid_write_error_test 00:15:15.616 ************************************ 00:15:15.616 06:31:28 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:15:15.616 06:31:28 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:15:15.616 06:31:28 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:15.616 06:31:28 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:15.616 06:31:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:15.616 ************************************ 00:15:15.616 START TEST raid_state_function_test 00:15:15.616 ************************************ 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 false 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:15.616 06:31:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1110309 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1110309' 00:15:15.616 Process raid pid: 1110309 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1110309 /var/tmp/spdk-raid.sock 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1110309 ']' 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:15.616 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
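The raid_state_function_test run that starts here repeatedly exercises one create-then-verify pattern against the freshly started bdev_svc: create the raid1 array while its base bdevs do not yet exist, then read it back and check that it stays in the "configuring" state until both malloc base bdevs have been added. A condensed sketch of that pattern, using the RPC socket, bdev names, and jq filter that appear in the trace below (the trailing comment is an editorial note on the expected output, not tool output):

  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  # expected while BaseBdev1/BaseBdev2 are missing: "state": "configuring", "num_base_bdevs_discovered": 0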
00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:15.616 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.616 [2024-07-25 06:31:29.085829] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:15.616 [2024-07-25 06:31:29.085885] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 
0000:3f:01.4 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:15.616 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:15.616 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:15.876 [2024-07-25 06:31:29.223234] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:15.876 [2024-07-25 06:31:29.268003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:15.876 [2024-07-25 06:31:29.329059] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:15.876 [2024-07-25 06:31:29.329093] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:16.444 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:16.444 06:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:15:16.444 06:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:16.703 [2024-07-25 06:31:30.194390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:16.703 [2024-07-25 06:31:30.194429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:16.703 [2024-07-25 06:31:30.194439] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:16.703 [2024-07-25 06:31:30.194450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.703 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.961 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.961 "name": "Existed_Raid", 00:15:16.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.961 "strip_size_kb": 0, 00:15:16.961 "state": "configuring", 00:15:16.961 "raid_level": "raid1", 00:15:16.961 "superblock": false, 00:15:16.961 "num_base_bdevs": 2, 00:15:16.961 "num_base_bdevs_discovered": 0, 00:15:16.961 "num_base_bdevs_operational": 2, 00:15:16.961 "base_bdevs_list": [ 00:15:16.961 { 00:15:16.961 "name": "BaseBdev1", 00:15:16.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.961 "is_configured": false, 00:15:16.961 "data_offset": 0, 00:15:16.961 "data_size": 0 00:15:16.961 }, 00:15:16.961 { 00:15:16.961 "name": "BaseBdev2", 00:15:16.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.961 "is_configured": false, 00:15:16.961 "data_offset": 0, 00:15:16.961 "data_size": 0 00:15:16.961 } 00:15:16.961 ] 00:15:16.961 }' 00:15:16.961 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.961 06:31:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.528 06:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:17.787 [2024-07-25 06:31:31.188898] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:17.787 [2024-07-25 06:31:31.188923] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ea470 name Existed_Raid, state configuring 00:15:17.787 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:18.046 [2024-07-25 06:31:31.369382] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:18.046 [2024-07-25 06:31:31.369405] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:18.046 [2024-07-25 06:31:31.369414] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:18.046 [2024-07-25 06:31:31.369424] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b BaseBdev1 00:15:18.046 [2024-07-25 06:31:31.551307] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:18.046 BaseBdev1 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:18.046 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.305 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:18.564 [ 00:15:18.564 { 00:15:18.564 "name": "BaseBdev1", 00:15:18.564 "aliases": [ 00:15:18.564 "bb1925ff-52f3-42aa-a780-79b947e18c2a" 00:15:18.564 ], 00:15:18.564 "product_name": "Malloc disk", 00:15:18.564 "block_size": 512, 00:15:18.564 "num_blocks": 65536, 00:15:18.564 "uuid": "bb1925ff-52f3-42aa-a780-79b947e18c2a", 00:15:18.564 "assigned_rate_limits": { 00:15:18.564 "rw_ios_per_sec": 0, 00:15:18.564 "rw_mbytes_per_sec": 0, 00:15:18.564 "r_mbytes_per_sec": 0, 00:15:18.564 "w_mbytes_per_sec": 0 00:15:18.564 }, 00:15:18.564 "claimed": true, 00:15:18.564 "claim_type": "exclusive_write", 00:15:18.564 "zoned": false, 00:15:18.564 "supported_io_types": { 00:15:18.564 "read": true, 00:15:18.564 "write": true, 00:15:18.564 "unmap": true, 00:15:18.564 "flush": true, 00:15:18.564 "reset": true, 00:15:18.564 "nvme_admin": false, 00:15:18.564 "nvme_io": false, 00:15:18.564 "nvme_io_md": false, 00:15:18.564 "write_zeroes": true, 00:15:18.564 "zcopy": true, 00:15:18.564 "get_zone_info": false, 00:15:18.564 "zone_management": false, 00:15:18.564 "zone_append": false, 00:15:18.564 "compare": false, 00:15:18.564 "compare_and_write": false, 00:15:18.564 "abort": true, 00:15:18.564 "seek_hole": false, 00:15:18.564 "seek_data": false, 00:15:18.564 "copy": true, 00:15:18.564 "nvme_iov_md": false 00:15:18.564 }, 00:15:18.564 "memory_domains": [ 00:15:18.564 { 00:15:18.564 "dma_device_id": "system", 00:15:18.564 "dma_device_type": 1 00:15:18.564 }, 00:15:18.564 { 00:15:18.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.564 "dma_device_type": 2 00:15:18.564 } 00:15:18.564 ], 00:15:18.564 "driver_specific": {} 00:15:18.564 } 00:15:18.564 ] 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:18.564 06:31:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.564 06:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.823 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.823 "name": "Existed_Raid", 00:15:18.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.823 "strip_size_kb": 0, 00:15:18.823 "state": "configuring", 00:15:18.823 "raid_level": "raid1", 00:15:18.823 "superblock": false, 00:15:18.823 "num_base_bdevs": 2, 00:15:18.823 "num_base_bdevs_discovered": 1, 00:15:18.823 "num_base_bdevs_operational": 2, 00:15:18.823 "base_bdevs_list": [ 00:15:18.823 { 00:15:18.823 "name": "BaseBdev1", 00:15:18.823 "uuid": "bb1925ff-52f3-42aa-a780-79b947e18c2a", 00:15:18.823 "is_configured": true, 00:15:18.823 "data_offset": 0, 00:15:18.823 "data_size": 65536 00:15:18.823 }, 00:15:18.823 { 00:15:18.823 "name": "BaseBdev2", 00:15:18.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.823 "is_configured": false, 00:15:18.823 "data_offset": 0, 00:15:18.823 "data_size": 0 00:15:18.823 } 00:15:18.823 ] 00:15:18.823 }' 00:15:18.823 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.823 06:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:19.081 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:19.345 [2024-07-25 06:31:32.782533] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:19.345 [2024-07-25 06:31:32.782566] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21e9ce0 name Existed_Raid, state configuring 00:15:19.345 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:19.660 [2024-07-25 06:31:32.959027] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:19.660 [2024-07-25 06:31:32.960399] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:19.660 [2024-07-25 06:31:32.960432] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 
-- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.660 06:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.660 06:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.660 "name": "Existed_Raid", 00:15:19.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.660 "strip_size_kb": 0, 00:15:19.660 "state": "configuring", 00:15:19.660 "raid_level": "raid1", 00:15:19.660 "superblock": false, 00:15:19.660 "num_base_bdevs": 2, 00:15:19.660 "num_base_bdevs_discovered": 1, 00:15:19.660 "num_base_bdevs_operational": 2, 00:15:19.660 "base_bdevs_list": [ 00:15:19.660 { 00:15:19.660 "name": "BaseBdev1", 00:15:19.660 "uuid": "bb1925ff-52f3-42aa-a780-79b947e18c2a", 00:15:19.660 "is_configured": true, 00:15:19.660 "data_offset": 0, 00:15:19.660 "data_size": 65536 00:15:19.660 }, 00:15:19.660 { 00:15:19.660 "name": "BaseBdev2", 00:15:19.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.660 "is_configured": false, 00:15:19.660 "data_offset": 0, 00:15:19.660 "data_size": 0 00:15:19.660 } 00:15:19.660 ] 00:15:19.660 }' 00:15:19.660 06:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.660 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.227 06:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:20.485 [2024-07-25 06:31:33.956685] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:20.485 [2024-07-25 06:31:33.956717] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x239d120 00:15:20.486 [2024-07-25 06:31:33.956725] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:20.486 [2024-07-25 06:31:33.956899] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2394170 00:15:20.486 [2024-07-25 06:31:33.957013] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x239d120 00:15:20.486 [2024-07-25 06:31:33.957023] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x239d120 00:15:20.486 [2024-07-25 06:31:33.957174] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:20.486 BaseBdev2 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:20.486 06:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.744 06:31:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:21.003 [ 00:15:21.003 { 00:15:21.003 "name": "BaseBdev2", 00:15:21.003 "aliases": [ 00:15:21.003 "d5f7f5c1-c439-4ea8-a964-c97423062548" 00:15:21.003 ], 00:15:21.003 "product_name": "Malloc disk", 00:15:21.003 "block_size": 512, 00:15:21.003 "num_blocks": 65536, 00:15:21.003 "uuid": "d5f7f5c1-c439-4ea8-a964-c97423062548", 00:15:21.003 "assigned_rate_limits": { 00:15:21.003 "rw_ios_per_sec": 0, 00:15:21.003 "rw_mbytes_per_sec": 0, 00:15:21.003 "r_mbytes_per_sec": 0, 00:15:21.003 "w_mbytes_per_sec": 0 00:15:21.003 }, 00:15:21.003 "claimed": true, 00:15:21.003 "claim_type": "exclusive_write", 00:15:21.003 "zoned": false, 00:15:21.003 "supported_io_types": { 00:15:21.003 "read": true, 00:15:21.003 "write": true, 00:15:21.003 "unmap": true, 00:15:21.003 "flush": true, 00:15:21.003 "reset": true, 00:15:21.003 "nvme_admin": false, 00:15:21.003 "nvme_io": false, 00:15:21.003 "nvme_io_md": false, 00:15:21.003 "write_zeroes": true, 00:15:21.003 "zcopy": true, 00:15:21.003 "get_zone_info": false, 00:15:21.003 "zone_management": false, 00:15:21.003 "zone_append": false, 00:15:21.003 "compare": false, 00:15:21.003 "compare_and_write": false, 00:15:21.003 "abort": true, 00:15:21.003 "seek_hole": false, 00:15:21.003 "seek_data": false, 00:15:21.003 "copy": true, 00:15:21.003 "nvme_iov_md": false 00:15:21.003 }, 00:15:21.003 "memory_domains": [ 00:15:21.003 { 00:15:21.003 "dma_device_id": "system", 00:15:21.003 "dma_device_type": 1 00:15:21.003 }, 00:15:21.003 { 00:15:21.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.003 "dma_device_type": 2 00:15:21.003 } 00:15:21.003 ], 00:15:21.003 "driver_specific": {} 00:15:21.003 } 00:15:21.003 ] 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.003 06:31:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.003 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.262 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.262 "name": "Existed_Raid", 00:15:21.262 "uuid": "d70493bf-c7ee-4180-96e1-7548c6f6cdff", 00:15:21.262 "strip_size_kb": 0, 00:15:21.262 "state": "online", 00:15:21.262 "raid_level": "raid1", 00:15:21.262 "superblock": false, 00:15:21.262 "num_base_bdevs": 2, 00:15:21.262 "num_base_bdevs_discovered": 2, 00:15:21.262 "num_base_bdevs_operational": 2, 00:15:21.262 "base_bdevs_list": [ 00:15:21.262 { 00:15:21.262 "name": "BaseBdev1", 00:15:21.262 "uuid": "bb1925ff-52f3-42aa-a780-79b947e18c2a", 00:15:21.262 "is_configured": true, 00:15:21.262 "data_offset": 0, 00:15:21.263 "data_size": 65536 00:15:21.263 }, 00:15:21.263 { 00:15:21.263 "name": "BaseBdev2", 00:15:21.263 "uuid": "d5f7f5c1-c439-4ea8-a964-c97423062548", 00:15:21.263 "is_configured": true, 00:15:21.263 "data_offset": 0, 00:15:21.263 "data_size": 65536 00:15:21.263 } 00:15:21.263 ] 00:15:21.263 }' 00:15:21.263 06:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.263 06:31:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:22.200 [2024-07-25 06:31:35.685500] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:22.200 "name": "Existed_Raid", 00:15:22.200 "aliases": [ 00:15:22.200 "d70493bf-c7ee-4180-96e1-7548c6f6cdff" 00:15:22.200 ], 00:15:22.200 "product_name": "Raid Volume", 00:15:22.200 "block_size": 512, 00:15:22.200 "num_blocks": 65536, 00:15:22.200 "uuid": "d70493bf-c7ee-4180-96e1-7548c6f6cdff", 00:15:22.200 "assigned_rate_limits": { 00:15:22.200 "rw_ios_per_sec": 0, 00:15:22.200 "rw_mbytes_per_sec": 0, 00:15:22.200 "r_mbytes_per_sec": 0, 00:15:22.200 "w_mbytes_per_sec": 0 00:15:22.200 }, 00:15:22.200 "claimed": false, 00:15:22.200 "zoned": false, 00:15:22.200 "supported_io_types": { 00:15:22.200 "read": true, 00:15:22.200 "write": true, 00:15:22.200 "unmap": false, 00:15:22.200 "flush": false, 00:15:22.200 "reset": true, 00:15:22.200 "nvme_admin": false, 00:15:22.200 "nvme_io": false, 00:15:22.200 "nvme_io_md": false, 00:15:22.200 "write_zeroes": true, 00:15:22.200 "zcopy": false, 00:15:22.200 "get_zone_info": false, 00:15:22.200 "zone_management": false, 00:15:22.200 "zone_append": false, 00:15:22.200 "compare": false, 00:15:22.200 "compare_and_write": false, 00:15:22.200 "abort": false, 00:15:22.200 "seek_hole": false, 00:15:22.200 "seek_data": false, 00:15:22.200 "copy": false, 00:15:22.200 "nvme_iov_md": false 00:15:22.200 }, 00:15:22.200 "memory_domains": [ 00:15:22.200 { 00:15:22.200 "dma_device_id": "system", 00:15:22.200 "dma_device_type": 1 00:15:22.200 }, 00:15:22.200 { 00:15:22.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.200 "dma_device_type": 2 00:15:22.200 }, 00:15:22.200 { 00:15:22.200 "dma_device_id": "system", 00:15:22.200 "dma_device_type": 1 00:15:22.200 }, 00:15:22.200 { 00:15:22.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.200 "dma_device_type": 2 00:15:22.200 } 00:15:22.200 ], 00:15:22.200 "driver_specific": { 00:15:22.200 "raid": { 00:15:22.200 "uuid": "d70493bf-c7ee-4180-96e1-7548c6f6cdff", 00:15:22.200 "strip_size_kb": 0, 00:15:22.200 "state": "online", 00:15:22.200 "raid_level": "raid1", 00:15:22.200 "superblock": false, 00:15:22.200 "num_base_bdevs": 2, 00:15:22.200 "num_base_bdevs_discovered": 2, 00:15:22.200 "num_base_bdevs_operational": 2, 00:15:22.200 "base_bdevs_list": [ 00:15:22.200 { 00:15:22.200 "name": "BaseBdev1", 00:15:22.200 "uuid": "bb1925ff-52f3-42aa-a780-79b947e18c2a", 00:15:22.200 "is_configured": true, 00:15:22.200 "data_offset": 0, 00:15:22.200 "data_size": 65536 00:15:22.200 }, 00:15:22.200 { 00:15:22.200 "name": "BaseBdev2", 00:15:22.200 "uuid": "d5f7f5c1-c439-4ea8-a964-c97423062548", 00:15:22.200 "is_configured": true, 00:15:22.200 "data_offset": 0, 00:15:22.200 "data_size": 65536 00:15:22.200 } 00:15:22.200 ] 00:15:22.200 } 00:15:22.200 } 00:15:22.200 }' 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:22.200 BaseBdev2' 00:15:22.200 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.460 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:22.460 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.460 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:15:22.460 "name": "BaseBdev1", 00:15:22.460 "aliases": [ 00:15:22.460 "bb1925ff-52f3-42aa-a780-79b947e18c2a" 00:15:22.460 ], 00:15:22.460 "product_name": "Malloc disk", 00:15:22.460 "block_size": 512, 00:15:22.460 "num_blocks": 65536, 00:15:22.460 "uuid": "bb1925ff-52f3-42aa-a780-79b947e18c2a", 00:15:22.460 "assigned_rate_limits": { 00:15:22.460 "rw_ios_per_sec": 0, 00:15:22.460 "rw_mbytes_per_sec": 0, 00:15:22.460 "r_mbytes_per_sec": 0, 00:15:22.460 "w_mbytes_per_sec": 0 00:15:22.460 }, 00:15:22.460 "claimed": true, 00:15:22.460 "claim_type": "exclusive_write", 00:15:22.460 "zoned": false, 00:15:22.460 "supported_io_types": { 00:15:22.460 "read": true, 00:15:22.460 "write": true, 00:15:22.460 "unmap": true, 00:15:22.460 "flush": true, 00:15:22.460 "reset": true, 00:15:22.460 "nvme_admin": false, 00:15:22.460 "nvme_io": false, 00:15:22.460 "nvme_io_md": false, 00:15:22.460 "write_zeroes": true, 00:15:22.460 "zcopy": true, 00:15:22.460 "get_zone_info": false, 00:15:22.460 "zone_management": false, 00:15:22.460 "zone_append": false, 00:15:22.460 "compare": false, 00:15:22.460 "compare_and_write": false, 00:15:22.460 "abort": true, 00:15:22.460 "seek_hole": false, 00:15:22.460 "seek_data": false, 00:15:22.460 "copy": true, 00:15:22.460 "nvme_iov_md": false 00:15:22.460 }, 00:15:22.460 "memory_domains": [ 00:15:22.460 { 00:15:22.460 "dma_device_id": "system", 00:15:22.460 "dma_device_type": 1 00:15:22.460 }, 00:15:22.460 { 00:15:22.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.460 "dma_device_type": 2 00:15:22.460 } 00:15:22.460 ], 00:15:22.460 "driver_specific": {} 00:15:22.460 }' 00:15:22.460 06:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.720 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.979 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.979 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.979 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.979 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:22.979 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:23.239 "name": "BaseBdev2", 00:15:23.239 "aliases": [ 00:15:23.239 "d5f7f5c1-c439-4ea8-a964-c97423062548" 00:15:23.239 ], 00:15:23.239 "product_name": "Malloc disk", 
00:15:23.239 "block_size": 512, 00:15:23.239 "num_blocks": 65536, 00:15:23.239 "uuid": "d5f7f5c1-c439-4ea8-a964-c97423062548", 00:15:23.239 "assigned_rate_limits": { 00:15:23.239 "rw_ios_per_sec": 0, 00:15:23.239 "rw_mbytes_per_sec": 0, 00:15:23.239 "r_mbytes_per_sec": 0, 00:15:23.239 "w_mbytes_per_sec": 0 00:15:23.239 }, 00:15:23.239 "claimed": true, 00:15:23.239 "claim_type": "exclusive_write", 00:15:23.239 "zoned": false, 00:15:23.239 "supported_io_types": { 00:15:23.239 "read": true, 00:15:23.239 "write": true, 00:15:23.239 "unmap": true, 00:15:23.239 "flush": true, 00:15:23.239 "reset": true, 00:15:23.239 "nvme_admin": false, 00:15:23.239 "nvme_io": false, 00:15:23.239 "nvme_io_md": false, 00:15:23.239 "write_zeroes": true, 00:15:23.239 "zcopy": true, 00:15:23.239 "get_zone_info": false, 00:15:23.239 "zone_management": false, 00:15:23.239 "zone_append": false, 00:15:23.239 "compare": false, 00:15:23.239 "compare_and_write": false, 00:15:23.239 "abort": true, 00:15:23.239 "seek_hole": false, 00:15:23.239 "seek_data": false, 00:15:23.239 "copy": true, 00:15:23.239 "nvme_iov_md": false 00:15:23.239 }, 00:15:23.239 "memory_domains": [ 00:15:23.239 { 00:15:23.239 "dma_device_id": "system", 00:15:23.239 "dma_device_type": 1 00:15:23.239 }, 00:15:23.239 { 00:15:23.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.239 "dma_device_type": 2 00:15:23.239 } 00:15:23.239 ], 00:15:23.239 "driver_specific": {} 00:15:23.239 }' 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.239 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.498 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.498 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.498 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.498 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.498 06:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:23.758 [2024-07-25 06:31:37.080989] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:23.758 06:31:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.758 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.018 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.018 "name": "Existed_Raid", 00:15:24.018 "uuid": "d70493bf-c7ee-4180-96e1-7548c6f6cdff", 00:15:24.018 "strip_size_kb": 0, 00:15:24.018 "state": "online", 00:15:24.018 "raid_level": "raid1", 00:15:24.018 "superblock": false, 00:15:24.018 "num_base_bdevs": 2, 00:15:24.018 "num_base_bdevs_discovered": 1, 00:15:24.018 "num_base_bdevs_operational": 1, 00:15:24.018 "base_bdevs_list": [ 00:15:24.018 { 00:15:24.018 "name": null, 00:15:24.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.018 "is_configured": false, 00:15:24.018 "data_offset": 0, 00:15:24.018 "data_size": 65536 00:15:24.018 }, 00:15:24.018 { 00:15:24.018 "name": "BaseBdev2", 00:15:24.018 "uuid": "d5f7f5c1-c439-4ea8-a964-c97423062548", 00:15:24.018 "is_configured": true, 00:15:24.018 "data_offset": 0, 00:15:24.018 "data_size": 65536 00:15:24.018 } 00:15:24.018 ] 00:15:24.018 }' 00:15:24.018 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.018 06:31:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.586 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:24.586 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:24.586 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.586 06:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:24.586 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:24.586 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:24.586 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:24.846 [2024-07-25 06:31:38.337254] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:24.846 [2024-07-25 06:31:38.337330] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:24.846 [2024-07-25 06:31:38.347745] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:24.846 [2024-07-25 06:31:38.347773] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:24.846 [2024-07-25 06:31:38.347783] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x239d120 name Existed_Raid, state offline 00:15:24.846 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:24.846 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:24.846 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.846 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1110309 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1110309 ']' 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1110309 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1110309 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1110309' 00:15:25.106 killing process with pid 1110309 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1110309 00:15:25.106 [2024-07-25 06:31:38.652173] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:25.106 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1110309 00:15:25.106 [2024-07-25 06:31:38.653013] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:25.366 00:15:25.366 real 0m9.810s 00:15:25.366 user 0m17.365s 00:15:25.366 sys 0m1.892s 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.366 ************************************ 00:15:25.366 END TEST 
raid_state_function_test 00:15:25.366 ************************************ 00:15:25.366 06:31:38 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:15:25.366 06:31:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:25.366 06:31:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:25.366 06:31:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:25.366 ************************************ 00:15:25.366 START TEST raid_state_function_test_sb 00:15:25.366 ************************************ 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:25.366 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1112187 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1112187' 00:15:25.626 Process raid pid: 1112187 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1112187 /var/tmp/spdk-raid.sock 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1112187 ']' 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:25.626 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:25.626 06:31:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.626 [2024-07-25 06:31:38.975813] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:25.626 [2024-07-25 06:31:38.975872] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.626 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:25.626 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:25.627 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:25.627 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:25.627 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:25.627 [2024-07-25 06:31:39.115435] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:25.627 [2024-07-25 06:31:39.159828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:25.886 [2024-07-25 06:31:39.217853] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:25.886 [2024-07-25 06:31:39.217884] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:26.454 06:31:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:26.454 06:31:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:15:26.454 06:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:26.714 [2024-07-25 06:31:40.082371] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev1 00:15:26.714 [2024-07-25 06:31:40.082408] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:26.714 [2024-07-25 06:31:40.082418] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:26.714 [2024-07-25 06:31:40.082429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.714 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:26.974 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.974 "name": "Existed_Raid", 00:15:26.974 "uuid": "d7431977-deab-48ed-8772-3d7b6cbe7c8c", 00:15:26.974 "strip_size_kb": 0, 00:15:26.974 "state": "configuring", 00:15:26.974 "raid_level": "raid1", 00:15:26.974 "superblock": true, 00:15:26.974 "num_base_bdevs": 2, 00:15:26.974 "num_base_bdevs_discovered": 0, 00:15:26.974 "num_base_bdevs_operational": 2, 00:15:26.974 "base_bdevs_list": [ 00:15:26.974 { 00:15:26.974 "name": "BaseBdev1", 00:15:26.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:26.974 "is_configured": false, 00:15:26.974 "data_offset": 0, 00:15:26.974 "data_size": 0 00:15:26.974 }, 00:15:26.974 { 00:15:26.974 "name": "BaseBdev2", 00:15:26.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:26.974 "is_configured": false, 00:15:26.974 "data_offset": 0, 00:15:26.974 "data_size": 0 00:15:26.974 } 00:15:26.974 ] 00:15:26.974 }' 00:15:26.974 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.974 06:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.543 06:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:27.543 [2024-07-25 06:31:41.096911] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:27.543 [2024-07-25 06:31:41.096936] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0xc31470 name Existed_Raid, state configuring 00:15:27.803 06:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:27.803 [2024-07-25 06:31:41.321512] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:27.803 [2024-07-25 06:31:41.321537] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:27.803 [2024-07-25 06:31:41.321546] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:27.803 [2024-07-25 06:31:41.321556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:27.803 06:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:28.062 [2024-07-25 06:31:41.559560] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:28.062 BaseBdev1 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:28.062 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:28.322 06:31:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:28.582 [ 00:15:28.582 { 00:15:28.582 "name": "BaseBdev1", 00:15:28.582 "aliases": [ 00:15:28.582 "47b3374e-88f3-4646-9e15-3e3c4f3185b5" 00:15:28.582 ], 00:15:28.582 "product_name": "Malloc disk", 00:15:28.582 "block_size": 512, 00:15:28.582 "num_blocks": 65536, 00:15:28.582 "uuid": "47b3374e-88f3-4646-9e15-3e3c4f3185b5", 00:15:28.582 "assigned_rate_limits": { 00:15:28.582 "rw_ios_per_sec": 0, 00:15:28.582 "rw_mbytes_per_sec": 0, 00:15:28.582 "r_mbytes_per_sec": 0, 00:15:28.582 "w_mbytes_per_sec": 0 00:15:28.582 }, 00:15:28.582 "claimed": true, 00:15:28.582 "claim_type": "exclusive_write", 00:15:28.582 "zoned": false, 00:15:28.582 "supported_io_types": { 00:15:28.582 "read": true, 00:15:28.582 "write": true, 00:15:28.582 "unmap": true, 00:15:28.582 "flush": true, 00:15:28.582 "reset": true, 00:15:28.582 "nvme_admin": false, 00:15:28.582 "nvme_io": false, 00:15:28.582 "nvme_io_md": false, 00:15:28.582 "write_zeroes": true, 00:15:28.582 "zcopy": true, 00:15:28.582 "get_zone_info": false, 00:15:28.582 "zone_management": false, 00:15:28.582 "zone_append": false, 00:15:28.582 "compare": false, 00:15:28.582 "compare_and_write": false, 00:15:28.582 "abort": true, 00:15:28.582 "seek_hole": false, 00:15:28.582 "seek_data": false, 
00:15:28.582 "copy": true, 00:15:28.582 "nvme_iov_md": false 00:15:28.582 }, 00:15:28.582 "memory_domains": [ 00:15:28.582 { 00:15:28.582 "dma_device_id": "system", 00:15:28.582 "dma_device_type": 1 00:15:28.582 }, 00:15:28.582 { 00:15:28.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.582 "dma_device_type": 2 00:15:28.582 } 00:15:28.582 ], 00:15:28.582 "driver_specific": {} 00:15:28.582 } 00:15:28.582 ] 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.582 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.841 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.841 "name": "Existed_Raid", 00:15:28.841 "uuid": "5b375393-7c31-4dc1-9c67-1a791217a9de", 00:15:28.841 "strip_size_kb": 0, 00:15:28.841 "state": "configuring", 00:15:28.841 "raid_level": "raid1", 00:15:28.841 "superblock": true, 00:15:28.841 "num_base_bdevs": 2, 00:15:28.842 "num_base_bdevs_discovered": 1, 00:15:28.842 "num_base_bdevs_operational": 2, 00:15:28.842 "base_bdevs_list": [ 00:15:28.842 { 00:15:28.842 "name": "BaseBdev1", 00:15:28.842 "uuid": "47b3374e-88f3-4646-9e15-3e3c4f3185b5", 00:15:28.842 "is_configured": true, 00:15:28.842 "data_offset": 2048, 00:15:28.842 "data_size": 63488 00:15:28.842 }, 00:15:28.842 { 00:15:28.842 "name": "BaseBdev2", 00:15:28.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.842 "is_configured": false, 00:15:28.842 "data_offset": 0, 00:15:28.842 "data_size": 0 00:15:28.842 } 00:15:28.842 ] 00:15:28.842 }' 00:15:28.842 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.842 06:31:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 06:31:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:29.670 [2024-07-25 06:31:43.019445] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:15:29.670 [2024-07-25 06:31:43.019481] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc30ce0 name Existed_Raid, state configuring 00:15:29.670 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:29.930 [2024-07-25 06:31:43.248073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:29.930 [2024-07-25 06:31:43.249454] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:29.930 [2024-07-25 06:31:43.249485] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.930 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.188 06:31:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.188 "name": "Existed_Raid", 00:15:30.188 "uuid": "dd6205b0-963b-41c5-9bbc-1cbf45dc5a46", 00:15:30.188 "strip_size_kb": 0, 00:15:30.188 "state": "configuring", 00:15:30.188 "raid_level": "raid1", 00:15:30.188 "superblock": true, 00:15:30.188 "num_base_bdevs": 2, 00:15:30.188 "num_base_bdevs_discovered": 1, 00:15:30.188 "num_base_bdevs_operational": 2, 00:15:30.188 "base_bdevs_list": [ 00:15:30.188 { 00:15:30.188 "name": "BaseBdev1", 00:15:30.188 "uuid": "47b3374e-88f3-4646-9e15-3e3c4f3185b5", 00:15:30.188 "is_configured": true, 00:15:30.188 "data_offset": 2048, 00:15:30.188 "data_size": 63488 00:15:30.188 }, 00:15:30.188 { 00:15:30.188 "name": "BaseBdev2", 00:15:30.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.188 "is_configured": false, 00:15:30.188 "data_offset": 0, 00:15:30.188 "data_size": 0 00:15:30.188 } 00:15:30.188 ] 00:15:30.188 }' 00:15:30.188 06:31:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.188 06:31:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:30.755 [2024-07-25 06:31:44.265801] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:30.755 [2024-07-25 06:31:44.265933] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xde4120 00:15:30.755 [2024-07-25 06:31:44.265946] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:30.755 [2024-07-25 06:31:44.266106] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc32050 00:15:30.755 [2024-07-25 06:31:44.266232] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde4120 00:15:30.755 [2024-07-25 06:31:44.266242] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xde4120 00:15:30.755 [2024-07-25 06:31:44.266326] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.755 BaseBdev2 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:30.755 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:31.014 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:31.274 [ 00:15:31.274 { 00:15:31.274 "name": "BaseBdev2", 00:15:31.274 "aliases": [ 00:15:31.274 "e51f2d9f-c40f-4542-8cc2-f18b7214a199" 00:15:31.274 ], 00:15:31.274 "product_name": "Malloc disk", 00:15:31.274 "block_size": 512, 00:15:31.274 "num_blocks": 65536, 00:15:31.274 "uuid": "e51f2d9f-c40f-4542-8cc2-f18b7214a199", 00:15:31.274 "assigned_rate_limits": { 00:15:31.274 "rw_ios_per_sec": 0, 00:15:31.274 "rw_mbytes_per_sec": 0, 00:15:31.274 "r_mbytes_per_sec": 0, 00:15:31.274 "w_mbytes_per_sec": 0 00:15:31.274 }, 00:15:31.274 "claimed": true, 00:15:31.274 "claim_type": "exclusive_write", 00:15:31.274 "zoned": false, 00:15:31.274 "supported_io_types": { 00:15:31.274 "read": true, 00:15:31.274 "write": true, 00:15:31.274 "unmap": true, 00:15:31.274 "flush": true, 00:15:31.274 "reset": true, 00:15:31.274 "nvme_admin": false, 00:15:31.274 "nvme_io": false, 00:15:31.274 "nvme_io_md": false, 00:15:31.274 "write_zeroes": true, 00:15:31.274 "zcopy": true, 00:15:31.274 "get_zone_info": false, 00:15:31.274 "zone_management": false, 00:15:31.274 "zone_append": false, 00:15:31.274 "compare": false, 00:15:31.274 
"compare_and_write": false, 00:15:31.274 "abort": true, 00:15:31.274 "seek_hole": false, 00:15:31.274 "seek_data": false, 00:15:31.274 "copy": true, 00:15:31.274 "nvme_iov_md": false 00:15:31.274 }, 00:15:31.274 "memory_domains": [ 00:15:31.274 { 00:15:31.274 "dma_device_id": "system", 00:15:31.274 "dma_device_type": 1 00:15:31.274 }, 00:15:31.274 { 00:15:31.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.274 "dma_device_type": 2 00:15:31.274 } 00:15:31.274 ], 00:15:31.274 "driver_specific": {} 00:15:31.274 } 00:15:31.274 ] 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.274 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.534 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.534 "name": "Existed_Raid", 00:15:31.534 "uuid": "dd6205b0-963b-41c5-9bbc-1cbf45dc5a46", 00:15:31.534 "strip_size_kb": 0, 00:15:31.534 "state": "online", 00:15:31.534 "raid_level": "raid1", 00:15:31.534 "superblock": true, 00:15:31.534 "num_base_bdevs": 2, 00:15:31.534 "num_base_bdevs_discovered": 2, 00:15:31.534 "num_base_bdevs_operational": 2, 00:15:31.534 "base_bdevs_list": [ 00:15:31.534 { 00:15:31.534 "name": "BaseBdev1", 00:15:31.534 "uuid": "47b3374e-88f3-4646-9e15-3e3c4f3185b5", 00:15:31.534 "is_configured": true, 00:15:31.534 "data_offset": 2048, 00:15:31.534 "data_size": 63488 00:15:31.534 }, 00:15:31.534 { 00:15:31.534 "name": "BaseBdev2", 00:15:31.534 "uuid": "e51f2d9f-c40f-4542-8cc2-f18b7214a199", 00:15:31.534 "is_configured": true, 00:15:31.534 "data_offset": 2048, 00:15:31.534 "data_size": 63488 00:15:31.534 } 00:15:31.534 ] 00:15:31.534 }' 00:15:31.534 06:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.534 06:31:44 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:32.103 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:32.362 [2024-07-25 06:31:45.741943] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:32.362 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:32.362 "name": "Existed_Raid", 00:15:32.363 "aliases": [ 00:15:32.363 "dd6205b0-963b-41c5-9bbc-1cbf45dc5a46" 00:15:32.363 ], 00:15:32.363 "product_name": "Raid Volume", 00:15:32.363 "block_size": 512, 00:15:32.363 "num_blocks": 63488, 00:15:32.363 "uuid": "dd6205b0-963b-41c5-9bbc-1cbf45dc5a46", 00:15:32.363 "assigned_rate_limits": { 00:15:32.363 "rw_ios_per_sec": 0, 00:15:32.363 "rw_mbytes_per_sec": 0, 00:15:32.363 "r_mbytes_per_sec": 0, 00:15:32.363 "w_mbytes_per_sec": 0 00:15:32.363 }, 00:15:32.363 "claimed": false, 00:15:32.363 "zoned": false, 00:15:32.363 "supported_io_types": { 00:15:32.363 "read": true, 00:15:32.363 "write": true, 00:15:32.363 "unmap": false, 00:15:32.363 "flush": false, 00:15:32.363 "reset": true, 00:15:32.363 "nvme_admin": false, 00:15:32.363 "nvme_io": false, 00:15:32.363 "nvme_io_md": false, 00:15:32.363 "write_zeroes": true, 00:15:32.363 "zcopy": false, 00:15:32.363 "get_zone_info": false, 00:15:32.363 "zone_management": false, 00:15:32.363 "zone_append": false, 00:15:32.363 "compare": false, 00:15:32.363 "compare_and_write": false, 00:15:32.363 "abort": false, 00:15:32.363 "seek_hole": false, 00:15:32.363 "seek_data": false, 00:15:32.363 "copy": false, 00:15:32.363 "nvme_iov_md": false 00:15:32.363 }, 00:15:32.363 "memory_domains": [ 00:15:32.363 { 00:15:32.363 "dma_device_id": "system", 00:15:32.363 "dma_device_type": 1 00:15:32.363 }, 00:15:32.363 { 00:15:32.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.363 "dma_device_type": 2 00:15:32.363 }, 00:15:32.363 { 00:15:32.363 "dma_device_id": "system", 00:15:32.363 "dma_device_type": 1 00:15:32.363 }, 00:15:32.363 { 00:15:32.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.363 "dma_device_type": 2 00:15:32.363 } 00:15:32.363 ], 00:15:32.363 "driver_specific": { 00:15:32.363 "raid": { 00:15:32.363 "uuid": "dd6205b0-963b-41c5-9bbc-1cbf45dc5a46", 00:15:32.363 "strip_size_kb": 0, 00:15:32.363 "state": "online", 00:15:32.363 "raid_level": "raid1", 00:15:32.363 "superblock": true, 00:15:32.363 "num_base_bdevs": 2, 00:15:32.363 "num_base_bdevs_discovered": 2, 00:15:32.363 "num_base_bdevs_operational": 2, 00:15:32.363 "base_bdevs_list": [ 00:15:32.363 { 00:15:32.363 "name": "BaseBdev1", 00:15:32.363 "uuid": "47b3374e-88f3-4646-9e15-3e3c4f3185b5", 
00:15:32.363 "is_configured": true, 00:15:32.363 "data_offset": 2048, 00:15:32.363 "data_size": 63488 00:15:32.363 }, 00:15:32.363 { 00:15:32.363 "name": "BaseBdev2", 00:15:32.363 "uuid": "e51f2d9f-c40f-4542-8cc2-f18b7214a199", 00:15:32.363 "is_configured": true, 00:15:32.363 "data_offset": 2048, 00:15:32.363 "data_size": 63488 00:15:32.363 } 00:15:32.363 ] 00:15:32.363 } 00:15:32.363 } 00:15:32.363 }' 00:15:32.363 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:32.363 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:32.363 BaseBdev2' 00:15:32.363 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.363 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:32.363 06:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.672 "name": "BaseBdev1", 00:15:32.672 "aliases": [ 00:15:32.672 "47b3374e-88f3-4646-9e15-3e3c4f3185b5" 00:15:32.672 ], 00:15:32.672 "product_name": "Malloc disk", 00:15:32.672 "block_size": 512, 00:15:32.672 "num_blocks": 65536, 00:15:32.672 "uuid": "47b3374e-88f3-4646-9e15-3e3c4f3185b5", 00:15:32.672 "assigned_rate_limits": { 00:15:32.672 "rw_ios_per_sec": 0, 00:15:32.672 "rw_mbytes_per_sec": 0, 00:15:32.672 "r_mbytes_per_sec": 0, 00:15:32.672 "w_mbytes_per_sec": 0 00:15:32.672 }, 00:15:32.672 "claimed": true, 00:15:32.672 "claim_type": "exclusive_write", 00:15:32.672 "zoned": false, 00:15:32.672 "supported_io_types": { 00:15:32.672 "read": true, 00:15:32.672 "write": true, 00:15:32.672 "unmap": true, 00:15:32.672 "flush": true, 00:15:32.672 "reset": true, 00:15:32.672 "nvme_admin": false, 00:15:32.672 "nvme_io": false, 00:15:32.672 "nvme_io_md": false, 00:15:32.672 "write_zeroes": true, 00:15:32.672 "zcopy": true, 00:15:32.672 "get_zone_info": false, 00:15:32.672 "zone_management": false, 00:15:32.672 "zone_append": false, 00:15:32.672 "compare": false, 00:15:32.672 "compare_and_write": false, 00:15:32.672 "abort": true, 00:15:32.672 "seek_hole": false, 00:15:32.672 "seek_data": false, 00:15:32.672 "copy": true, 00:15:32.672 "nvme_iov_md": false 00:15:32.672 }, 00:15:32.672 "memory_domains": [ 00:15:32.672 { 00:15:32.672 "dma_device_id": "system", 00:15:32.672 "dma_device_type": 1 00:15:32.672 }, 00:15:32.672 { 00:15:32.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.672 "dma_device_type": 2 00:15:32.672 } 00:15:32.672 ], 00:15:32.672 "driver_specific": {} 00:15:32.672 }' 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.672 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.672 06:31:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:32.942 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.202 "name": "BaseBdev2", 00:15:33.202 "aliases": [ 00:15:33.202 "e51f2d9f-c40f-4542-8cc2-f18b7214a199" 00:15:33.202 ], 00:15:33.202 "product_name": "Malloc disk", 00:15:33.202 "block_size": 512, 00:15:33.202 "num_blocks": 65536, 00:15:33.202 "uuid": "e51f2d9f-c40f-4542-8cc2-f18b7214a199", 00:15:33.202 "assigned_rate_limits": { 00:15:33.202 "rw_ios_per_sec": 0, 00:15:33.202 "rw_mbytes_per_sec": 0, 00:15:33.202 "r_mbytes_per_sec": 0, 00:15:33.202 "w_mbytes_per_sec": 0 00:15:33.202 }, 00:15:33.202 "claimed": true, 00:15:33.202 "claim_type": "exclusive_write", 00:15:33.202 "zoned": false, 00:15:33.202 "supported_io_types": { 00:15:33.202 "read": true, 00:15:33.202 "write": true, 00:15:33.202 "unmap": true, 00:15:33.202 "flush": true, 00:15:33.202 "reset": true, 00:15:33.202 "nvme_admin": false, 00:15:33.202 "nvme_io": false, 00:15:33.202 "nvme_io_md": false, 00:15:33.202 "write_zeroes": true, 00:15:33.202 "zcopy": true, 00:15:33.202 "get_zone_info": false, 00:15:33.202 "zone_management": false, 00:15:33.202 "zone_append": false, 00:15:33.202 "compare": false, 00:15:33.202 "compare_and_write": false, 00:15:33.202 "abort": true, 00:15:33.202 "seek_hole": false, 00:15:33.202 "seek_data": false, 00:15:33.202 "copy": true, 00:15:33.202 "nvme_iov_md": false 00:15:33.202 }, 00:15:33.202 "memory_domains": [ 00:15:33.202 { 00:15:33.202 "dma_device_id": "system", 00:15:33.202 "dma_device_type": 1 00:15:33.202 }, 00:15:33.202 { 00:15:33.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.202 "dma_device_type": 2 00:15:33.202 } 00:15:33.202 ], 00:15:33.202 "driver_specific": {} 00:15:33.202 }' 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.202 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.462 06:31:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.462 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.462 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.462 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.462 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.462 06:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:33.722 [2024-07-25 06:31:47.133433] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.722 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.982 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.982 "name": "Existed_Raid", 00:15:33.982 "uuid": "dd6205b0-963b-41c5-9bbc-1cbf45dc5a46", 00:15:33.982 "strip_size_kb": 0, 00:15:33.982 "state": "online", 00:15:33.982 "raid_level": "raid1", 00:15:33.982 "superblock": true, 00:15:33.982 "num_base_bdevs": 2, 00:15:33.982 "num_base_bdevs_discovered": 1, 00:15:33.982 "num_base_bdevs_operational": 1, 00:15:33.982 "base_bdevs_list": [ 00:15:33.982 { 00:15:33.982 "name": null, 00:15:33.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.982 "is_configured": false, 00:15:33.982 "data_offset": 2048, 
00:15:33.982 "data_size": 63488 00:15:33.982 }, 00:15:33.982 { 00:15:33.982 "name": "BaseBdev2", 00:15:33.982 "uuid": "e51f2d9f-c40f-4542-8cc2-f18b7214a199", 00:15:33.982 "is_configured": true, 00:15:33.982 "data_offset": 2048, 00:15:33.982 "data_size": 63488 00:15:33.982 } 00:15:33.982 ] 00:15:33.982 }' 00:15:33.982 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.982 06:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:34.551 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:34.551 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:34.551 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:34.551 06:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.810 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:34.811 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:34.811 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:35.070 [2024-07-25 06:31:48.394052] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:35.070 [2024-07-25 06:31:48.394132] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:35.070 [2024-07-25 06:31:48.404541] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:35.070 [2024-07-25 06:31:48.404570] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:35.070 [2024-07-25 06:31:48.404580] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde4120 name Existed_Raid, state offline 00:15:35.070 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:35.070 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:35.070 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.070 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1112187 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1112187 ']' 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1112187 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- 
# '[' Linux = Linux ']' 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1112187 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1112187' 00:15:35.330 killing process with pid 1112187 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1112187 00:15:35.330 [2024-07-25 06:31:48.710980] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:35.330 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1112187 00:15:35.330 [2024-07-25 06:31:48.711827] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:35.590 06:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:35.590 00:15:35.590 real 0m9.976s 00:15:35.590 user 0m17.678s 00:15:35.590 sys 0m1.927s 00:15:35.590 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:35.590 06:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:35.590 ************************************ 00:15:35.591 END TEST raid_state_function_test_sb 00:15:35.591 ************************************ 00:15:35.591 06:31:48 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:15:35.591 06:31:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:15:35.591 06:31:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:35.591 06:31:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:35.591 ************************************ 00:15:35.591 START TEST raid_superblock_test 00:15:35.591 ************************************ 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@417 -- # local raid_bdev 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1114191 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1114191 /var/tmp/spdk-raid.sock 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1114191 ']' 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:35.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:35.591 06:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.591 [2024-07-25 06:31:49.029163] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:35.591 [2024-07-25 06:31:49.029219] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1114191 ] 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:35.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:35.591 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:35.851 [2024-07-25 06:31:49.164327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:35.851 [2024-07-25 06:31:49.209321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:35.851 [2024-07-25 06:31:49.263879] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:35.851 [2024-07-25 06:31:49.263916] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 
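The stretch of the trace that follows builds the two base devices for the superblock test: each is a 32 MB malloc bdev (65536 blocks of 512 bytes) wrapped in a passthru bdev with a fixed UUID, and the two passthru bdevs are then assembled into the raid1 volume raid_bdev1 with an on-disk superblock. Driven by hand against the same RPC socket, the sequence amounts to roughly the sketch below; the rpc.py path, socket, sizes, names and UUIDs simply mirror the values visible in this trace and are not requirements of the RPC interface.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # First base device: 32 MB malloc bdev with 512-byte blocks, wrapped in passthru bdev pt1.
    $rpc bdev_malloc_create 32 512 -b malloc1
    $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # Second base device, same shape, wrapped in pt2.
    $rpc bdev_malloc_create 32 512 -b malloc2
    $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    # Assemble the passthru bdevs into a raid1 volume; -s requests an on-disk superblock.
    $rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s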
00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:36.420 06:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:36.680 malloc1 00:15:36.680 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:36.940 [2024-07-25 06:31:50.373516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:36.940 [2024-07-25 06:31:50.373560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.940 [2024-07-25 06:31:50.373579] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12d9d70 00:15:36.940 [2024-07-25 06:31:50.373591] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.940 [2024-07-25 06:31:50.375065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.940 [2024-07-25 06:31:50.375092] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:36.940 pt1 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:36.940 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:37.200 malloc2 00:15:37.200 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:37.459 [2024-07-25 06:31:50.823151] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:37.459 [2024-07-25 06:31:50.823190] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.459 [2024-07-25 06:31:50.823206] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1128790 00:15:37.459 [2024-07-25 06:31:50.823218] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.459 [2024-07-25 06:31:50.824533] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.459 [2024-07-25 06:31:50.824558] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:37.459 pt2 00:15:37.459 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:37.459 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:37.459 06:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:15:37.719 [2024-07-25 06:31:51.039723] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:37.719 [2024-07-25 06:31:51.040812] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:37.719 [2024-07-25 06:31:51.040940] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12cb1c0 00:15:37.719 [2024-07-25 06:31:51.040952] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:37.719 [2024-07-25 06:31:51.041121] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126880 00:15:37.719 [2024-07-25 06:31:51.041261] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12cb1c0 00:15:37.719 [2024-07-25 06:31:51.041271] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12cb1c0 00:15:37.719 [2024-07-25 06:31:51.041355] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.719 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
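The JSON dump that follows is the output of that query; verify_raid_bdev_state does its job by comparing individual fields of this object against the expected values passed to it (online, raid1, strip size 0, 2 operational base bdevs). A hand-rolled sketch of that kind of check appears below; the expected values are taken from the dump that follows, and the exact set of fields the helper compares is an assumption here rather than something spelled out in this trace.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Pull the entry for raid_bdev1 out of the full RAID bdev listing.
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Field-by-field comparison against the expected configuration.
    [ "$(jq -r '.state' <<<"$info")" = online ]
    [ "$(jq -r '.raid_level' <<<"$info")" = raid1 ]
    [ "$(jq -r '.num_base_bdevs_discovered' <<<"$info")" -eq 2 ]
    [ "$(jq -r '.num_base_bdevs_operational' <<<"$info")" -eq 2 ]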
00:15:37.979 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.979 "name": "raid_bdev1", 00:15:37.979 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:37.979 "strip_size_kb": 0, 00:15:37.979 "state": "online", 00:15:37.979 "raid_level": "raid1", 00:15:37.979 "superblock": true, 00:15:37.979 "num_base_bdevs": 2, 00:15:37.979 "num_base_bdevs_discovered": 2, 00:15:37.979 "num_base_bdevs_operational": 2, 00:15:37.979 "base_bdevs_list": [ 00:15:37.979 { 00:15:37.979 "name": "pt1", 00:15:37.979 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.979 "is_configured": true, 00:15:37.979 "data_offset": 2048, 00:15:37.979 "data_size": 63488 00:15:37.979 }, 00:15:37.979 { 00:15:37.979 "name": "pt2", 00:15:37.979 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.979 "is_configured": true, 00:15:37.979 "data_offset": 2048, 00:15:37.979 "data_size": 63488 00:15:37.979 } 00:15:37.979 ] 00:15:37.979 }' 00:15:37.979 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.979 06:31:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:38.549 06:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.549 [2024-07-25 06:31:52.074663] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.549 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.549 "name": "raid_bdev1", 00:15:38.549 "aliases": [ 00:15:38.549 "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0" 00:15:38.549 ], 00:15:38.549 "product_name": "Raid Volume", 00:15:38.549 "block_size": 512, 00:15:38.549 "num_blocks": 63488, 00:15:38.549 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:38.549 "assigned_rate_limits": { 00:15:38.549 "rw_ios_per_sec": 0, 00:15:38.549 "rw_mbytes_per_sec": 0, 00:15:38.549 "r_mbytes_per_sec": 0, 00:15:38.549 "w_mbytes_per_sec": 0 00:15:38.549 }, 00:15:38.549 "claimed": false, 00:15:38.549 "zoned": false, 00:15:38.549 "supported_io_types": { 00:15:38.549 "read": true, 00:15:38.549 "write": true, 00:15:38.549 "unmap": false, 00:15:38.549 "flush": false, 00:15:38.549 "reset": true, 00:15:38.549 "nvme_admin": false, 00:15:38.549 "nvme_io": false, 00:15:38.549 "nvme_io_md": false, 00:15:38.549 "write_zeroes": true, 00:15:38.549 "zcopy": false, 00:15:38.549 "get_zone_info": false, 00:15:38.549 "zone_management": false, 00:15:38.549 "zone_append": false, 00:15:38.549 "compare": false, 00:15:38.549 "compare_and_write": false, 00:15:38.549 "abort": false, 00:15:38.549 "seek_hole": false, 00:15:38.549 "seek_data": false, 00:15:38.549 "copy": false, 00:15:38.549 "nvme_iov_md": 
false 00:15:38.549 }, 00:15:38.549 "memory_domains": [ 00:15:38.549 { 00:15:38.549 "dma_device_id": "system", 00:15:38.549 "dma_device_type": 1 00:15:38.549 }, 00:15:38.549 { 00:15:38.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.549 "dma_device_type": 2 00:15:38.549 }, 00:15:38.549 { 00:15:38.549 "dma_device_id": "system", 00:15:38.549 "dma_device_type": 1 00:15:38.549 }, 00:15:38.549 { 00:15:38.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.549 "dma_device_type": 2 00:15:38.549 } 00:15:38.549 ], 00:15:38.549 "driver_specific": { 00:15:38.549 "raid": { 00:15:38.549 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:38.549 "strip_size_kb": 0, 00:15:38.549 "state": "online", 00:15:38.549 "raid_level": "raid1", 00:15:38.549 "superblock": true, 00:15:38.549 "num_base_bdevs": 2, 00:15:38.549 "num_base_bdevs_discovered": 2, 00:15:38.549 "num_base_bdevs_operational": 2, 00:15:38.549 "base_bdevs_list": [ 00:15:38.549 { 00:15:38.549 "name": "pt1", 00:15:38.549 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.549 "is_configured": true, 00:15:38.549 "data_offset": 2048, 00:15:38.549 "data_size": 63488 00:15:38.549 }, 00:15:38.549 { 00:15:38.549 "name": "pt2", 00:15:38.549 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.549 "is_configured": true, 00:15:38.549 "data_offset": 2048, 00:15:38.549 "data_size": 63488 00:15:38.549 } 00:15:38.549 ] 00:15:38.549 } 00:15:38.549 } 00:15:38.549 }' 00:15:38.549 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.808 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:38.808 pt2' 00:15:38.808 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.808 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:38.809 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.809 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.809 "name": "pt1", 00:15:38.809 "aliases": [ 00:15:38.809 "00000000-0000-0000-0000-000000000001" 00:15:38.809 ], 00:15:38.809 "product_name": "passthru", 00:15:38.809 "block_size": 512, 00:15:38.809 "num_blocks": 65536, 00:15:38.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.809 "assigned_rate_limits": { 00:15:38.809 "rw_ios_per_sec": 0, 00:15:38.809 "rw_mbytes_per_sec": 0, 00:15:38.809 "r_mbytes_per_sec": 0, 00:15:38.809 "w_mbytes_per_sec": 0 00:15:38.809 }, 00:15:38.809 "claimed": true, 00:15:38.809 "claim_type": "exclusive_write", 00:15:38.809 "zoned": false, 00:15:38.809 "supported_io_types": { 00:15:38.809 "read": true, 00:15:38.809 "write": true, 00:15:38.809 "unmap": true, 00:15:38.809 "flush": true, 00:15:38.809 "reset": true, 00:15:38.809 "nvme_admin": false, 00:15:38.809 "nvme_io": false, 00:15:38.809 "nvme_io_md": false, 00:15:38.809 "write_zeroes": true, 00:15:38.809 "zcopy": true, 00:15:38.809 "get_zone_info": false, 00:15:38.809 "zone_management": false, 00:15:38.809 "zone_append": false, 00:15:38.809 "compare": false, 00:15:38.809 "compare_and_write": false, 00:15:38.809 "abort": true, 00:15:38.809 "seek_hole": false, 00:15:38.809 "seek_data": false, 00:15:38.809 "copy": true, 00:15:38.809 "nvme_iov_md": false 00:15:38.809 }, 00:15:38.809 "memory_domains": [ 
00:15:38.809 { 00:15:38.809 "dma_device_id": "system", 00:15:38.809 "dma_device_type": 1 00:15:38.809 }, 00:15:38.809 { 00:15:38.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.809 "dma_device_type": 2 00:15:38.809 } 00:15:38.809 ], 00:15:38.809 "driver_specific": { 00:15:38.809 "passthru": { 00:15:38.809 "name": "pt1", 00:15:38.809 "base_bdev_name": "malloc1" 00:15:38.809 } 00:15:38.809 } 00:15:38.809 }' 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.068 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.330 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.330 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.330 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.330 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:39.330 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.589 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.589 "name": "pt2", 00:15:39.589 "aliases": [ 00:15:39.589 "00000000-0000-0000-0000-000000000002" 00:15:39.589 ], 00:15:39.589 "product_name": "passthru", 00:15:39.589 "block_size": 512, 00:15:39.589 "num_blocks": 65536, 00:15:39.589 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.589 "assigned_rate_limits": { 00:15:39.589 "rw_ios_per_sec": 0, 00:15:39.589 "rw_mbytes_per_sec": 0, 00:15:39.589 "r_mbytes_per_sec": 0, 00:15:39.589 "w_mbytes_per_sec": 0 00:15:39.589 }, 00:15:39.589 "claimed": true, 00:15:39.589 "claim_type": "exclusive_write", 00:15:39.589 "zoned": false, 00:15:39.589 "supported_io_types": { 00:15:39.589 "read": true, 00:15:39.589 "write": true, 00:15:39.589 "unmap": true, 00:15:39.589 "flush": true, 00:15:39.589 "reset": true, 00:15:39.589 "nvme_admin": false, 00:15:39.589 "nvme_io": false, 00:15:39.589 "nvme_io_md": false, 00:15:39.589 "write_zeroes": true, 00:15:39.589 "zcopy": true, 00:15:39.589 "get_zone_info": false, 00:15:39.590 "zone_management": false, 00:15:39.590 "zone_append": false, 00:15:39.590 "compare": false, 00:15:39.590 "compare_and_write": false, 00:15:39.590 "abort": true, 00:15:39.590 "seek_hole": false, 00:15:39.590 "seek_data": false, 00:15:39.590 "copy": true, 00:15:39.590 "nvme_iov_md": false 00:15:39.590 }, 00:15:39.590 "memory_domains": [ 00:15:39.590 { 00:15:39.590 "dma_device_id": "system", 00:15:39.590 "dma_device_type": 1 00:15:39.590 }, 00:15:39.590 { 00:15:39.590 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.590 "dma_device_type": 2 00:15:39.590 } 00:15:39.590 ], 00:15:39.590 "driver_specific": { 00:15:39.590 "passthru": { 00:15:39.590 "name": "pt2", 00:15:39.590 "base_bdev_name": "malloc2" 00:15:39.590 } 00:15:39.590 } 00:15:39.590 }' 00:15:39.590 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.590 06:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.590 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.590 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.590 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.590 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.590 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.590 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.850 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.850 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.850 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.850 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.850 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:39.850 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:15:40.109 [2024-07-25 06:31:53.466453] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.109 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0 00:15:40.109 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0 ']' 00:15:40.109 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:40.368 [2024-07-25 06:31:53.694817] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:40.368 [2024-07-25 06:31:53.694833] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:40.368 [2024-07-25 06:31:53.694882] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.368 [2024-07-25 06:31:53.694929] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:40.368 [2024-07-25 06:31:53.694940] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cb1c0 name raid_bdev1, state offline 00:15:40.368 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.368 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:15:40.627 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:15:40.627 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:15:40.627 06:31:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:40.627 06:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:40.627 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:40.627 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:40.887 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:40.887 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:41.146 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:15:41.406 [2024-07-25 06:31:54.821738] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:41.406 [2024-07-25 06:31:54.822961] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:41.406 [2024-07-25 06:31:54.823009] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:41.406 [2024-07-25 06:31:54.823046] 
bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:41.406 [2024-07-25 06:31:54.823063] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:41.406 [2024-07-25 06:31:54.823071] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cc5a0 name raid_bdev1, state configuring 00:15:41.406 request: 00:15:41.406 { 00:15:41.406 "name": "raid_bdev1", 00:15:41.406 "raid_level": "raid1", 00:15:41.406 "base_bdevs": [ 00:15:41.406 "malloc1", 00:15:41.406 "malloc2" 00:15:41.406 ], 00:15:41.406 "superblock": false, 00:15:41.406 "method": "bdev_raid_create", 00:15:41.406 "req_id": 1 00:15:41.406 } 00:15:41.406 Got JSON-RPC error response 00:15:41.406 response: 00:15:41.406 { 00:15:41.406 "code": -17, 00:15:41.406 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:41.406 } 00:15:41.406 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:15:41.406 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:41.406 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:41.406 06:31:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:41.406 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:15:41.406 06:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.665 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:15:41.665 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:15:41.665 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:41.924 [2024-07-25 06:31:55.278891] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:41.924 [2024-07-25 06:31:55.278933] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.924 [2024-07-25 06:31:55.278949] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12caf60 00:15:41.924 [2024-07-25 06:31:55.278960] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.924 [2024-07-25 06:31:55.280420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.924 [2024-07-25 06:31:55.280445] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:41.924 [2024-07-25 06:31:55.280503] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:41.924 [2024-07-25 06:31:55.280526] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:41.924 pt1 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:41.924 06:31:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.924 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:42.183 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.183 "name": "raid_bdev1", 00:15:42.183 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:42.183 "strip_size_kb": 0, 00:15:42.183 "state": "configuring", 00:15:42.183 "raid_level": "raid1", 00:15:42.183 "superblock": true, 00:15:42.183 "num_base_bdevs": 2, 00:15:42.183 "num_base_bdevs_discovered": 1, 00:15:42.183 "num_base_bdevs_operational": 2, 00:15:42.183 "base_bdevs_list": [ 00:15:42.183 { 00:15:42.183 "name": "pt1", 00:15:42.183 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:42.183 "is_configured": true, 00:15:42.183 "data_offset": 2048, 00:15:42.183 "data_size": 63488 00:15:42.183 }, 00:15:42.183 { 00:15:42.183 "name": null, 00:15:42.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.184 "is_configured": false, 00:15:42.184 "data_offset": 2048, 00:15:42.184 "data_size": 63488 00:15:42.184 } 00:15:42.184 ] 00:15:42.184 }' 00:15:42.184 06:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.184 06:31:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.751 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:15:42.751 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:15:42.751 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:42.751 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:43.009 [2024-07-25 06:31:56.321855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:43.009 [2024-07-25 06:31:56.321901] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.009 [2024-07-25 06:31:56.321919] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12cc9d0 00:15:43.009 [2024-07-25 06:31:56.321931] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.009 [2024-07-25 06:31:56.322241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.009 [2024-07-25 06:31:56.322259] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:43.009 [2024-07-25 06:31:56.322313] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:43.009 [2024-07-25 06:31:56.322331] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:43.009 [2024-07-25 06:31:56.322416] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12cf700 00:15:43.009 [2024-07-25 06:31:56.322426] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:43.009 [2024-07-25 06:31:56.322579] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12d90a0 00:15:43.009 [2024-07-25 06:31:56.322697] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12cf700 00:15:43.009 [2024-07-25 06:31:56.322706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12cf700 00:15:43.009 [2024-07-25 06:31:56.322793] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:43.009 pt2 00:15:43.009 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:15:43.009 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:43.009 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:43.009 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.010 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.268 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.268 "name": "raid_bdev1", 00:15:43.268 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:43.268 "strip_size_kb": 0, 00:15:43.268 "state": "online", 00:15:43.268 "raid_level": "raid1", 00:15:43.268 "superblock": true, 00:15:43.268 "num_base_bdevs": 2, 00:15:43.268 "num_base_bdevs_discovered": 2, 00:15:43.268 "num_base_bdevs_operational": 2, 00:15:43.268 "base_bdevs_list": [ 00:15:43.268 { 00:15:43.268 "name": "pt1", 00:15:43.268 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.268 "is_configured": true, 00:15:43.268 "data_offset": 2048, 00:15:43.268 "data_size": 63488 00:15:43.268 }, 00:15:43.268 { 00:15:43.268 "name": "pt2", 00:15:43.268 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.268 "is_configured": true, 00:15:43.268 "data_offset": 2048, 00:15:43.268 "data_size": 63488 00:15:43.268 } 00:15:43.268 ] 00:15:43.268 }' 00:15:43.268 06:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.268 
06:31:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:43.836 [2024-07-25 06:31:57.344771] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:43.836 "name": "raid_bdev1", 00:15:43.836 "aliases": [ 00:15:43.836 "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0" 00:15:43.836 ], 00:15:43.836 "product_name": "Raid Volume", 00:15:43.836 "block_size": 512, 00:15:43.836 "num_blocks": 63488, 00:15:43.836 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:43.836 "assigned_rate_limits": { 00:15:43.836 "rw_ios_per_sec": 0, 00:15:43.836 "rw_mbytes_per_sec": 0, 00:15:43.836 "r_mbytes_per_sec": 0, 00:15:43.836 "w_mbytes_per_sec": 0 00:15:43.836 }, 00:15:43.836 "claimed": false, 00:15:43.836 "zoned": false, 00:15:43.836 "supported_io_types": { 00:15:43.836 "read": true, 00:15:43.836 "write": true, 00:15:43.836 "unmap": false, 00:15:43.836 "flush": false, 00:15:43.836 "reset": true, 00:15:43.836 "nvme_admin": false, 00:15:43.836 "nvme_io": false, 00:15:43.836 "nvme_io_md": false, 00:15:43.836 "write_zeroes": true, 00:15:43.836 "zcopy": false, 00:15:43.836 "get_zone_info": false, 00:15:43.836 "zone_management": false, 00:15:43.836 "zone_append": false, 00:15:43.836 "compare": false, 00:15:43.836 "compare_and_write": false, 00:15:43.836 "abort": false, 00:15:43.836 "seek_hole": false, 00:15:43.836 "seek_data": false, 00:15:43.836 "copy": false, 00:15:43.836 "nvme_iov_md": false 00:15:43.836 }, 00:15:43.836 "memory_domains": [ 00:15:43.836 { 00:15:43.836 "dma_device_id": "system", 00:15:43.836 "dma_device_type": 1 00:15:43.836 }, 00:15:43.836 { 00:15:43.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.836 "dma_device_type": 2 00:15:43.836 }, 00:15:43.836 { 00:15:43.836 "dma_device_id": "system", 00:15:43.836 "dma_device_type": 1 00:15:43.836 }, 00:15:43.836 { 00:15:43.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.836 "dma_device_type": 2 00:15:43.836 } 00:15:43.836 ], 00:15:43.836 "driver_specific": { 00:15:43.836 "raid": { 00:15:43.836 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:43.836 "strip_size_kb": 0, 00:15:43.836 "state": "online", 00:15:43.836 "raid_level": "raid1", 00:15:43.836 "superblock": true, 00:15:43.836 "num_base_bdevs": 2, 00:15:43.836 "num_base_bdevs_discovered": 2, 00:15:43.836 "num_base_bdevs_operational": 2, 00:15:43.836 "base_bdevs_list": [ 00:15:43.836 { 00:15:43.836 "name": "pt1", 00:15:43.836 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.836 "is_configured": true, 
00:15:43.836 "data_offset": 2048, 00:15:43.836 "data_size": 63488 00:15:43.836 }, 00:15:43.836 { 00:15:43.836 "name": "pt2", 00:15:43.836 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.836 "is_configured": true, 00:15:43.836 "data_offset": 2048, 00:15:43.836 "data_size": 63488 00:15:43.836 } 00:15:43.836 ] 00:15:43.836 } 00:15:43.836 } 00:15:43.836 }' 00:15:43.836 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:44.095 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:44.095 pt2' 00:15:44.095 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:44.095 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:44.095 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.095 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.095 "name": "pt1", 00:15:44.095 "aliases": [ 00:15:44.095 "00000000-0000-0000-0000-000000000001" 00:15:44.095 ], 00:15:44.095 "product_name": "passthru", 00:15:44.095 "block_size": 512, 00:15:44.095 "num_blocks": 65536, 00:15:44.095 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:44.095 "assigned_rate_limits": { 00:15:44.095 "rw_ios_per_sec": 0, 00:15:44.095 "rw_mbytes_per_sec": 0, 00:15:44.095 "r_mbytes_per_sec": 0, 00:15:44.095 "w_mbytes_per_sec": 0 00:15:44.095 }, 00:15:44.095 "claimed": true, 00:15:44.095 "claim_type": "exclusive_write", 00:15:44.095 "zoned": false, 00:15:44.095 "supported_io_types": { 00:15:44.095 "read": true, 00:15:44.095 "write": true, 00:15:44.095 "unmap": true, 00:15:44.095 "flush": true, 00:15:44.095 "reset": true, 00:15:44.095 "nvme_admin": false, 00:15:44.095 "nvme_io": false, 00:15:44.095 "nvme_io_md": false, 00:15:44.095 "write_zeroes": true, 00:15:44.095 "zcopy": true, 00:15:44.095 "get_zone_info": false, 00:15:44.095 "zone_management": false, 00:15:44.095 "zone_append": false, 00:15:44.095 "compare": false, 00:15:44.095 "compare_and_write": false, 00:15:44.095 "abort": true, 00:15:44.095 "seek_hole": false, 00:15:44.095 "seek_data": false, 00:15:44.095 "copy": true, 00:15:44.095 "nvme_iov_md": false 00:15:44.095 }, 00:15:44.095 "memory_domains": [ 00:15:44.095 { 00:15:44.095 "dma_device_id": "system", 00:15:44.095 "dma_device_type": 1 00:15:44.095 }, 00:15:44.095 { 00:15:44.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.095 "dma_device_type": 2 00:15:44.095 } 00:15:44.095 ], 00:15:44.095 "driver_specific": { 00:15:44.095 "passthru": { 00:15:44.095 "name": "pt1", 00:15:44.095 "base_bdev_name": "malloc1" 00:15:44.095 } 00:15:44.095 } 00:15:44.095 }' 00:15:44.095 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.354 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.613 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.613 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.613 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:44.613 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:44.613 06:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.871 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.871 "name": "pt2", 00:15:44.871 "aliases": [ 00:15:44.871 "00000000-0000-0000-0000-000000000002" 00:15:44.871 ], 00:15:44.871 "product_name": "passthru", 00:15:44.871 "block_size": 512, 00:15:44.871 "num_blocks": 65536, 00:15:44.871 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.871 "assigned_rate_limits": { 00:15:44.871 "rw_ios_per_sec": 0, 00:15:44.871 "rw_mbytes_per_sec": 0, 00:15:44.871 "r_mbytes_per_sec": 0, 00:15:44.871 "w_mbytes_per_sec": 0 00:15:44.871 }, 00:15:44.871 "claimed": true, 00:15:44.871 "claim_type": "exclusive_write", 00:15:44.871 "zoned": false, 00:15:44.871 "supported_io_types": { 00:15:44.871 "read": true, 00:15:44.871 "write": true, 00:15:44.871 "unmap": true, 00:15:44.871 "flush": true, 00:15:44.871 "reset": true, 00:15:44.871 "nvme_admin": false, 00:15:44.871 "nvme_io": false, 00:15:44.871 "nvme_io_md": false, 00:15:44.871 "write_zeroes": true, 00:15:44.871 "zcopy": true, 00:15:44.871 "get_zone_info": false, 00:15:44.871 "zone_management": false, 00:15:44.871 "zone_append": false, 00:15:44.871 "compare": false, 00:15:44.871 "compare_and_write": false, 00:15:44.871 "abort": true, 00:15:44.871 "seek_hole": false, 00:15:44.871 "seek_data": false, 00:15:44.871 "copy": true, 00:15:44.871 "nvme_iov_md": false 00:15:44.871 }, 00:15:44.871 "memory_domains": [ 00:15:44.871 { 00:15:44.871 "dma_device_id": "system", 00:15:44.871 "dma_device_type": 1 00:15:44.871 }, 00:15:44.871 { 00:15:44.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.871 "dma_device_type": 2 00:15:44.871 } 00:15:44.872 ], 00:15:44.872 "driver_specific": { 00:15:44.872 "passthru": { 00:15:44.872 "name": "pt2", 00:15:44.872 "base_bdev_name": "malloc2" 00:15:44.872 } 00:15:44.872 } 00:15:44.872 }' 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.872 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:15:45.130 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:45.130 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.130 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.130 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:45.130 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:45.130 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:15:45.422 [2024-07-25 06:31:58.740426] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:45.422 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0 '!=' ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0 ']' 00:15:45.422 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:15:45.422 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:45.422 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:45.422 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:45.681 [2024-07-25 06:31:58.968903] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.681 06:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:45.681 06:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.681 "name": "raid_bdev1", 00:15:45.681 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:45.681 "strip_size_kb": 0, 00:15:45.681 "state": "online", 00:15:45.681 "raid_level": "raid1", 00:15:45.681 "superblock": true, 00:15:45.681 "num_base_bdevs": 2, 00:15:45.681 "num_base_bdevs_discovered": 1, 00:15:45.681 "num_base_bdevs_operational": 1, 00:15:45.681 "base_bdevs_list": [ 00:15:45.681 { 00:15:45.681 "name": null, 00:15:45.681 
"uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.681 "is_configured": false, 00:15:45.681 "data_offset": 2048, 00:15:45.681 "data_size": 63488 00:15:45.681 }, 00:15:45.681 { 00:15:45.681 "name": "pt2", 00:15:45.681 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.681 "is_configured": true, 00:15:45.681 "data_offset": 2048, 00:15:45.681 "data_size": 63488 00:15:45.681 } 00:15:45.681 ] 00:15:45.681 }' 00:15:45.681 06:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.681 06:31:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.248 06:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:46.507 [2024-07-25 06:31:59.983540] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:46.507 [2024-07-25 06:31:59.983563] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:46.507 [2024-07-25 06:31:59.983611] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:46.507 [2024-07-25 06:31:59.983650] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:46.507 [2024-07-25 06:31:59.983660] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cf700 name raid_bdev1, state offline 00:15:46.507 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.507 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:15:46.766 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:15:46.766 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:15:46.766 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:15:46.766 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:15:46.766 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:47.025 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:15:47.025 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:15:47.025 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:15:47.025 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:15:47.025 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=1 00:15:47.025 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:47.284 [2024-07-25 06:32:00.669317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:47.284 [2024-07-25 06:32:00.669358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.284 [2024-07-25 06:32:00.669373] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11280e0 00:15:47.284 [2024-07-25 06:32:00.669384] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.284 [2024-07-25 06:32:00.670867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.284 [2024-07-25 06:32:00.670895] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:47.284 [2024-07-25 06:32:00.670953] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:47.284 [2024-07-25 06:32:00.670976] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:47.284 [2024-07-25 06:32:00.671049] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12d0250 00:15:47.284 [2024-07-25 06:32:00.671060] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:47.284 [2024-07-25 06:32:00.671226] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126f00 00:15:47.284 [2024-07-25 06:32:00.671336] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12d0250 00:15:47.284 [2024-07-25 06:32:00.671346] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12d0250 00:15:47.284 [2024-07-25 06:32:00.671433] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:47.284 pt2 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.285 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:47.543 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.543 "name": "raid_bdev1", 00:15:47.543 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:47.543 "strip_size_kb": 0, 00:15:47.543 "state": "online", 00:15:47.543 "raid_level": "raid1", 00:15:47.543 "superblock": true, 00:15:47.543 "num_base_bdevs": 2, 00:15:47.543 "num_base_bdevs_discovered": 1, 00:15:47.543 "num_base_bdevs_operational": 1, 00:15:47.543 "base_bdevs_list": [ 00:15:47.543 { 00:15:47.543 "name": null, 00:15:47.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.543 "is_configured": false, 00:15:47.543 "data_offset": 2048, 00:15:47.543 "data_size": 63488 00:15:47.543 }, 00:15:47.543 { 00:15:47.543 "name": "pt2", 00:15:47.543 "uuid": "00000000-0000-0000-0000-000000000002", 
00:15:47.543 "is_configured": true, 00:15:47.543 "data_offset": 2048, 00:15:47.543 "data_size": 63488 00:15:47.543 } 00:15:47.543 ] 00:15:47.543 }' 00:15:47.543 06:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.543 06:32:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.111 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:48.369 [2024-07-25 06:32:01.692008] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:48.369 [2024-07-25 06:32:01.692031] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:48.369 [2024-07-25 06:32:01.692080] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:48.369 [2024-07-25 06:32:01.692117] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:48.369 [2024-07-25 06:32:01.692127] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d0250 name raid_bdev1, state offline 00:15:48.369 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.369 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:15:48.628 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:15:48.628 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:15:48.628 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:15:48.628 06:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:48.628 [2024-07-25 06:32:02.153210] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:48.628 [2024-07-25 06:32:02.153256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.629 [2024-07-25 06:32:02.153274] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111dea0 00:15:48.629 [2024-07-25 06:32:02.153285] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.629 [2024-07-25 06:32:02.154748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.629 [2024-07-25 06:32:02.154774] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:48.629 [2024-07-25 06:32:02.154830] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:48.629 [2024-07-25 06:32:02.154852] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:48.629 [2024-07-25 06:32:02.154938] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:15:48.629 [2024-07-25 06:32:02.154950] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:48.629 [2024-07-25 06:32:02.154962] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x111e890 name raid_bdev1, state configuring 00:15:48.629 [2024-07-25 06:32:02.154982] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt2 is claimed 00:15:48.629 [2024-07-25 06:32:02.155030] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12cfd30 00:15:48.629 [2024-07-25 06:32:02.155039] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:48.629 [2024-07-25 06:32:02.155198] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126f00 00:15:48.629 [2024-07-25 06:32:02.155309] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12cfd30 00:15:48.629 [2024-07-25 06:32:02.155318] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12cfd30 00:15:48.629 [2024-07-25 06:32:02.155403] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.629 pt1 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.629 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.888 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.888 "name": "raid_bdev1", 00:15:48.888 "uuid": "ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0", 00:15:48.888 "strip_size_kb": 0, 00:15:48.888 "state": "online", 00:15:48.888 "raid_level": "raid1", 00:15:48.888 "superblock": true, 00:15:48.888 "num_base_bdevs": 2, 00:15:48.888 "num_base_bdevs_discovered": 1, 00:15:48.888 "num_base_bdevs_operational": 1, 00:15:48.888 "base_bdevs_list": [ 00:15:48.888 { 00:15:48.888 "name": null, 00:15:48.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.888 "is_configured": false, 00:15:48.888 "data_offset": 2048, 00:15:48.888 "data_size": 63488 00:15:48.888 }, 00:15:48.888 { 00:15:48.888 "name": "pt2", 00:15:48.888 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:48.888 "is_configured": true, 00:15:48.888 "data_offset": 2048, 00:15:48.888 "data_size": 63488 00:15:48.888 } 00:15:48.888 ] 00:15:48.888 }' 00:15:48.888 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.888 06:32:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.455 06:32:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:15:49.455 06:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:15:49.714 06:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:15:49.715 06:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:49.715 06:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:15:49.974 [2024-07-25 06:32:03.432784] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0 '!=' ff46b0d7-3004-4ceb-a05b-d0cdef5c65d0 ']' 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1114191 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1114191 ']' 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1114191 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1114191 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1114191' 00:15:49.974 killing process with pid 1114191 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1114191 00:15:49.974 [2024-07-25 06:32:03.502121] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:49.974 [2024-07-25 06:32:03.502175] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:49.974 [2024-07-25 06:32:03.502215] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:49.974 [2024-07-25 06:32:03.502225] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cfd30 name raid_bdev1, state offline 00:15:49.974 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1114191 00:15:49.974 [2024-07-25 06:32:03.518109] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:50.233 06:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:15:50.233 00:15:50.233 real 0m14.719s 00:15:50.233 user 0m26.705s 00:15:50.233 sys 0m2.713s 00:15:50.233 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:50.233 06:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.233 ************************************ 00:15:50.233 END TEST raid_superblock_test 00:15:50.233 ************************************ 00:15:50.233 06:32:03 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:15:50.233 06:32:03 
bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:50.233 06:32:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:50.233 06:32:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:50.233 ************************************ 00:15:50.233 START TEST raid_read_error_test 00:15:50.233 ************************************ 00:15:50.233 06:32:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 read 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:15:50.234 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.ZpjKDvugEj 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1116910 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1116910 /var/tmp/spdk-raid.sock 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1116910 ']' 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:50.493 06:32:03 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:50.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:50.493 06:32:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.493 [2024-07-25 06:32:03.849531] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:50.493 [2024-07-25 06:32:03.849586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1116910 ] 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.1 cannot be used 
00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.4 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:50.493 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:50.493 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:50.493 [2024-07-25 06:32:03.987050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:50.493 [2024-07-25 06:32:04.032166] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:50.751 [2024-07-25 06:32:04.092999] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:50.751 [2024-07-25 06:32:04.093036] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:51.318 06:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:51.318 06:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:51.318 06:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:51.318 06:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:51.576 BaseBdev1_malloc 00:15:51.576 06:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:51.835 true 00:15:51.835 06:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:52.094 [2024-07-25 06:32:05.409263] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
EE_BaseBdev1_malloc 00:15:52.094 [2024-07-25 06:32:05.409304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:52.094 [2024-07-25 06:32:05.409323] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2038a60 00:15:52.094 [2024-07-25 06:32:05.409334] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:52.094 [2024-07-25 06:32:05.410805] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:52.094 [2024-07-25 06:32:05.410833] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:52.094 BaseBdev1 00:15:52.094 06:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:52.094 06:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:52.094 BaseBdev2_malloc 00:15:52.354 06:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:52.354 true 00:15:52.354 06:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:52.613 [2024-07-25 06:32:06.095406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:52.613 [2024-07-25 06:32:06.095448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:52.613 [2024-07-25 06:32:06.095470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x203ddc0 00:15:52.613 [2024-07-25 06:32:06.095482] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:52.613 [2024-07-25 06:32:06.096857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:52.613 [2024-07-25 06:32:06.096884] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:52.613 BaseBdev2 00:15:52.613 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:52.873 [2024-07-25 06:32:06.307982] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:52.873 [2024-07-25 06:32:06.309080] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:52.873 [2024-07-25 06:32:06.309256] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x203ee90 00:15:52.873 [2024-07-25 06:32:06.309269] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:52.873 [2024-07-25 06:32:06.309435] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x203aab0 00:15:52.873 [2024-07-25 06:32:06.309571] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x203ee90 00:15:52.873 [2024-07-25 06:32:06.309581] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x203ee90 00:15:52.873 [2024-07-25 06:32:06.309673] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.873 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:53.132 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.132 "name": "raid_bdev1", 00:15:53.132 "uuid": "200069e1-6728-4b7a-92ff-4957e7620124", 00:15:53.132 "strip_size_kb": 0, 00:15:53.132 "state": "online", 00:15:53.132 "raid_level": "raid1", 00:15:53.132 "superblock": true, 00:15:53.132 "num_base_bdevs": 2, 00:15:53.132 "num_base_bdevs_discovered": 2, 00:15:53.132 "num_base_bdevs_operational": 2, 00:15:53.132 "base_bdevs_list": [ 00:15:53.132 { 00:15:53.132 "name": "BaseBdev1", 00:15:53.132 "uuid": "36acd986-add7-5362-9a54-31d34af1f2fe", 00:15:53.132 "is_configured": true, 00:15:53.132 "data_offset": 2048, 00:15:53.132 "data_size": 63488 00:15:53.132 }, 00:15:53.132 { 00:15:53.132 "name": "BaseBdev2", 00:15:53.132 "uuid": "e65e3620-63bf-51c3-8aad-fdef8480a190", 00:15:53.133 "is_configured": true, 00:15:53.133 "data_offset": 2048, 00:15:53.133 "data_size": 63488 00:15:53.133 } 00:15:53.133 ] 00:15:53.133 }' 00:15:53.133 06:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.133 06:32:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.701 06:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:53.701 06:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:53.701 [2024-07-25 06:32:07.238672] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e939e0 00:15:54.638 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:15:54.898 06:32:08 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.898 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:55.157 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.157 "name": "raid_bdev1", 00:15:55.157 "uuid": "200069e1-6728-4b7a-92ff-4957e7620124", 00:15:55.157 "strip_size_kb": 0, 00:15:55.157 "state": "online", 00:15:55.157 "raid_level": "raid1", 00:15:55.157 "superblock": true, 00:15:55.157 "num_base_bdevs": 2, 00:15:55.157 "num_base_bdevs_discovered": 2, 00:15:55.157 "num_base_bdevs_operational": 2, 00:15:55.157 "base_bdevs_list": [ 00:15:55.157 { 00:15:55.157 "name": "BaseBdev1", 00:15:55.157 "uuid": "36acd986-add7-5362-9a54-31d34af1f2fe", 00:15:55.157 "is_configured": true, 00:15:55.157 "data_offset": 2048, 00:15:55.157 "data_size": 63488 00:15:55.157 }, 00:15:55.157 { 00:15:55.157 "name": "BaseBdev2", 00:15:55.157 "uuid": "e65e3620-63bf-51c3-8aad-fdef8480a190", 00:15:55.157 "is_configured": true, 00:15:55.157 "data_offset": 2048, 00:15:55.157 "data_size": 63488 00:15:55.157 } 00:15:55.157 ] 00:15:55.157 }' 00:15:55.157 06:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.157 06:32:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.725 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:55.985 [2024-07-25 06:32:09.334423] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:55.985 [2024-07-25 06:32:09.334461] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.985 [2024-07-25 06:32:09.337383] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.985 [2024-07-25 06:32:09.337411] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:55.985 [2024-07-25 06:32:09.337476] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:55.985 [2024-07-25 
06:32:09.337487] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x203ee90 name raid_bdev1, state offline 00:15:55.985 0 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1116910 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1116910 ']' 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1116910 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1116910 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1116910' 00:15:55.985 killing process with pid 1116910 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1116910 00:15:55.985 [2024-07-25 06:32:09.413040] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:55.985 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1116910 00:15:55.985 [2024-07-25 06:32:09.422827] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.ZpjKDvugEj 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:56.245 00:15:56.245 real 0m5.839s 00:15:56.245 user 0m9.071s 00:15:56.245 sys 0m1.040s 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:56.245 06:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.245 ************************************ 00:15:56.245 END TEST raid_read_error_test 00:15:56.245 ************************************ 00:15:56.245 06:32:09 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:15:56.245 06:32:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:56.245 06:32:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:56.245 06:32:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:56.245 ************************************ 00:15:56.245 START TEST raid_write_error_test 00:15:56.245 ************************************ 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 write 00:15:56.245 06:32:09 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.L1WCt5chkU 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1118056 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1118056 /var/tmp/spdk-raid.sock 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1118056 ']' 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:56.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
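For readers following the trace: the raid_write_error_test setup above assembles the same layered bdev stack as the read test, driven entirely over the /var/tmp/spdk-raid.sock RPC socket. The sketch below replays that sequence by hand; every RPC method, size and name is taken from the rpc.py calls recorded in this log, and only the small loop and the comments are additions, so treat it as an illustrative reconstruction rather than the test's canonical code.

# Minimal sketch: rebuild the bdev stack the error tests assemble, using only
# RPC calls that appear verbatim in the trace. Assumes an SPDK app (here
# bdevperf) is already listening on /var/tmp/spdk-raid.sock.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for n in 1 2; do
  $RPC bdev_malloc_create 32 512 -b BaseBdev${n}_malloc        # 32 MiB malloc bdev, 512 B blocks (65536 blocks)
  $RPC bdev_error_create BaseBdev${n}_malloc                   # error-injection bdev, exposed as EE_BaseBdev${n}_malloc
  $RPC bdev_passthru_create -b EE_BaseBdev${n}_malloc -p BaseBdev${n}
done
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s   # -s: create with superblock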
00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:56.245 06:32:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.245 [2024-07-25 06:32:09.781176] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:15:56.245 [2024-07-25 06:32:09.781237] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118056 ] 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.0 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.1 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.2 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.3 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.4 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.5 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.6 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:01.7 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.0 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.1 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.2 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.3 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.4 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.5 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.6 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3d:02.7 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.0 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.1 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.2 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.3 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested 
device 0000:3f:01.4 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.5 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.6 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:01.7 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.504 EAL: Requested device 0000:3f:02.0 cannot be used 00:15:56.504 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.1 cannot be used 00:15:56.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.2 cannot be used 00:15:56.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.3 cannot be used 00:15:56.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.4 cannot be used 00:15:56.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.5 cannot be used 00:15:56.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.6 cannot be used 00:15:56.505 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:15:56.505 EAL: Requested device 0000:3f:02.7 cannot be used 00:15:56.505 [2024-07-25 06:32:09.919564] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:56.505 [2024-07-25 06:32:09.962410] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:56.505 [2024-07-25 06:32:10.026273] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.505 [2024-07-25 06:32:10.026308] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.439 06:32:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:57.439 06:32:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:57.439 06:32:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:57.439 06:32:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:57.439 BaseBdev1_malloc 00:15:57.439 06:32:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:57.697 true 00:15:57.697 06:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:57.955 [2024-07-25 06:32:11.354020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:57.955 [2024-07-25 06:32:11.354063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.955 [2024-07-25 06:32:11.354080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b0a60 00:15:57.955 [2024-07-25 06:32:11.354091] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:15:57.955 [2024-07-25 06:32:11.355503] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.955 [2024-07-25 06:32:11.355531] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:57.955 BaseBdev1 00:15:57.955 06:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:57.955 06:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:58.214 BaseBdev2_malloc 00:15:58.214 06:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:58.471 true 00:15:58.471 06:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:58.732 [2024-07-25 06:32:12.039852] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:58.733 [2024-07-25 06:32:12.039897] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:58.733 [2024-07-25 06:32:12.039917] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b5dc0 00:15:58.733 [2024-07-25 06:32:12.039928] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:58.733 [2024-07-25 06:32:12.041328] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:58.733 [2024-07-25 06:32:12.041359] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:58.733 BaseBdev2 00:15:58.733 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:58.733 [2024-07-25 06:32:12.268472] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:58.733 [2024-07-25 06:32:12.269584] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:58.733 [2024-07-25 06:32:12.269761] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13b6e90 00:15:58.733 [2024-07-25 06:32:12.269777] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:58.733 [2024-07-25 06:32:12.269951] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13b2ab0 00:15:58.733 [2024-07-25 06:32:12.270088] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13b6e90 00:15:58.733 [2024-07-25 06:32:12.270097] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13b6e90 00:15:58.733 [2024-07-25 06:32:12.270196] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.992 "name": "raid_bdev1", 00:15:58.992 "uuid": "d9a03010-0607-4968-a95e-daf5501303aa", 00:15:58.992 "strip_size_kb": 0, 00:15:58.992 "state": "online", 00:15:58.992 "raid_level": "raid1", 00:15:58.992 "superblock": true, 00:15:58.992 "num_base_bdevs": 2, 00:15:58.992 "num_base_bdevs_discovered": 2, 00:15:58.992 "num_base_bdevs_operational": 2, 00:15:58.992 "base_bdevs_list": [ 00:15:58.992 { 00:15:58.992 "name": "BaseBdev1", 00:15:58.992 "uuid": "bf7827df-bf04-5cd1-bec5-126cd08479cf", 00:15:58.992 "is_configured": true, 00:15:58.992 "data_offset": 2048, 00:15:58.992 "data_size": 63488 00:15:58.992 }, 00:15:58.992 { 00:15:58.992 "name": "BaseBdev2", 00:15:58.992 "uuid": "889b4e2f-2790-5362-b248-550a45b73b21", 00:15:58.992 "is_configured": true, 00:15:58.992 "data_offset": 2048, 00:15:58.992 "data_size": 63488 00:15:58.992 } 00:15:58.992 ] 00:15:58.992 }' 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.992 06:32:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.559 06:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:59.559 06:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:59.817 [2024-07-25 06:32:13.195170] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x120b9e0 00:16:00.749 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:01.008 [2024-07-25 06:32:14.309656] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:16:01.008 [2024-07-25 06:32:14.309705] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:01.008 [2024-07-25 06:32:14.309880] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x120b9e0 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 
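The checks traced just above are where the write case diverges from the read case: the "Failing base bdev in slot 0 ('BaseBdev1')" notice shows raid1 dropping the failing base bdev after the injected write error while the array stays online, so the script expects one operational base bdev rather than two (set immediately below). As a rough illustration only, not the test's own code, the same condition can be confirmed by hand with the RPC calls already used in this log; socket path, bdev names and methods are the ones recorded in the trace.

# Arm a write failure on the error bdev, then inspect raid_bdev1 after bdevperf
# has issued I/O; it should remain online with a single discovered base bdev.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state, .num_base_bdevs_discovered'
# expected: online, then 1 (the read-error case keeps both base bdevs and would still show 2)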
00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=1 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.008 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:01.266 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.266 "name": "raid_bdev1", 00:16:01.266 "uuid": "d9a03010-0607-4968-a95e-daf5501303aa", 00:16:01.266 "strip_size_kb": 0, 00:16:01.266 "state": "online", 00:16:01.266 "raid_level": "raid1", 00:16:01.266 "superblock": true, 00:16:01.266 "num_base_bdevs": 2, 00:16:01.266 "num_base_bdevs_discovered": 1, 00:16:01.266 "num_base_bdevs_operational": 1, 00:16:01.266 "base_bdevs_list": [ 00:16:01.266 { 00:16:01.266 "name": null, 00:16:01.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.266 "is_configured": false, 00:16:01.266 "data_offset": 2048, 00:16:01.267 "data_size": 63488 00:16:01.267 }, 00:16:01.267 { 00:16:01.267 "name": "BaseBdev2", 00:16:01.267 "uuid": "889b4e2f-2790-5362-b248-550a45b73b21", 00:16:01.267 "is_configured": true, 00:16:01.267 "data_offset": 2048, 00:16:01.267 "data_size": 63488 00:16:01.267 } 00:16:01.267 ] 00:16:01.267 }' 00:16:01.267 06:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.267 06:32:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:01.833 [2024-07-25 06:32:15.356423] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:01.833 [2024-07-25 06:32:15.356456] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:01.833 [2024-07-25 06:32:15.359337] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:01.833 [2024-07-25 06:32:15.359364] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:01.833 [2024-07-25 06:32:15.359410] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:16:01.833 [2024-07-25 06:32:15.359421] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b6e90 name raid_bdev1, state offline 00:16:01.833 0 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1118056 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1118056 ']' 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1118056 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:01.833 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1118056 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1118056' 00:16:02.092 killing process with pid 1118056 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1118056 00:16:02.092 [2024-07-25 06:32:15.434204] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1118056 00:16:02.092 [2024-07-25 06:32:15.443581] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.L1WCt5chkU 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:16:02.092 00:16:02.092 real 0m5.933s 00:16:02.092 user 0m9.211s 00:16:02.092 sys 0m1.061s 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:02.092 06:32:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.092 ************************************ 00:16:02.092 END TEST raid_write_error_test 00:16:02.092 ************************************ 00:16:02.352 06:32:15 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:16:02.352 06:32:15 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:16:02.352 06:32:15 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:16:02.352 06:32:15 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:02.352 06:32:15 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:02.352 06:32:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:02.352 ************************************ 00:16:02.352 START TEST 
raid_state_function_test 00:16:02.352 ************************************ 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 false 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1119210 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1119210' 00:16:02.352 Process raid pid: 1119210 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:02.352 06:32:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1119210 /var/tmp/spdk-raid.sock 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1119210 ']' 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:02.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:02.352 06:32:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.352 [2024-07-25 06:32:15.785773] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:16:02.352 [2024-07-25 06:32:15.785827] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.0 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.1 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.2 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.3 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.4 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.5 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.6 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:01.7 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.0 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.1 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.2 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.3 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.4 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.5 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.6 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached 
maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3d:02.7 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.0 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.1 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.2 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.3 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.4 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.5 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.6 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:01.7 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.0 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.1 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.2 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.3 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.4 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.5 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.6 cannot be used 00:16:02.352 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:02.352 EAL: Requested device 0000:3f:02.7 cannot be used 00:16:02.611 [2024-07-25 06:32:15.912966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:02.611 [2024-07-25 06:32:15.958114] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:02.611 [2024-07-25 06:32:16.020547] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:02.611 [2024-07-25 06:32:16.020576] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:03.177 06:32:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:03.177 06:32:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:16:03.177 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:03.436 [2024-07-25 06:32:16.897904] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:03.436 [2024-07-25 06:32:16.897941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:03.436 [2024-07-25 
06:32:16.897951] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:03.436 [2024-07-25 06:32:16.897961] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:03.436 [2024-07-25 06:32:16.897969] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:03.436 [2024-07-25 06:32:16.897979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.436 06:32:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.694 06:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.694 "name": "Existed_Raid", 00:16:03.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.694 "strip_size_kb": 64, 00:16:03.694 "state": "configuring", 00:16:03.694 "raid_level": "raid0", 00:16:03.694 "superblock": false, 00:16:03.694 "num_base_bdevs": 3, 00:16:03.694 "num_base_bdevs_discovered": 0, 00:16:03.694 "num_base_bdevs_operational": 3, 00:16:03.694 "base_bdevs_list": [ 00:16:03.694 { 00:16:03.694 "name": "BaseBdev1", 00:16:03.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.694 "is_configured": false, 00:16:03.694 "data_offset": 0, 00:16:03.694 "data_size": 0 00:16:03.694 }, 00:16:03.694 { 00:16:03.694 "name": "BaseBdev2", 00:16:03.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.694 "is_configured": false, 00:16:03.694 "data_offset": 0, 00:16:03.694 "data_size": 0 00:16:03.694 }, 00:16:03.694 { 00:16:03.694 "name": "BaseBdev3", 00:16:03.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.694 "is_configured": false, 00:16:03.694 "data_offset": 0, 00:16:03.694 "data_size": 0 00:16:03.694 } 00:16:03.694 ] 00:16:03.694 }' 00:16:03.694 06:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.694 06:32:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.261 06:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:04.520 [2024-07-25 06:32:17.920462] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:04.520 [2024-07-25 06:32:17.920486] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1679470 name Existed_Raid, state configuring 00:16:04.520 06:32:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:04.778 [2024-07-25 06:32:18.149085] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:04.778 [2024-07-25 06:32:18.149113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:04.778 [2024-07-25 06:32:18.149122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:04.778 [2024-07-25 06:32:18.149133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:04.778 [2024-07-25 06:32:18.149146] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:04.778 [2024-07-25 06:32:18.149157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:04.778 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:05.036 [2024-07-25 06:32:18.387077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:05.036 BaseBdev1 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:05.037 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.295 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:05.295 [ 00:16:05.295 { 00:16:05.295 "name": "BaseBdev1", 00:16:05.295 "aliases": [ 00:16:05.295 "4fe10781-362a-4845-b7c0-0547688c8fc6" 00:16:05.295 ], 00:16:05.295 "product_name": "Malloc disk", 00:16:05.295 "block_size": 512, 00:16:05.295 "num_blocks": 65536, 00:16:05.295 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:05.295 "assigned_rate_limits": { 00:16:05.295 "rw_ios_per_sec": 0, 00:16:05.295 "rw_mbytes_per_sec": 0, 00:16:05.295 "r_mbytes_per_sec": 0, 00:16:05.295 "w_mbytes_per_sec": 0 00:16:05.295 }, 00:16:05.295 "claimed": true, 00:16:05.295 "claim_type": "exclusive_write", 00:16:05.295 "zoned": false, 00:16:05.295 "supported_io_types": { 00:16:05.295 "read": true, 00:16:05.295 "write": true, 
00:16:05.295 "unmap": true, 00:16:05.295 "flush": true, 00:16:05.295 "reset": true, 00:16:05.295 "nvme_admin": false, 00:16:05.295 "nvme_io": false, 00:16:05.295 "nvme_io_md": false, 00:16:05.295 "write_zeroes": true, 00:16:05.295 "zcopy": true, 00:16:05.295 "get_zone_info": false, 00:16:05.295 "zone_management": false, 00:16:05.295 "zone_append": false, 00:16:05.295 "compare": false, 00:16:05.295 "compare_and_write": false, 00:16:05.295 "abort": true, 00:16:05.295 "seek_hole": false, 00:16:05.295 "seek_data": false, 00:16:05.295 "copy": true, 00:16:05.295 "nvme_iov_md": false 00:16:05.295 }, 00:16:05.295 "memory_domains": [ 00:16:05.295 { 00:16:05.295 "dma_device_id": "system", 00:16:05.295 "dma_device_type": 1 00:16:05.295 }, 00:16:05.295 { 00:16:05.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.295 "dma_device_type": 2 00:16:05.295 } 00:16:05.295 ], 00:16:05.295 "driver_specific": {} 00:16:05.295 } 00:16:05.295 ] 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.553 06:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.554 06:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.554 "name": "Existed_Raid", 00:16:05.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.554 "strip_size_kb": 64, 00:16:05.554 "state": "configuring", 00:16:05.554 "raid_level": "raid0", 00:16:05.554 "superblock": false, 00:16:05.554 "num_base_bdevs": 3, 00:16:05.554 "num_base_bdevs_discovered": 1, 00:16:05.554 "num_base_bdevs_operational": 3, 00:16:05.554 "base_bdevs_list": [ 00:16:05.554 { 00:16:05.554 "name": "BaseBdev1", 00:16:05.554 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:05.554 "is_configured": true, 00:16:05.554 "data_offset": 0, 00:16:05.554 "data_size": 65536 00:16:05.554 }, 00:16:05.554 { 00:16:05.554 "name": "BaseBdev2", 00:16:05.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.554 "is_configured": false, 00:16:05.554 "data_offset": 0, 00:16:05.554 "data_size": 0 00:16:05.554 }, 00:16:05.554 { 00:16:05.554 "name": "BaseBdev3", 00:16:05.554 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:05.554 "is_configured": false, 00:16:05.554 "data_offset": 0, 00:16:05.554 "data_size": 0 00:16:05.554 } 00:16:05.554 ] 00:16:05.554 }' 00:16:05.554 06:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.554 06:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.488 06:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:06.488 [2024-07-25 06:32:19.891027] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:06.488 [2024-07-25 06:32:19.891056] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1678ce0 name Existed_Raid, state configuring 00:16:06.488 06:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:06.746 [2024-07-25 06:32:20.119662] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:06.746 [2024-07-25 06:32:20.121036] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:06.746 [2024-07-25 06:32:20.121068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:06.746 [2024-07-25 06:32:20.121077] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:06.746 [2024-07-25 06:32:20.121088] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.746 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.747 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.005 06:32:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.005 "name": "Existed_Raid", 00:16:07.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.005 "strip_size_kb": 64, 00:16:07.005 "state": "configuring", 00:16:07.005 "raid_level": "raid0", 00:16:07.005 "superblock": false, 00:16:07.005 "num_base_bdevs": 3, 00:16:07.005 "num_base_bdevs_discovered": 1, 00:16:07.005 "num_base_bdevs_operational": 3, 00:16:07.005 "base_bdevs_list": [ 00:16:07.005 { 00:16:07.005 "name": "BaseBdev1", 00:16:07.005 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:07.005 "is_configured": true, 00:16:07.005 "data_offset": 0, 00:16:07.005 "data_size": 65536 00:16:07.005 }, 00:16:07.005 { 00:16:07.005 "name": "BaseBdev2", 00:16:07.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.005 "is_configured": false, 00:16:07.005 "data_offset": 0, 00:16:07.005 "data_size": 0 00:16:07.005 }, 00:16:07.005 { 00:16:07.005 "name": "BaseBdev3", 00:16:07.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.005 "is_configured": false, 00:16:07.005 "data_offset": 0, 00:16:07.005 "data_size": 0 00:16:07.005 } 00:16:07.005 ] 00:16:07.005 }' 00:16:07.005 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.005 06:32:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.572 06:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:07.832 [2024-07-25 06:32:21.161610] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:07.832 BaseBdev2 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:07.832 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:08.091 [ 00:16:08.091 { 00:16:08.091 "name": "BaseBdev2", 00:16:08.091 "aliases": [ 00:16:08.091 "50c98fca-1eac-4aeb-bedc-72f8b261e57d" 00:16:08.091 ], 00:16:08.091 "product_name": "Malloc disk", 00:16:08.091 "block_size": 512, 00:16:08.091 "num_blocks": 65536, 00:16:08.091 "uuid": "50c98fca-1eac-4aeb-bedc-72f8b261e57d", 00:16:08.091 "assigned_rate_limits": { 00:16:08.091 "rw_ios_per_sec": 0, 00:16:08.091 "rw_mbytes_per_sec": 0, 00:16:08.091 "r_mbytes_per_sec": 0, 00:16:08.091 "w_mbytes_per_sec": 0 00:16:08.091 }, 00:16:08.091 "claimed": true, 00:16:08.091 "claim_type": "exclusive_write", 00:16:08.091 "zoned": false, 00:16:08.091 "supported_io_types": { 00:16:08.091 "read": true, 00:16:08.091 "write": true, 00:16:08.091 "unmap": 
true, 00:16:08.091 "flush": true, 00:16:08.091 "reset": true, 00:16:08.091 "nvme_admin": false, 00:16:08.091 "nvme_io": false, 00:16:08.091 "nvme_io_md": false, 00:16:08.091 "write_zeroes": true, 00:16:08.091 "zcopy": true, 00:16:08.091 "get_zone_info": false, 00:16:08.091 "zone_management": false, 00:16:08.091 "zone_append": false, 00:16:08.091 "compare": false, 00:16:08.091 "compare_and_write": false, 00:16:08.091 "abort": true, 00:16:08.091 "seek_hole": false, 00:16:08.091 "seek_data": false, 00:16:08.091 "copy": true, 00:16:08.091 "nvme_iov_md": false 00:16:08.091 }, 00:16:08.091 "memory_domains": [ 00:16:08.091 { 00:16:08.091 "dma_device_id": "system", 00:16:08.091 "dma_device_type": 1 00:16:08.091 }, 00:16:08.091 { 00:16:08.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.091 "dma_device_type": 2 00:16:08.091 } 00:16:08.091 ], 00:16:08.091 "driver_specific": {} 00:16:08.091 } 00:16:08.091 ] 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:08.091 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.092 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.092 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.092 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.092 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.092 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.351 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.351 "name": "Existed_Raid", 00:16:08.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.351 "strip_size_kb": 64, 00:16:08.351 "state": "configuring", 00:16:08.351 "raid_level": "raid0", 00:16:08.351 "superblock": false, 00:16:08.351 "num_base_bdevs": 3, 00:16:08.351 "num_base_bdevs_discovered": 2, 00:16:08.351 "num_base_bdevs_operational": 3, 00:16:08.351 "base_bdevs_list": [ 00:16:08.351 { 00:16:08.351 "name": "BaseBdev1", 00:16:08.351 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:08.351 "is_configured": true, 00:16:08.351 "data_offset": 0, 00:16:08.351 "data_size": 65536 00:16:08.351 }, 00:16:08.351 { 00:16:08.351 "name": "BaseBdev2", 00:16:08.351 "uuid": 
"50c98fca-1eac-4aeb-bedc-72f8b261e57d", 00:16:08.351 "is_configured": true, 00:16:08.351 "data_offset": 0, 00:16:08.351 "data_size": 65536 00:16:08.351 }, 00:16:08.351 { 00:16:08.351 "name": "BaseBdev3", 00:16:08.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.351 "is_configured": false, 00:16:08.351 "data_offset": 0, 00:16:08.351 "data_size": 0 00:16:08.351 } 00:16:08.351 ] 00:16:08.351 }' 00:16:08.351 06:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.351 06:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.919 06:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:09.178 [2024-07-25 06:32:22.600486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:09.178 [2024-07-25 06:32:22.600521] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x182c380 00:16:09.178 [2024-07-25 06:32:22.600529] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:09.178 [2024-07-25 06:32:22.600706] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1825290 00:16:09.178 [2024-07-25 06:32:22.600817] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x182c380 00:16:09.178 [2024-07-25 06:32:22.600826] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x182c380 00:16:09.178 [2024-07-25 06:32:22.600974] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:09.178 BaseBdev3 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:09.178 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:09.437 06:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:09.696 [ 00:16:09.696 { 00:16:09.696 "name": "BaseBdev3", 00:16:09.696 "aliases": [ 00:16:09.696 "c4444e59-2bbc-4474-a40f-ee67dea5ece0" 00:16:09.696 ], 00:16:09.696 "product_name": "Malloc disk", 00:16:09.696 "block_size": 512, 00:16:09.696 "num_blocks": 65536, 00:16:09.696 "uuid": "c4444e59-2bbc-4474-a40f-ee67dea5ece0", 00:16:09.696 "assigned_rate_limits": { 00:16:09.696 "rw_ios_per_sec": 0, 00:16:09.696 "rw_mbytes_per_sec": 0, 00:16:09.696 "r_mbytes_per_sec": 0, 00:16:09.696 "w_mbytes_per_sec": 0 00:16:09.696 }, 00:16:09.696 "claimed": true, 00:16:09.696 "claim_type": "exclusive_write", 00:16:09.696 "zoned": false, 00:16:09.696 "supported_io_types": { 00:16:09.696 "read": true, 00:16:09.696 "write": true, 
00:16:09.696 "unmap": true, 00:16:09.696 "flush": true, 00:16:09.696 "reset": true, 00:16:09.696 "nvme_admin": false, 00:16:09.696 "nvme_io": false, 00:16:09.696 "nvme_io_md": false, 00:16:09.696 "write_zeroes": true, 00:16:09.696 "zcopy": true, 00:16:09.696 "get_zone_info": false, 00:16:09.696 "zone_management": false, 00:16:09.696 "zone_append": false, 00:16:09.696 "compare": false, 00:16:09.696 "compare_and_write": false, 00:16:09.696 "abort": true, 00:16:09.696 "seek_hole": false, 00:16:09.696 "seek_data": false, 00:16:09.696 "copy": true, 00:16:09.696 "nvme_iov_md": false 00:16:09.696 }, 00:16:09.696 "memory_domains": [ 00:16:09.696 { 00:16:09.696 "dma_device_id": "system", 00:16:09.696 "dma_device_type": 1 00:16:09.696 }, 00:16:09.696 { 00:16:09.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.696 "dma_device_type": 2 00:16:09.696 } 00:16:09.696 ], 00:16:09.696 "driver_specific": {} 00:16:09.696 } 00:16:09.696 ] 00:16:09.696 06:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:09.696 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:09.696 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:09.696 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.697 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.956 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.956 "name": "Existed_Raid", 00:16:09.956 "uuid": "657f1995-58ce-4f73-9f2d-8bbbbf370e83", 00:16:09.956 "strip_size_kb": 64, 00:16:09.956 "state": "online", 00:16:09.956 "raid_level": "raid0", 00:16:09.956 "superblock": false, 00:16:09.956 "num_base_bdevs": 3, 00:16:09.956 "num_base_bdevs_discovered": 3, 00:16:09.956 "num_base_bdevs_operational": 3, 00:16:09.956 "base_bdevs_list": [ 00:16:09.956 { 00:16:09.956 "name": "BaseBdev1", 00:16:09.956 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:09.956 "is_configured": true, 00:16:09.956 "data_offset": 0, 00:16:09.956 "data_size": 65536 00:16:09.956 }, 00:16:09.956 { 00:16:09.956 "name": "BaseBdev2", 00:16:09.956 "uuid": 
"50c98fca-1eac-4aeb-bedc-72f8b261e57d", 00:16:09.956 "is_configured": true, 00:16:09.956 "data_offset": 0, 00:16:09.956 "data_size": 65536 00:16:09.956 }, 00:16:09.956 { 00:16:09.956 "name": "BaseBdev3", 00:16:09.956 "uuid": "c4444e59-2bbc-4474-a40f-ee67dea5ece0", 00:16:09.956 "is_configured": true, 00:16:09.956 "data_offset": 0, 00:16:09.956 "data_size": 65536 00:16:09.956 } 00:16:09.956 ] 00:16:09.956 }' 00:16:09.956 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.956 06:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.522 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:10.523 06:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:10.781 [2024-07-25 06:32:24.104932] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:10.781 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:10.781 "name": "Existed_Raid", 00:16:10.781 "aliases": [ 00:16:10.781 "657f1995-58ce-4f73-9f2d-8bbbbf370e83" 00:16:10.781 ], 00:16:10.781 "product_name": "Raid Volume", 00:16:10.781 "block_size": 512, 00:16:10.781 "num_blocks": 196608, 00:16:10.781 "uuid": "657f1995-58ce-4f73-9f2d-8bbbbf370e83", 00:16:10.781 "assigned_rate_limits": { 00:16:10.781 "rw_ios_per_sec": 0, 00:16:10.781 "rw_mbytes_per_sec": 0, 00:16:10.781 "r_mbytes_per_sec": 0, 00:16:10.781 "w_mbytes_per_sec": 0 00:16:10.781 }, 00:16:10.781 "claimed": false, 00:16:10.781 "zoned": false, 00:16:10.781 "supported_io_types": { 00:16:10.781 "read": true, 00:16:10.781 "write": true, 00:16:10.781 "unmap": true, 00:16:10.781 "flush": true, 00:16:10.781 "reset": true, 00:16:10.781 "nvme_admin": false, 00:16:10.781 "nvme_io": false, 00:16:10.781 "nvme_io_md": false, 00:16:10.781 "write_zeroes": true, 00:16:10.781 "zcopy": false, 00:16:10.781 "get_zone_info": false, 00:16:10.781 "zone_management": false, 00:16:10.781 "zone_append": false, 00:16:10.781 "compare": false, 00:16:10.781 "compare_and_write": false, 00:16:10.781 "abort": false, 00:16:10.781 "seek_hole": false, 00:16:10.781 "seek_data": false, 00:16:10.781 "copy": false, 00:16:10.781 "nvme_iov_md": false 00:16:10.781 }, 00:16:10.781 "memory_domains": [ 00:16:10.781 { 00:16:10.781 "dma_device_id": "system", 00:16:10.781 "dma_device_type": 1 00:16:10.781 }, 00:16:10.781 { 00:16:10.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.781 "dma_device_type": 2 00:16:10.781 }, 00:16:10.781 { 00:16:10.781 "dma_device_id": "system", 00:16:10.781 "dma_device_type": 1 00:16:10.781 }, 00:16:10.781 { 00:16:10.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.781 "dma_device_type": 2 00:16:10.781 }, 
00:16:10.781 { 00:16:10.781 "dma_device_id": "system", 00:16:10.781 "dma_device_type": 1 00:16:10.781 }, 00:16:10.781 { 00:16:10.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.781 "dma_device_type": 2 00:16:10.781 } 00:16:10.781 ], 00:16:10.781 "driver_specific": { 00:16:10.781 "raid": { 00:16:10.781 "uuid": "657f1995-58ce-4f73-9f2d-8bbbbf370e83", 00:16:10.781 "strip_size_kb": 64, 00:16:10.781 "state": "online", 00:16:10.781 "raid_level": "raid0", 00:16:10.781 "superblock": false, 00:16:10.781 "num_base_bdevs": 3, 00:16:10.781 "num_base_bdevs_discovered": 3, 00:16:10.781 "num_base_bdevs_operational": 3, 00:16:10.781 "base_bdevs_list": [ 00:16:10.781 { 00:16:10.781 "name": "BaseBdev1", 00:16:10.781 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:10.782 "is_configured": true, 00:16:10.782 "data_offset": 0, 00:16:10.782 "data_size": 65536 00:16:10.782 }, 00:16:10.782 { 00:16:10.782 "name": "BaseBdev2", 00:16:10.782 "uuid": "50c98fca-1eac-4aeb-bedc-72f8b261e57d", 00:16:10.782 "is_configured": true, 00:16:10.782 "data_offset": 0, 00:16:10.782 "data_size": 65536 00:16:10.782 }, 00:16:10.782 { 00:16:10.782 "name": "BaseBdev3", 00:16:10.782 "uuid": "c4444e59-2bbc-4474-a40f-ee67dea5ece0", 00:16:10.782 "is_configured": true, 00:16:10.782 "data_offset": 0, 00:16:10.782 "data_size": 65536 00:16:10.782 } 00:16:10.782 ] 00:16:10.782 } 00:16:10.782 } 00:16:10.782 }' 00:16:10.782 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:10.782 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:10.782 BaseBdev2 00:16:10.782 BaseBdev3' 00:16:10.782 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.782 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.782 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:11.040 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.040 "name": "BaseBdev1", 00:16:11.040 "aliases": [ 00:16:11.040 "4fe10781-362a-4845-b7c0-0547688c8fc6" 00:16:11.040 ], 00:16:11.040 "product_name": "Malloc disk", 00:16:11.040 "block_size": 512, 00:16:11.040 "num_blocks": 65536, 00:16:11.040 "uuid": "4fe10781-362a-4845-b7c0-0547688c8fc6", 00:16:11.040 "assigned_rate_limits": { 00:16:11.040 "rw_ios_per_sec": 0, 00:16:11.040 "rw_mbytes_per_sec": 0, 00:16:11.040 "r_mbytes_per_sec": 0, 00:16:11.040 "w_mbytes_per_sec": 0 00:16:11.040 }, 00:16:11.040 "claimed": true, 00:16:11.040 "claim_type": "exclusive_write", 00:16:11.040 "zoned": false, 00:16:11.040 "supported_io_types": { 00:16:11.040 "read": true, 00:16:11.040 "write": true, 00:16:11.040 "unmap": true, 00:16:11.040 "flush": true, 00:16:11.040 "reset": true, 00:16:11.040 "nvme_admin": false, 00:16:11.040 "nvme_io": false, 00:16:11.040 "nvme_io_md": false, 00:16:11.040 "write_zeroes": true, 00:16:11.040 "zcopy": true, 00:16:11.040 "get_zone_info": false, 00:16:11.040 "zone_management": false, 00:16:11.040 "zone_append": false, 00:16:11.040 "compare": false, 00:16:11.040 "compare_and_write": false, 00:16:11.040 "abort": true, 00:16:11.040 "seek_hole": false, 00:16:11.040 "seek_data": false, 00:16:11.040 "copy": true, 00:16:11.040 "nvme_iov_md": false 00:16:11.040 }, 00:16:11.040 
"memory_domains": [ 00:16:11.040 { 00:16:11.040 "dma_device_id": "system", 00:16:11.040 "dma_device_type": 1 00:16:11.040 }, 00:16:11.040 { 00:16:11.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.040 "dma_device_type": 2 00:16:11.040 } 00:16:11.040 ], 00:16:11.040 "driver_specific": {} 00:16:11.040 }' 00:16:11.040 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.040 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.040 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.041 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.041 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.041 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.041 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:11.334 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.615 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.615 "name": "BaseBdev2", 00:16:11.615 "aliases": [ 00:16:11.615 "50c98fca-1eac-4aeb-bedc-72f8b261e57d" 00:16:11.615 ], 00:16:11.615 "product_name": "Malloc disk", 00:16:11.615 "block_size": 512, 00:16:11.615 "num_blocks": 65536, 00:16:11.615 "uuid": "50c98fca-1eac-4aeb-bedc-72f8b261e57d", 00:16:11.615 "assigned_rate_limits": { 00:16:11.615 "rw_ios_per_sec": 0, 00:16:11.615 "rw_mbytes_per_sec": 0, 00:16:11.615 "r_mbytes_per_sec": 0, 00:16:11.615 "w_mbytes_per_sec": 0 00:16:11.615 }, 00:16:11.615 "claimed": true, 00:16:11.615 "claim_type": "exclusive_write", 00:16:11.615 "zoned": false, 00:16:11.615 "supported_io_types": { 00:16:11.615 "read": true, 00:16:11.615 "write": true, 00:16:11.615 "unmap": true, 00:16:11.615 "flush": true, 00:16:11.615 "reset": true, 00:16:11.615 "nvme_admin": false, 00:16:11.615 "nvme_io": false, 00:16:11.615 "nvme_io_md": false, 00:16:11.615 "write_zeroes": true, 00:16:11.615 "zcopy": true, 00:16:11.615 "get_zone_info": false, 00:16:11.615 "zone_management": false, 00:16:11.615 "zone_append": false, 00:16:11.615 "compare": false, 00:16:11.615 "compare_and_write": false, 00:16:11.615 "abort": true, 00:16:11.615 "seek_hole": false, 00:16:11.615 "seek_data": false, 00:16:11.615 "copy": true, 00:16:11.615 "nvme_iov_md": false 00:16:11.615 }, 00:16:11.615 "memory_domains": [ 00:16:11.615 { 00:16:11.615 "dma_device_id": "system", 00:16:11.615 "dma_device_type": 1 00:16:11.615 }, 00:16:11.615 { 00:16:11.615 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:11.615 "dma_device_type": 2 00:16:11.615 } 00:16:11.615 ], 00:16:11.615 "driver_specific": {} 00:16:11.615 }' 00:16:11.615 06:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.615 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.615 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.615 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.615 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.615 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.615 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:11.873 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.131 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.131 "name": "BaseBdev3", 00:16:12.131 "aliases": [ 00:16:12.131 "c4444e59-2bbc-4474-a40f-ee67dea5ece0" 00:16:12.131 ], 00:16:12.131 "product_name": "Malloc disk", 00:16:12.131 "block_size": 512, 00:16:12.131 "num_blocks": 65536, 00:16:12.131 "uuid": "c4444e59-2bbc-4474-a40f-ee67dea5ece0", 00:16:12.131 "assigned_rate_limits": { 00:16:12.131 "rw_ios_per_sec": 0, 00:16:12.131 "rw_mbytes_per_sec": 0, 00:16:12.131 "r_mbytes_per_sec": 0, 00:16:12.131 "w_mbytes_per_sec": 0 00:16:12.131 }, 00:16:12.131 "claimed": true, 00:16:12.131 "claim_type": "exclusive_write", 00:16:12.131 "zoned": false, 00:16:12.131 "supported_io_types": { 00:16:12.131 "read": true, 00:16:12.131 "write": true, 00:16:12.131 "unmap": true, 00:16:12.131 "flush": true, 00:16:12.131 "reset": true, 00:16:12.131 "nvme_admin": false, 00:16:12.131 "nvme_io": false, 00:16:12.131 "nvme_io_md": false, 00:16:12.131 "write_zeroes": true, 00:16:12.131 "zcopy": true, 00:16:12.131 "get_zone_info": false, 00:16:12.131 "zone_management": false, 00:16:12.131 "zone_append": false, 00:16:12.131 "compare": false, 00:16:12.131 "compare_and_write": false, 00:16:12.131 "abort": true, 00:16:12.131 "seek_hole": false, 00:16:12.131 "seek_data": false, 00:16:12.131 "copy": true, 00:16:12.131 "nvme_iov_md": false 00:16:12.131 }, 00:16:12.131 "memory_domains": [ 00:16:12.131 { 00:16:12.131 "dma_device_id": "system", 00:16:12.131 "dma_device_type": 1 00:16:12.131 }, 00:16:12.131 { 00:16:12.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.131 "dma_device_type": 2 00:16:12.131 } 00:16:12.131 ], 00:16:12.131 "driver_specific": {} 00:16:12.131 }' 00:16:12.131 06:32:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.131 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.131 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.131 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.389 06:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:12.646 [2024-07-25 06:32:26.126203] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:12.646 [2024-07-25 06:32:26.126226] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.646 [2024-07-25 06:32:26.126262] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:12.646 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.647 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.904 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.904 "name": "Existed_Raid", 00:16:12.904 "uuid": "657f1995-58ce-4f73-9f2d-8bbbbf370e83", 00:16:12.904 "strip_size_kb": 64, 00:16:12.904 "state": "offline", 00:16:12.904 "raid_level": "raid0", 00:16:12.904 "superblock": false, 00:16:12.904 "num_base_bdevs": 3, 00:16:12.904 "num_base_bdevs_discovered": 2, 00:16:12.904 "num_base_bdevs_operational": 2, 00:16:12.904 "base_bdevs_list": [ 00:16:12.904 { 00:16:12.904 "name": null, 00:16:12.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.904 "is_configured": false, 00:16:12.904 "data_offset": 0, 00:16:12.904 "data_size": 65536 00:16:12.904 }, 00:16:12.904 { 00:16:12.904 "name": "BaseBdev2", 00:16:12.904 "uuid": "50c98fca-1eac-4aeb-bedc-72f8b261e57d", 00:16:12.904 "is_configured": true, 00:16:12.904 "data_offset": 0, 00:16:12.904 "data_size": 65536 00:16:12.904 }, 00:16:12.904 { 00:16:12.904 "name": "BaseBdev3", 00:16:12.904 "uuid": "c4444e59-2bbc-4474-a40f-ee67dea5ece0", 00:16:12.904 "is_configured": true, 00:16:12.904 "data_offset": 0, 00:16:12.904 "data_size": 65536 00:16:12.904 } 00:16:12.904 ] 00:16:12.904 }' 00:16:12.904 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.904 06:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.469 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:13.469 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:13.469 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.469 06:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:13.726 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:13.726 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:13.726 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:13.984 [2024-07-25 06:32:27.394475] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:13.984 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:13.984 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:13.984 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.984 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:14.241 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:14.241 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:14.241 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:14.499 [2024-07-25 06:32:27.865851] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:14.499 [2024-07-25 06:32:27.865889] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x182c380 name Existed_Raid, state offline 00:16:14.499 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:14.499 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:14.499 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.499 06:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:14.755 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:14.755 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:14.755 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:14.755 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:14.756 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:14.756 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:15.013 BaseBdev2 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.013 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:15.271 [ 00:16:15.271 { 00:16:15.271 "name": "BaseBdev2", 00:16:15.271 "aliases": [ 00:16:15.271 "bc268baa-cf83-4757-b601-a4bfbcf8b0d8" 00:16:15.271 ], 00:16:15.271 "product_name": "Malloc disk", 00:16:15.271 "block_size": 512, 00:16:15.271 "num_blocks": 65536, 00:16:15.271 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:15.271 "assigned_rate_limits": { 00:16:15.271 "rw_ios_per_sec": 0, 00:16:15.271 "rw_mbytes_per_sec": 0, 00:16:15.271 "r_mbytes_per_sec": 0, 00:16:15.271 "w_mbytes_per_sec": 0 00:16:15.271 }, 00:16:15.271 "claimed": false, 00:16:15.271 "zoned": false, 00:16:15.271 "supported_io_types": { 00:16:15.271 "read": true, 00:16:15.271 "write": true, 00:16:15.271 "unmap": true, 00:16:15.271 "flush": true, 00:16:15.271 "reset": true, 00:16:15.271 "nvme_admin": false, 
00:16:15.271 "nvme_io": false, 00:16:15.271 "nvme_io_md": false, 00:16:15.271 "write_zeroes": true, 00:16:15.271 "zcopy": true, 00:16:15.271 "get_zone_info": false, 00:16:15.271 "zone_management": false, 00:16:15.271 "zone_append": false, 00:16:15.271 "compare": false, 00:16:15.271 "compare_and_write": false, 00:16:15.271 "abort": true, 00:16:15.271 "seek_hole": false, 00:16:15.271 "seek_data": false, 00:16:15.271 "copy": true, 00:16:15.271 "nvme_iov_md": false 00:16:15.271 }, 00:16:15.271 "memory_domains": [ 00:16:15.271 { 00:16:15.271 "dma_device_id": "system", 00:16:15.271 "dma_device_type": 1 00:16:15.271 }, 00:16:15.271 { 00:16:15.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.271 "dma_device_type": 2 00:16:15.271 } 00:16:15.271 ], 00:16:15.271 "driver_specific": {} 00:16:15.271 } 00:16:15.271 ] 00:16:15.271 06:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:15.271 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:15.271 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:15.271 06:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:15.529 BaseBdev3 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:15.529 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.787 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:16.045 [ 00:16:16.045 { 00:16:16.045 "name": "BaseBdev3", 00:16:16.045 "aliases": [ 00:16:16.045 "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7" 00:16:16.045 ], 00:16:16.045 "product_name": "Malloc disk", 00:16:16.045 "block_size": 512, 00:16:16.045 "num_blocks": 65536, 00:16:16.045 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:16.045 "assigned_rate_limits": { 00:16:16.045 "rw_ios_per_sec": 0, 00:16:16.045 "rw_mbytes_per_sec": 0, 00:16:16.045 "r_mbytes_per_sec": 0, 00:16:16.045 "w_mbytes_per_sec": 0 00:16:16.045 }, 00:16:16.045 "claimed": false, 00:16:16.045 "zoned": false, 00:16:16.045 "supported_io_types": { 00:16:16.045 "read": true, 00:16:16.045 "write": true, 00:16:16.045 "unmap": true, 00:16:16.045 "flush": true, 00:16:16.045 "reset": true, 00:16:16.045 "nvme_admin": false, 00:16:16.045 "nvme_io": false, 00:16:16.045 "nvme_io_md": false, 00:16:16.045 "write_zeroes": true, 00:16:16.045 "zcopy": true, 00:16:16.045 "get_zone_info": false, 00:16:16.045 "zone_management": false, 00:16:16.045 "zone_append": false, 00:16:16.045 "compare": false, 00:16:16.045 
"compare_and_write": false, 00:16:16.045 "abort": true, 00:16:16.045 "seek_hole": false, 00:16:16.045 "seek_data": false, 00:16:16.045 "copy": true, 00:16:16.045 "nvme_iov_md": false 00:16:16.045 }, 00:16:16.045 "memory_domains": [ 00:16:16.045 { 00:16:16.045 "dma_device_id": "system", 00:16:16.045 "dma_device_type": 1 00:16:16.045 }, 00:16:16.045 { 00:16:16.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.045 "dma_device_type": 2 00:16:16.045 } 00:16:16.045 ], 00:16:16.045 "driver_specific": {} 00:16:16.045 } 00:16:16.045 ] 00:16:16.045 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:16.045 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:16.045 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:16.045 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:16.303 [2024-07-25 06:32:29.704951] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:16.303 [2024-07-25 06:32:29.704987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:16.303 [2024-07-25 06:32:29.705004] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:16.303 [2024-07-25 06:32:29.706212] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.303 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.561 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.561 "name": "Existed_Raid", 00:16:16.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.561 "strip_size_kb": 64, 00:16:16.561 "state": "configuring", 00:16:16.561 "raid_level": "raid0", 00:16:16.561 "superblock": false, 00:16:16.561 "num_base_bdevs": 3, 00:16:16.561 "num_base_bdevs_discovered": 2, 00:16:16.561 
"num_base_bdevs_operational": 3, 00:16:16.561 "base_bdevs_list": [ 00:16:16.561 { 00:16:16.561 "name": "BaseBdev1", 00:16:16.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.561 "is_configured": false, 00:16:16.561 "data_offset": 0, 00:16:16.561 "data_size": 0 00:16:16.561 }, 00:16:16.561 { 00:16:16.561 "name": "BaseBdev2", 00:16:16.561 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:16.561 "is_configured": true, 00:16:16.561 "data_offset": 0, 00:16:16.561 "data_size": 65536 00:16:16.561 }, 00:16:16.561 { 00:16:16.561 "name": "BaseBdev3", 00:16:16.561 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:16.561 "is_configured": true, 00:16:16.561 "data_offset": 0, 00:16:16.561 "data_size": 65536 00:16:16.561 } 00:16:16.561 ] 00:16:16.561 }' 00:16:16.561 06:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.561 06:32:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.126 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:17.385 [2024-07-25 06:32:30.687517] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:17.385 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:17.385 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.385 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.385 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:17.385 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.386 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.645 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.645 "name": "Existed_Raid", 00:16:17.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.645 "strip_size_kb": 64, 00:16:17.645 "state": "configuring", 00:16:17.645 "raid_level": "raid0", 00:16:17.645 "superblock": false, 00:16:17.645 "num_base_bdevs": 3, 00:16:17.645 "num_base_bdevs_discovered": 1, 00:16:17.645 "num_base_bdevs_operational": 3, 00:16:17.645 "base_bdevs_list": [ 00:16:17.645 { 00:16:17.645 "name": "BaseBdev1", 00:16:17.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.645 "is_configured": false, 00:16:17.645 "data_offset": 0, 00:16:17.645 "data_size": 0 00:16:17.645 }, 00:16:17.645 { 00:16:17.645 "name": null, 
00:16:17.645 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:17.645 "is_configured": false, 00:16:17.645 "data_offset": 0, 00:16:17.645 "data_size": 65536 00:16:17.645 }, 00:16:17.645 { 00:16:17.645 "name": "BaseBdev3", 00:16:17.645 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:17.645 "is_configured": true, 00:16:17.645 "data_offset": 0, 00:16:17.645 "data_size": 65536 00:16:17.645 } 00:16:17.645 ] 00:16:17.645 }' 00:16:17.645 06:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.645 06:32:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.213 06:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.213 06:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:18.214 06:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:18.214 06:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:18.473 [2024-07-25 06:32:31.937986] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:18.473 BaseBdev1 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:18.473 06:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:18.731 06:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:18.991 [ 00:16:18.991 { 00:16:18.991 "name": "BaseBdev1", 00:16:18.991 "aliases": [ 00:16:18.991 "f4ba90a8-522c-4f9b-9711-afea666e17c9" 00:16:18.991 ], 00:16:18.991 "product_name": "Malloc disk", 00:16:18.991 "block_size": 512, 00:16:18.991 "num_blocks": 65536, 00:16:18.991 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:18.991 "assigned_rate_limits": { 00:16:18.991 "rw_ios_per_sec": 0, 00:16:18.991 "rw_mbytes_per_sec": 0, 00:16:18.991 "r_mbytes_per_sec": 0, 00:16:18.991 "w_mbytes_per_sec": 0 00:16:18.991 }, 00:16:18.991 "claimed": true, 00:16:18.991 "claim_type": "exclusive_write", 00:16:18.991 "zoned": false, 00:16:18.991 "supported_io_types": { 00:16:18.991 "read": true, 00:16:18.991 "write": true, 00:16:18.991 "unmap": true, 00:16:18.991 "flush": true, 00:16:18.991 "reset": true, 00:16:18.991 "nvme_admin": false, 00:16:18.991 "nvme_io": false, 00:16:18.991 "nvme_io_md": false, 00:16:18.991 "write_zeroes": true, 00:16:18.991 "zcopy": true, 00:16:18.991 "get_zone_info": false, 
00:16:18.991 "zone_management": false, 00:16:18.991 "zone_append": false, 00:16:18.991 "compare": false, 00:16:18.991 "compare_and_write": false, 00:16:18.991 "abort": true, 00:16:18.991 "seek_hole": false, 00:16:18.991 "seek_data": false, 00:16:18.991 "copy": true, 00:16:18.991 "nvme_iov_md": false 00:16:18.991 }, 00:16:18.991 "memory_domains": [ 00:16:18.991 { 00:16:18.991 "dma_device_id": "system", 00:16:18.991 "dma_device_type": 1 00:16:18.991 }, 00:16:18.991 { 00:16:18.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.991 "dma_device_type": 2 00:16:18.991 } 00:16:18.991 ], 00:16:18.991 "driver_specific": {} 00:16:18.991 } 00:16:18.991 ] 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.991 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.250 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.250 "name": "Existed_Raid", 00:16:19.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.250 "strip_size_kb": 64, 00:16:19.250 "state": "configuring", 00:16:19.250 "raid_level": "raid0", 00:16:19.250 "superblock": false, 00:16:19.250 "num_base_bdevs": 3, 00:16:19.250 "num_base_bdevs_discovered": 2, 00:16:19.250 "num_base_bdevs_operational": 3, 00:16:19.250 "base_bdevs_list": [ 00:16:19.250 { 00:16:19.250 "name": "BaseBdev1", 00:16:19.250 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:19.250 "is_configured": true, 00:16:19.250 "data_offset": 0, 00:16:19.250 "data_size": 65536 00:16:19.250 }, 00:16:19.250 { 00:16:19.250 "name": null, 00:16:19.250 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:19.250 "is_configured": false, 00:16:19.250 "data_offset": 0, 00:16:19.250 "data_size": 65536 00:16:19.250 }, 00:16:19.250 { 00:16:19.250 "name": "BaseBdev3", 00:16:19.250 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:19.250 "is_configured": true, 00:16:19.250 "data_offset": 0, 00:16:19.251 "data_size": 65536 00:16:19.251 } 00:16:19.251 ] 00:16:19.251 }' 00:16:19.251 06:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:16:19.251 06:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.819 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:19.819 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.077 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:20.077 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:20.077 [2024-07-25 06:32:33.630480] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.335 "name": "Existed_Raid", 00:16:20.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.335 "strip_size_kb": 64, 00:16:20.335 "state": "configuring", 00:16:20.335 "raid_level": "raid0", 00:16:20.335 "superblock": false, 00:16:20.335 "num_base_bdevs": 3, 00:16:20.335 "num_base_bdevs_discovered": 1, 00:16:20.335 "num_base_bdevs_operational": 3, 00:16:20.335 "base_bdevs_list": [ 00:16:20.335 { 00:16:20.335 "name": "BaseBdev1", 00:16:20.335 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:20.335 "is_configured": true, 00:16:20.335 "data_offset": 0, 00:16:20.335 "data_size": 65536 00:16:20.335 }, 00:16:20.335 { 00:16:20.335 "name": null, 00:16:20.335 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:20.335 "is_configured": false, 00:16:20.335 "data_offset": 0, 00:16:20.335 "data_size": 65536 00:16:20.335 }, 00:16:20.335 { 00:16:20.335 "name": null, 00:16:20.335 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:20.335 "is_configured": false, 00:16:20.335 "data_offset": 0, 00:16:20.335 "data_size": 65536 00:16:20.335 } 00:16:20.335 ] 00:16:20.335 }' 
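
By this point the test has detached BaseBdev2 and BaseBdev3 again with bdev_raid_remove_base_bdev, and after every step it re-reads bdev_raid_get_bdevs through the jq filter shown above to confirm Existed_Raid still reports state "configuring"; the following entries re-attach the bdevs with bdev_raid_add_base_bdev. A minimal sketch of that remove/re-add cycle; check_raid_state is a simplified stand-in for the script's verify_raid_bdev_state, which also compares raid_level, strip_size and the base-bdev counts:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Simplified state check: only the "state" field of Existed_Raid is compared.
check_raid_state() {
    local expected=$1
    $rpc bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state' \
        | grep -qx "$expected"
}

# Detach a base bdev; the array stays in "configuring" with one fewer configured member.
$rpc bdev_raid_remove_base_bdev BaseBdev3
check_raid_state configuring

# Later entries put it back the same way.
$rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3
check_raid_state configuring
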
00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.335 06:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.903 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.903 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:21.162 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:21.162 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:21.421 [2024-07-25 06:32:34.881782] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.421 06:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.680 06:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.680 "name": "Existed_Raid", 00:16:21.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.680 "strip_size_kb": 64, 00:16:21.680 "state": "configuring", 00:16:21.680 "raid_level": "raid0", 00:16:21.680 "superblock": false, 00:16:21.680 "num_base_bdevs": 3, 00:16:21.680 "num_base_bdevs_discovered": 2, 00:16:21.680 "num_base_bdevs_operational": 3, 00:16:21.680 "base_bdevs_list": [ 00:16:21.680 { 00:16:21.680 "name": "BaseBdev1", 00:16:21.680 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:21.680 "is_configured": true, 00:16:21.680 "data_offset": 0, 00:16:21.680 "data_size": 65536 00:16:21.680 }, 00:16:21.680 { 00:16:21.680 "name": null, 00:16:21.681 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:21.681 "is_configured": false, 00:16:21.681 "data_offset": 0, 00:16:21.681 "data_size": 65536 00:16:21.681 }, 00:16:21.681 { 00:16:21.681 "name": "BaseBdev3", 00:16:21.681 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 
00:16:21.681 "is_configured": true, 00:16:21.681 "data_offset": 0, 00:16:21.681 "data_size": 65536 00:16:21.681 } 00:16:21.681 ] 00:16:21.681 }' 00:16:21.681 06:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.681 06:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.249 06:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.249 06:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:22.509 06:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:22.509 06:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:22.768 [2024-07-25 06:32:36.157160] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.768 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.027 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.027 "name": "Existed_Raid", 00:16:23.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.027 "strip_size_kb": 64, 00:16:23.027 "state": "configuring", 00:16:23.027 "raid_level": "raid0", 00:16:23.027 "superblock": false, 00:16:23.027 "num_base_bdevs": 3, 00:16:23.027 "num_base_bdevs_discovered": 1, 00:16:23.028 "num_base_bdevs_operational": 3, 00:16:23.028 "base_bdevs_list": [ 00:16:23.028 { 00:16:23.028 "name": null, 00:16:23.028 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:23.028 "is_configured": false, 00:16:23.028 "data_offset": 0, 00:16:23.028 "data_size": 65536 00:16:23.028 }, 00:16:23.028 { 00:16:23.028 "name": null, 00:16:23.028 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:23.028 "is_configured": false, 00:16:23.028 "data_offset": 0, 00:16:23.028 "data_size": 65536 00:16:23.028 }, 00:16:23.028 { 
00:16:23.028 "name": "BaseBdev3", 00:16:23.028 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:23.028 "is_configured": true, 00:16:23.028 "data_offset": 0, 00:16:23.028 "data_size": 65536 00:16:23.028 } 00:16:23.028 ] 00:16:23.028 }' 00:16:23.028 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.028 06:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.596 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.596 06:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:23.856 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:23.856 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:23.856 [2024-07-25 06:32:37.398752] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:24.123 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.123 "name": "Existed_Raid", 00:16:24.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.123 "strip_size_kb": 64, 00:16:24.123 "state": "configuring", 00:16:24.123 "raid_level": "raid0", 00:16:24.123 "superblock": false, 00:16:24.123 "num_base_bdevs": 3, 00:16:24.123 "num_base_bdevs_discovered": 2, 00:16:24.123 "num_base_bdevs_operational": 3, 00:16:24.123 "base_bdevs_list": [ 00:16:24.123 { 00:16:24.123 "name": null, 00:16:24.123 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:24.123 "is_configured": false, 00:16:24.123 "data_offset": 0, 00:16:24.123 "data_size": 65536 00:16:24.123 }, 00:16:24.124 { 00:16:24.124 "name": "BaseBdev2", 00:16:24.124 "uuid": 
"bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:24.124 "is_configured": true, 00:16:24.124 "data_offset": 0, 00:16:24.124 "data_size": 65536 00:16:24.124 }, 00:16:24.124 { 00:16:24.124 "name": "BaseBdev3", 00:16:24.124 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:24.124 "is_configured": true, 00:16:24.124 "data_offset": 0, 00:16:24.124 "data_size": 65536 00:16:24.124 } 00:16:24.124 ] 00:16:24.124 }' 00:16:24.124 06:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.124 06:32:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.747 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.747 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:25.006 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:25.006 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.006 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:25.266 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f4ba90a8-522c-4f9b-9711-afea666e17c9 00:16:25.525 [2024-07-25 06:32:38.857780] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:25.525 [2024-07-25 06:32:38.857812] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1824230 00:16:25.525 [2024-07-25 06:32:38.857820] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:25.525 [2024-07-25 06:32:38.857991] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1665b70 00:16:25.525 [2024-07-25 06:32:38.858092] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1824230 00:16:25.525 [2024-07-25 06:32:38.858101] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1824230 00:16:25.525 [2024-07-25 06:32:38.858253] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:25.525 NewBaseBdev 00:16:25.525 06:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:25.526 06:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:25.526 06:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:25.526 06:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:25.526 06:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:25.526 06:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:25.526 06:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:25.526 06:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:25.785 [ 00:16:25.785 { 00:16:25.785 "name": "NewBaseBdev", 00:16:25.785 "aliases": [ 00:16:25.785 "f4ba90a8-522c-4f9b-9711-afea666e17c9" 00:16:25.785 ], 00:16:25.785 "product_name": "Malloc disk", 00:16:25.785 "block_size": 512, 00:16:25.785 "num_blocks": 65536, 00:16:25.785 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:25.785 "assigned_rate_limits": { 00:16:25.785 "rw_ios_per_sec": 0, 00:16:25.785 "rw_mbytes_per_sec": 0, 00:16:25.785 "r_mbytes_per_sec": 0, 00:16:25.785 "w_mbytes_per_sec": 0 00:16:25.785 }, 00:16:25.785 "claimed": true, 00:16:25.785 "claim_type": "exclusive_write", 00:16:25.785 "zoned": false, 00:16:25.785 "supported_io_types": { 00:16:25.785 "read": true, 00:16:25.785 "write": true, 00:16:25.785 "unmap": true, 00:16:25.785 "flush": true, 00:16:25.785 "reset": true, 00:16:25.785 "nvme_admin": false, 00:16:25.785 "nvme_io": false, 00:16:25.785 "nvme_io_md": false, 00:16:25.785 "write_zeroes": true, 00:16:25.785 "zcopy": true, 00:16:25.785 "get_zone_info": false, 00:16:25.785 "zone_management": false, 00:16:25.785 "zone_append": false, 00:16:25.785 "compare": false, 00:16:25.785 "compare_and_write": false, 00:16:25.785 "abort": true, 00:16:25.785 "seek_hole": false, 00:16:25.785 "seek_data": false, 00:16:25.785 "copy": true, 00:16:25.785 "nvme_iov_md": false 00:16:25.785 }, 00:16:25.785 "memory_domains": [ 00:16:25.785 { 00:16:25.785 "dma_device_id": "system", 00:16:25.785 "dma_device_type": 1 00:16:25.785 }, 00:16:25.785 { 00:16:25.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.785 "dma_device_type": 2 00:16:25.785 } 00:16:25.785 ], 00:16:25.785 "driver_specific": {} 00:16:25.785 } 00:16:25.785 ] 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.785 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.045 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.045 "name": "Existed_Raid", 00:16:26.045 "uuid": 
"46fcd3b7-9a6e-49e6-bf40-7d437c5eac5f", 00:16:26.045 "strip_size_kb": 64, 00:16:26.045 "state": "online", 00:16:26.045 "raid_level": "raid0", 00:16:26.045 "superblock": false, 00:16:26.045 "num_base_bdevs": 3, 00:16:26.045 "num_base_bdevs_discovered": 3, 00:16:26.045 "num_base_bdevs_operational": 3, 00:16:26.045 "base_bdevs_list": [ 00:16:26.045 { 00:16:26.045 "name": "NewBaseBdev", 00:16:26.045 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:26.045 "is_configured": true, 00:16:26.045 "data_offset": 0, 00:16:26.045 "data_size": 65536 00:16:26.045 }, 00:16:26.045 { 00:16:26.045 "name": "BaseBdev2", 00:16:26.045 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:26.045 "is_configured": true, 00:16:26.045 "data_offset": 0, 00:16:26.045 "data_size": 65536 00:16:26.045 }, 00:16:26.045 { 00:16:26.045 "name": "BaseBdev3", 00:16:26.045 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:26.045 "is_configured": true, 00:16:26.045 "data_offset": 0, 00:16:26.045 "data_size": 65536 00:16:26.045 } 00:16:26.045 ] 00:16:26.045 }' 00:16:26.045 06:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.045 06:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:26.613 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:26.873 [2024-07-25 06:32:40.289977] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:26.873 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:26.873 "name": "Existed_Raid", 00:16:26.873 "aliases": [ 00:16:26.873 "46fcd3b7-9a6e-49e6-bf40-7d437c5eac5f" 00:16:26.873 ], 00:16:26.873 "product_name": "Raid Volume", 00:16:26.873 "block_size": 512, 00:16:26.873 "num_blocks": 196608, 00:16:26.873 "uuid": "46fcd3b7-9a6e-49e6-bf40-7d437c5eac5f", 00:16:26.873 "assigned_rate_limits": { 00:16:26.873 "rw_ios_per_sec": 0, 00:16:26.873 "rw_mbytes_per_sec": 0, 00:16:26.873 "r_mbytes_per_sec": 0, 00:16:26.873 "w_mbytes_per_sec": 0 00:16:26.873 }, 00:16:26.873 "claimed": false, 00:16:26.873 "zoned": false, 00:16:26.873 "supported_io_types": { 00:16:26.873 "read": true, 00:16:26.873 "write": true, 00:16:26.873 "unmap": true, 00:16:26.873 "flush": true, 00:16:26.873 "reset": true, 00:16:26.873 "nvme_admin": false, 00:16:26.873 "nvme_io": false, 00:16:26.873 "nvme_io_md": false, 00:16:26.873 "write_zeroes": true, 00:16:26.873 "zcopy": false, 00:16:26.873 "get_zone_info": false, 00:16:26.873 "zone_management": false, 00:16:26.873 "zone_append": false, 00:16:26.873 "compare": false, 00:16:26.873 "compare_and_write": false, 00:16:26.873 "abort": 
false, 00:16:26.873 "seek_hole": false, 00:16:26.873 "seek_data": false, 00:16:26.873 "copy": false, 00:16:26.873 "nvme_iov_md": false 00:16:26.873 }, 00:16:26.873 "memory_domains": [ 00:16:26.873 { 00:16:26.873 "dma_device_id": "system", 00:16:26.873 "dma_device_type": 1 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.873 "dma_device_type": 2 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "dma_device_id": "system", 00:16:26.873 "dma_device_type": 1 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.873 "dma_device_type": 2 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "dma_device_id": "system", 00:16:26.873 "dma_device_type": 1 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.873 "dma_device_type": 2 00:16:26.873 } 00:16:26.873 ], 00:16:26.873 "driver_specific": { 00:16:26.873 "raid": { 00:16:26.873 "uuid": "46fcd3b7-9a6e-49e6-bf40-7d437c5eac5f", 00:16:26.873 "strip_size_kb": 64, 00:16:26.873 "state": "online", 00:16:26.873 "raid_level": "raid0", 00:16:26.873 "superblock": false, 00:16:26.873 "num_base_bdevs": 3, 00:16:26.873 "num_base_bdevs_discovered": 3, 00:16:26.873 "num_base_bdevs_operational": 3, 00:16:26.873 "base_bdevs_list": [ 00:16:26.873 { 00:16:26.873 "name": "NewBaseBdev", 00:16:26.873 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:26.873 "is_configured": true, 00:16:26.873 "data_offset": 0, 00:16:26.873 "data_size": 65536 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "name": "BaseBdev2", 00:16:26.873 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:26.873 "is_configured": true, 00:16:26.873 "data_offset": 0, 00:16:26.873 "data_size": 65536 00:16:26.873 }, 00:16:26.873 { 00:16:26.873 "name": "BaseBdev3", 00:16:26.873 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:26.873 "is_configured": true, 00:16:26.873 "data_offset": 0, 00:16:26.873 "data_size": 65536 00:16:26.873 } 00:16:26.873 ] 00:16:26.873 } 00:16:26.873 } 00:16:26.873 }' 00:16:26.873 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:26.873 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:26.873 BaseBdev2 00:16:26.873 BaseBdev3' 00:16:26.873 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.873 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:26.873 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:27.133 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:27.133 "name": "NewBaseBdev", 00:16:27.133 "aliases": [ 00:16:27.133 "f4ba90a8-522c-4f9b-9711-afea666e17c9" 00:16:27.133 ], 00:16:27.133 "product_name": "Malloc disk", 00:16:27.133 "block_size": 512, 00:16:27.133 "num_blocks": 65536, 00:16:27.133 "uuid": "f4ba90a8-522c-4f9b-9711-afea666e17c9", 00:16:27.133 "assigned_rate_limits": { 00:16:27.133 "rw_ios_per_sec": 0, 00:16:27.133 "rw_mbytes_per_sec": 0, 00:16:27.133 "r_mbytes_per_sec": 0, 00:16:27.133 "w_mbytes_per_sec": 0 00:16:27.133 }, 00:16:27.133 "claimed": true, 00:16:27.133 "claim_type": "exclusive_write", 00:16:27.133 "zoned": false, 00:16:27.133 "supported_io_types": { 00:16:27.133 "read": true, 
00:16:27.133 "write": true, 00:16:27.133 "unmap": true, 00:16:27.133 "flush": true, 00:16:27.133 "reset": true, 00:16:27.133 "nvme_admin": false, 00:16:27.133 "nvme_io": false, 00:16:27.133 "nvme_io_md": false, 00:16:27.133 "write_zeroes": true, 00:16:27.133 "zcopy": true, 00:16:27.133 "get_zone_info": false, 00:16:27.133 "zone_management": false, 00:16:27.133 "zone_append": false, 00:16:27.133 "compare": false, 00:16:27.133 "compare_and_write": false, 00:16:27.133 "abort": true, 00:16:27.134 "seek_hole": false, 00:16:27.134 "seek_data": false, 00:16:27.134 "copy": true, 00:16:27.134 "nvme_iov_md": false 00:16:27.134 }, 00:16:27.134 "memory_domains": [ 00:16:27.134 { 00:16:27.134 "dma_device_id": "system", 00:16:27.134 "dma_device_type": 1 00:16:27.134 }, 00:16:27.134 { 00:16:27.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.134 "dma_device_type": 2 00:16:27.134 } 00:16:27.134 ], 00:16:27.134 "driver_specific": {} 00:16:27.134 }' 00:16:27.134 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.134 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.134 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:27.134 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:27.393 06:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:27.653 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:27.653 "name": "BaseBdev2", 00:16:27.653 "aliases": [ 00:16:27.653 "bc268baa-cf83-4757-b601-a4bfbcf8b0d8" 00:16:27.653 ], 00:16:27.653 "product_name": "Malloc disk", 00:16:27.653 "block_size": 512, 00:16:27.653 "num_blocks": 65536, 00:16:27.653 "uuid": "bc268baa-cf83-4757-b601-a4bfbcf8b0d8", 00:16:27.653 "assigned_rate_limits": { 00:16:27.653 "rw_ios_per_sec": 0, 00:16:27.653 "rw_mbytes_per_sec": 0, 00:16:27.653 "r_mbytes_per_sec": 0, 00:16:27.653 "w_mbytes_per_sec": 0 00:16:27.653 }, 00:16:27.653 "claimed": true, 00:16:27.653 "claim_type": "exclusive_write", 00:16:27.653 "zoned": false, 00:16:27.653 "supported_io_types": { 00:16:27.653 "read": true, 00:16:27.653 "write": true, 00:16:27.653 "unmap": true, 00:16:27.653 "flush": true, 00:16:27.653 "reset": true, 00:16:27.653 "nvme_admin": false, 00:16:27.653 "nvme_io": false, 
00:16:27.653 "nvme_io_md": false, 00:16:27.653 "write_zeroes": true, 00:16:27.653 "zcopy": true, 00:16:27.653 "get_zone_info": false, 00:16:27.653 "zone_management": false, 00:16:27.653 "zone_append": false, 00:16:27.653 "compare": false, 00:16:27.653 "compare_and_write": false, 00:16:27.653 "abort": true, 00:16:27.653 "seek_hole": false, 00:16:27.653 "seek_data": false, 00:16:27.653 "copy": true, 00:16:27.653 "nvme_iov_md": false 00:16:27.653 }, 00:16:27.653 "memory_domains": [ 00:16:27.653 { 00:16:27.653 "dma_device_id": "system", 00:16:27.653 "dma_device_type": 1 00:16:27.653 }, 00:16:27.653 { 00:16:27.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.653 "dma_device_type": 2 00:16:27.653 } 00:16:27.653 ], 00:16:27.653 "driver_specific": {} 00:16:27.653 }' 00:16:27.653 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.912 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.172 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.172 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.172 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:28.172 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:28.431 "name": "BaseBdev3", 00:16:28.431 "aliases": [ 00:16:28.431 "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7" 00:16:28.431 ], 00:16:28.431 "product_name": "Malloc disk", 00:16:28.431 "block_size": 512, 00:16:28.431 "num_blocks": 65536, 00:16:28.431 "uuid": "e1cdaa6a-787a-487c-a7ea-b20742d1b1d7", 00:16:28.431 "assigned_rate_limits": { 00:16:28.431 "rw_ios_per_sec": 0, 00:16:28.431 "rw_mbytes_per_sec": 0, 00:16:28.431 "r_mbytes_per_sec": 0, 00:16:28.431 "w_mbytes_per_sec": 0 00:16:28.431 }, 00:16:28.431 "claimed": true, 00:16:28.431 "claim_type": "exclusive_write", 00:16:28.431 "zoned": false, 00:16:28.431 "supported_io_types": { 00:16:28.431 "read": true, 00:16:28.431 "write": true, 00:16:28.431 "unmap": true, 00:16:28.431 "flush": true, 00:16:28.431 "reset": true, 00:16:28.431 "nvme_admin": false, 00:16:28.431 "nvme_io": false, 00:16:28.431 "nvme_io_md": false, 00:16:28.431 "write_zeroes": true, 00:16:28.431 "zcopy": true, 00:16:28.431 "get_zone_info": false, 00:16:28.431 "zone_management": false, 
00:16:28.431 "zone_append": false, 00:16:28.431 "compare": false, 00:16:28.431 "compare_and_write": false, 00:16:28.431 "abort": true, 00:16:28.431 "seek_hole": false, 00:16:28.431 "seek_data": false, 00:16:28.431 "copy": true, 00:16:28.431 "nvme_iov_md": false 00:16:28.431 }, 00:16:28.431 "memory_domains": [ 00:16:28.431 { 00:16:28.431 "dma_device_id": "system", 00:16:28.431 "dma_device_type": 1 00:16:28.431 }, 00:16:28.431 { 00:16:28.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.431 "dma_device_type": 2 00:16:28.431 } 00:16:28.431 ], 00:16:28.431 "driver_specific": {} 00:16:28.431 }' 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.431 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.691 06:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:28.691 06:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.691 06:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.691 06:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.691 06:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:28.951 [2024-07-25 06:32:42.287005] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:28.951 [2024-07-25 06:32:42.287028] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:28.951 [2024-07-25 06:32:42.287074] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:28.951 [2024-07-25 06:32:42.287122] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:28.951 [2024-07-25 06:32:42.287132] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1824230 name Existed_Raid, state offline 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1119210 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1119210 ']' 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1119210 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1119210 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:28.951 06:32:42 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1119210' 00:16:28.951 killing process with pid 1119210 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1119210 00:16:28.951 [2024-07-25 06:32:42.366593] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:28.951 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1119210 00:16:28.951 [2024-07-25 06:32:42.390056] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:29.211 00:16:29.211 real 0m26.846s 00:16:29.211 user 0m49.099s 00:16:29.211 sys 0m4.962s 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.211 ************************************ 00:16:29.211 END TEST raid_state_function_test 00:16:29.211 ************************************ 00:16:29.211 06:32:42 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:16:29.211 06:32:42 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:29.211 06:32:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:29.211 06:32:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:29.211 ************************************ 00:16:29.211 START TEST raid_state_function_test_sb 00:16:29.211 ************************************ 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 true 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
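
Before the _sb variant set up below starts issuing RPCs, the plain raid_state_function_test run closes out the way the preceding entries show: the raid bdev is deleted over RPC (the target logs the online-to-offline transition and frees its base bdevs) and the bdev_svc process, pid 1119210 in this run, is killed and reaped. A compact sketch of that teardown; killprocess in autotest_common.sh additionally matches the process name before killing, which is elided here:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Drop the array; the target logs the offline transition and releases the base bdevs.
$rpc bdev_raid_delete Existed_Raid

# Stop the bdev_svc app that served the RPCs (pid taken from this run's log).
raid_pid=1119210
if kill -0 "$raid_pid" 2>/dev/null; then
    kill "$raid_pid"
    wait "$raid_pid" 2>/dev/null || true
fi
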
00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1124316 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1124316' 00:16:29.211 Process raid pid: 1124316 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1124316 /var/tmp/spdk-raid.sock 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1124316 ']' 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:29.211 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:29.211 06:32:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:29.211 [2024-07-25 06:32:42.718804] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
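
The _sb variant drives the same state machine but with superblocks enabled, so it first brings up its own RPC target: the entries above launch bdev_svc on the raid socket with bdev_raid debug logging, record the pid (1124316), and waitforlisten on the socket before the first bdev_raid_create -s is sent. A minimal sketch of that launch, with the wait written out as a plain RPC poll; the real waitforlisten helper does more (PID and socket checks) than this loop:

svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Start the bdev service app with raid debug logs on the test socket.
$svc -r "$sock" -i 0 -L bdev_raid &
raid_pid=$!
echo "Process raid pid: $raid_pid"

# Simplified wait: poll until the app answers RPCs on the socket.
until $rpc -s "$sock" rpc_get_methods >/dev/null 2>&1; do
    kill -0 "$raid_pid" 2>/dev/null || exit 1   # give up if the app died during startup
    sleep 0.5
done
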
00:16:29.211 [2024-07-25 06:32:42.718862] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.0 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.1 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.2 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.3 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.4 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.5 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.6 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:01.7 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.0 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.1 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.2 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.3 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.4 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.5 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.6 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3d:02.7 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.0 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.1 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.2 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.3 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.4 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.5 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.6 cannot be used 00:16:29.471 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:01.7 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.0 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.1 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.2 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.3 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.4 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.5 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.6 cannot be used 00:16:29.471 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:29.471 EAL: Requested device 0000:3f:02.7 cannot be used 00:16:29.471 [2024-07-25 06:32:42.856757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:29.471 [2024-07-25 06:32:42.900148] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:29.471 [2024-07-25 06:32:42.972063] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:29.471 [2024-07-25 06:32:42.972119] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:30.408 [2024-07-25 06:32:43.818435] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:30.408 [2024-07-25 06:32:43.818471] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:30.408 [2024-07-25 06:32:43.818481] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:30.408 [2024-07-25 06:32:43.818492] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:30.408 [2024-07-25 06:32:43.818500] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:30.408 [2024-07-25 06:32:43.818511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.408 06:32:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:30.668 06:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.668 "name": "Existed_Raid", 00:16:30.668 "uuid": "05d0eef9-5543-4a06-a0c5-75043dc27856", 00:16:30.668 "strip_size_kb": 64, 00:16:30.668 "state": "configuring", 00:16:30.668 "raid_level": "raid0", 00:16:30.668 "superblock": true, 00:16:30.668 "num_base_bdevs": 3, 00:16:30.668 "num_base_bdevs_discovered": 0, 00:16:30.668 "num_base_bdevs_operational": 3, 00:16:30.668 "base_bdevs_list": [ 00:16:30.668 { 00:16:30.668 "name": "BaseBdev1", 00:16:30.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.668 "is_configured": false, 00:16:30.668 "data_offset": 0, 00:16:30.668 "data_size": 0 00:16:30.668 }, 00:16:30.668 { 00:16:30.668 "name": "BaseBdev2", 00:16:30.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.668 "is_configured": false, 00:16:30.668 "data_offset": 0, 00:16:30.668 "data_size": 0 00:16:30.668 }, 00:16:30.668 { 00:16:30.668 "name": "BaseBdev3", 00:16:30.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.668 "is_configured": false, 00:16:30.668 "data_offset": 0, 00:16:30.668 "data_size": 0 00:16:30.668 } 00:16:30.668 ] 00:16:30.668 }' 00:16:30.668 06:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.668 06:32:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.235 06:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.494 [2024-07-25 06:32:44.836958] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.494 [2024-07-25 06:32:44.836986] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x235f470 name Existed_Raid, state configuring 00:16:31.494 06:32:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:31.754 [2024-07-25 06:32:45.069592] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:31.754 [2024-07-25 06:32:45.069619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:31.754 [2024-07-25 06:32:45.069628] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:31.754 [2024-07-25 06:32:45.069639] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:31.754 [2024-07-25 06:32:45.069647] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:31.754 [2024-07-25 06:32:45.069658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:31.754 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:31.754 [2024-07-25 06:32:45.307686] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.754 BaseBdev1 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.013 06:32:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:32.273 [ 00:16:32.273 { 00:16:32.273 "name": "BaseBdev1", 00:16:32.273 "aliases": [ 00:16:32.273 "19c436e3-ca5a-42dd-83ce-d660ab68d4ef" 00:16:32.273 ], 00:16:32.273 "product_name": "Malloc disk", 00:16:32.273 "block_size": 512, 00:16:32.273 "num_blocks": 65536, 00:16:32.273 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:32.273 "assigned_rate_limits": { 00:16:32.273 "rw_ios_per_sec": 0, 00:16:32.273 "rw_mbytes_per_sec": 0, 00:16:32.273 "r_mbytes_per_sec": 0, 00:16:32.273 "w_mbytes_per_sec": 0 00:16:32.273 }, 00:16:32.273 "claimed": true, 00:16:32.273 "claim_type": "exclusive_write", 00:16:32.273 "zoned": false, 00:16:32.273 "supported_io_types": { 00:16:32.273 "read": true, 00:16:32.273 "write": true, 00:16:32.273 "unmap": true, 00:16:32.273 "flush": true, 00:16:32.273 "reset": true, 00:16:32.273 "nvme_admin": false, 00:16:32.273 "nvme_io": false, 00:16:32.273 "nvme_io_md": false, 00:16:32.273 "write_zeroes": true, 00:16:32.273 "zcopy": true, 00:16:32.273 "get_zone_info": false, 00:16:32.273 "zone_management": false, 00:16:32.273 "zone_append": false, 00:16:32.273 "compare": false, 00:16:32.273 "compare_and_write": false, 00:16:32.273 "abort": true, 00:16:32.273 "seek_hole": false, 00:16:32.273 "seek_data": false, 00:16:32.273 "copy": true, 00:16:32.273 "nvme_iov_md": false 00:16:32.273 }, 00:16:32.273 "memory_domains": [ 00:16:32.273 { 00:16:32.273 "dma_device_id": "system", 00:16:32.273 "dma_device_type": 1 00:16:32.273 }, 00:16:32.273 { 00:16:32.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.273 "dma_device_type": 2 00:16:32.273 } 00:16:32.273 ], 00:16:32.273 "driver_specific": {} 00:16:32.273 } 00:16:32.273 ] 00:16:32.273 06:32:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.273 06:32:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.533 06:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.533 "name": "Existed_Raid", 00:16:32.533 "uuid": "24832e54-3acf-4961-9789-db21ad3113d8", 00:16:32.533 "strip_size_kb": 64, 00:16:32.533 "state": "configuring", 00:16:32.533 "raid_level": "raid0", 00:16:32.533 "superblock": true, 00:16:32.533 "num_base_bdevs": 3, 00:16:32.533 "num_base_bdevs_discovered": 1, 00:16:32.533 "num_base_bdevs_operational": 3, 00:16:32.533 "base_bdevs_list": [ 00:16:32.533 { 00:16:32.533 "name": "BaseBdev1", 00:16:32.533 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:32.533 "is_configured": true, 00:16:32.533 "data_offset": 2048, 00:16:32.533 "data_size": 63488 00:16:32.533 }, 00:16:32.533 { 00:16:32.533 "name": "BaseBdev2", 00:16:32.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.533 "is_configured": false, 00:16:32.533 "data_offset": 0, 00:16:32.533 "data_size": 0 00:16:32.533 }, 00:16:32.533 { 00:16:32.533 "name": "BaseBdev3", 00:16:32.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.533 "is_configured": false, 00:16:32.533 "data_offset": 0, 00:16:32.533 "data_size": 0 00:16:32.533 } 00:16:32.533 ] 00:16:32.533 }' 00:16:32.533 06:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.533 06:32:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.102 06:32:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:33.361 [2024-07-25 06:32:46.799637] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:33.361 [2024-07-25 06:32:46.799677] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x235ece0 name Existed_Raid, state configuring 00:16:33.361 06:32:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:33.620 [2024-07-25 06:32:47.028283] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:33.620 [2024-07-25 06:32:47.029693] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:33.620 [2024-07-25 06:32:47.029724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:33.620 [2024-07-25 06:32:47.029733] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:33.620 [2024-07-25 06:32:47.029744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.620 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.621 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.621 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.880 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.880 "name": "Existed_Raid", 00:16:33.880 "uuid": "0e7bc994-4537-4c48-babb-8da57e7d02d4", 00:16:33.880 "strip_size_kb": 64, 00:16:33.880 "state": "configuring", 00:16:33.880 "raid_level": "raid0", 00:16:33.880 "superblock": true, 00:16:33.880 "num_base_bdevs": 3, 00:16:33.880 "num_base_bdevs_discovered": 1, 00:16:33.880 "num_base_bdevs_operational": 3, 00:16:33.880 "base_bdevs_list": [ 00:16:33.880 { 00:16:33.880 "name": "BaseBdev1", 00:16:33.880 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:33.880 "is_configured": true, 00:16:33.880 "data_offset": 2048, 00:16:33.880 "data_size": 63488 00:16:33.880 }, 00:16:33.880 { 00:16:33.880 "name": "BaseBdev2", 00:16:33.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.880 "is_configured": false, 00:16:33.880 "data_offset": 0, 
00:16:33.880 "data_size": 0 00:16:33.880 }, 00:16:33.880 { 00:16:33.880 "name": "BaseBdev3", 00:16:33.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.880 "is_configured": false, 00:16:33.880 "data_offset": 0, 00:16:33.880 "data_size": 0 00:16:33.880 } 00:16:33.880 ] 00:16:33.880 }' 00:16:33.880 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.880 06:32:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.446 06:32:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:34.706 [2024-07-25 06:32:48.090195] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:34.706 BaseBdev2 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:34.706 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.966 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:35.225 [ 00:16:35.225 { 00:16:35.225 "name": "BaseBdev2", 00:16:35.225 "aliases": [ 00:16:35.225 "51417cf0-1b8c-4f1c-95f2-294e497bb30a" 00:16:35.225 ], 00:16:35.225 "product_name": "Malloc disk", 00:16:35.225 "block_size": 512, 00:16:35.225 "num_blocks": 65536, 00:16:35.225 "uuid": "51417cf0-1b8c-4f1c-95f2-294e497bb30a", 00:16:35.225 "assigned_rate_limits": { 00:16:35.225 "rw_ios_per_sec": 0, 00:16:35.225 "rw_mbytes_per_sec": 0, 00:16:35.225 "r_mbytes_per_sec": 0, 00:16:35.225 "w_mbytes_per_sec": 0 00:16:35.225 }, 00:16:35.225 "claimed": true, 00:16:35.225 "claim_type": "exclusive_write", 00:16:35.225 "zoned": false, 00:16:35.225 "supported_io_types": { 00:16:35.225 "read": true, 00:16:35.225 "write": true, 00:16:35.225 "unmap": true, 00:16:35.225 "flush": true, 00:16:35.225 "reset": true, 00:16:35.225 "nvme_admin": false, 00:16:35.225 "nvme_io": false, 00:16:35.225 "nvme_io_md": false, 00:16:35.225 "write_zeroes": true, 00:16:35.225 "zcopy": true, 00:16:35.225 "get_zone_info": false, 00:16:35.225 "zone_management": false, 00:16:35.225 "zone_append": false, 00:16:35.225 "compare": false, 00:16:35.225 "compare_and_write": false, 00:16:35.225 "abort": true, 00:16:35.225 "seek_hole": false, 00:16:35.225 "seek_data": false, 00:16:35.225 "copy": true, 00:16:35.225 "nvme_iov_md": false 00:16:35.225 }, 00:16:35.225 "memory_domains": [ 00:16:35.225 { 00:16:35.225 "dma_device_id": "system", 00:16:35.225 "dma_device_type": 1 00:16:35.225 }, 00:16:35.225 { 00:16:35.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.225 "dma_device_type": 
2 00:16:35.225 } 00:16:35.225 ], 00:16:35.225 "driver_specific": {} 00:16:35.225 } 00:16:35.225 ] 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.225 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.226 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.226 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.226 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.226 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.226 "name": "Existed_Raid", 00:16:35.226 "uuid": "0e7bc994-4537-4c48-babb-8da57e7d02d4", 00:16:35.226 "strip_size_kb": 64, 00:16:35.226 "state": "configuring", 00:16:35.226 "raid_level": "raid0", 00:16:35.226 "superblock": true, 00:16:35.226 "num_base_bdevs": 3, 00:16:35.226 "num_base_bdevs_discovered": 2, 00:16:35.226 "num_base_bdevs_operational": 3, 00:16:35.226 "base_bdevs_list": [ 00:16:35.226 { 00:16:35.226 "name": "BaseBdev1", 00:16:35.226 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:35.226 "is_configured": true, 00:16:35.226 "data_offset": 2048, 00:16:35.226 "data_size": 63488 00:16:35.226 }, 00:16:35.226 { 00:16:35.226 "name": "BaseBdev2", 00:16:35.226 "uuid": "51417cf0-1b8c-4f1c-95f2-294e497bb30a", 00:16:35.226 "is_configured": true, 00:16:35.226 "data_offset": 2048, 00:16:35.226 "data_size": 63488 00:16:35.226 }, 00:16:35.226 { 00:16:35.226 "name": "BaseBdev3", 00:16:35.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.226 "is_configured": false, 00:16:35.226 "data_offset": 0, 00:16:35.226 "data_size": 0 00:16:35.226 } 00:16:35.226 ] 00:16:35.226 }' 00:16:35.226 06:32:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.226 06:32:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev3 00:16:36.164 [2024-07-25 06:32:49.561333] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:36.164 [2024-07-25 06:32:49.561481] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2512380 00:16:36.164 [2024-07-25 06:32:49.561494] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:36.164 [2024-07-25 06:32:49.561659] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2509550 00:16:36.164 [2024-07-25 06:32:49.561768] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2512380 00:16:36.164 [2024-07-25 06:32:49.561777] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2512380 00:16:36.164 [2024-07-25 06:32:49.561861] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:36.164 BaseBdev3 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:36.164 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.422 06:32:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:36.680 [ 00:16:36.680 { 00:16:36.680 "name": "BaseBdev3", 00:16:36.680 "aliases": [ 00:16:36.680 "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be" 00:16:36.680 ], 00:16:36.680 "product_name": "Malloc disk", 00:16:36.680 "block_size": 512, 00:16:36.680 "num_blocks": 65536, 00:16:36.680 "uuid": "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be", 00:16:36.680 "assigned_rate_limits": { 00:16:36.680 "rw_ios_per_sec": 0, 00:16:36.680 "rw_mbytes_per_sec": 0, 00:16:36.680 "r_mbytes_per_sec": 0, 00:16:36.680 "w_mbytes_per_sec": 0 00:16:36.680 }, 00:16:36.680 "claimed": true, 00:16:36.680 "claim_type": "exclusive_write", 00:16:36.680 "zoned": false, 00:16:36.680 "supported_io_types": { 00:16:36.680 "read": true, 00:16:36.680 "write": true, 00:16:36.680 "unmap": true, 00:16:36.680 "flush": true, 00:16:36.680 "reset": true, 00:16:36.680 "nvme_admin": false, 00:16:36.680 "nvme_io": false, 00:16:36.680 "nvme_io_md": false, 00:16:36.680 "write_zeroes": true, 00:16:36.680 "zcopy": true, 00:16:36.680 "get_zone_info": false, 00:16:36.680 "zone_management": false, 00:16:36.680 "zone_append": false, 00:16:36.680 "compare": false, 00:16:36.680 "compare_and_write": false, 00:16:36.680 "abort": true, 00:16:36.680 "seek_hole": false, 00:16:36.680 "seek_data": false, 00:16:36.680 "copy": true, 00:16:36.680 "nvme_iov_md": false 00:16:36.680 }, 00:16:36.680 "memory_domains": [ 00:16:36.680 { 00:16:36.680 "dma_device_id": "system", 00:16:36.680 "dma_device_type": 1 00:16:36.680 }, 00:16:36.680 { 00:16:36.680 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.680 "dma_device_type": 2 00:16:36.680 } 00:16:36.680 ], 00:16:36.680 "driver_specific": {} 00:16:36.680 } 00:16:36.680 ] 00:16:36.680 06:32:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:36.680 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.681 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.939 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.939 "name": "Existed_Raid", 00:16:36.939 "uuid": "0e7bc994-4537-4c48-babb-8da57e7d02d4", 00:16:36.939 "strip_size_kb": 64, 00:16:36.939 "state": "online", 00:16:36.939 "raid_level": "raid0", 00:16:36.939 "superblock": true, 00:16:36.939 "num_base_bdevs": 3, 00:16:36.939 "num_base_bdevs_discovered": 3, 00:16:36.939 "num_base_bdevs_operational": 3, 00:16:36.939 "base_bdevs_list": [ 00:16:36.939 { 00:16:36.939 "name": "BaseBdev1", 00:16:36.939 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:36.939 "is_configured": true, 00:16:36.939 "data_offset": 2048, 00:16:36.939 "data_size": 63488 00:16:36.939 }, 00:16:36.939 { 00:16:36.939 "name": "BaseBdev2", 00:16:36.939 "uuid": "51417cf0-1b8c-4f1c-95f2-294e497bb30a", 00:16:36.939 "is_configured": true, 00:16:36.939 "data_offset": 2048, 00:16:36.939 "data_size": 63488 00:16:36.939 }, 00:16:36.939 { 00:16:36.939 "name": "BaseBdev3", 00:16:36.939 "uuid": "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be", 00:16:36.939 "is_configured": true, 00:16:36.939 "data_offset": 2048, 00:16:36.939 "data_size": 63488 00:16:36.939 } 00:16:36.939 ] 00:16:36.939 }' 00:16:36.939 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.939 06:32:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:37.523 06:32:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:37.523 [2024-07-25 06:32:51.041511] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:37.523 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:37.523 "name": "Existed_Raid", 00:16:37.523 "aliases": [ 00:16:37.523 "0e7bc994-4537-4c48-babb-8da57e7d02d4" 00:16:37.523 ], 00:16:37.523 "product_name": "Raid Volume", 00:16:37.523 "block_size": 512, 00:16:37.523 "num_blocks": 190464, 00:16:37.523 "uuid": "0e7bc994-4537-4c48-babb-8da57e7d02d4", 00:16:37.523 "assigned_rate_limits": { 00:16:37.523 "rw_ios_per_sec": 0, 00:16:37.523 "rw_mbytes_per_sec": 0, 00:16:37.523 "r_mbytes_per_sec": 0, 00:16:37.523 "w_mbytes_per_sec": 0 00:16:37.523 }, 00:16:37.523 "claimed": false, 00:16:37.523 "zoned": false, 00:16:37.523 "supported_io_types": { 00:16:37.523 "read": true, 00:16:37.523 "write": true, 00:16:37.523 "unmap": true, 00:16:37.523 "flush": true, 00:16:37.523 "reset": true, 00:16:37.523 "nvme_admin": false, 00:16:37.523 "nvme_io": false, 00:16:37.523 "nvme_io_md": false, 00:16:37.523 "write_zeroes": true, 00:16:37.523 "zcopy": false, 00:16:37.523 "get_zone_info": false, 00:16:37.523 "zone_management": false, 00:16:37.523 "zone_append": false, 00:16:37.523 "compare": false, 00:16:37.523 "compare_and_write": false, 00:16:37.523 "abort": false, 00:16:37.523 "seek_hole": false, 00:16:37.523 "seek_data": false, 00:16:37.523 "copy": false, 00:16:37.523 "nvme_iov_md": false 00:16:37.523 }, 00:16:37.523 "memory_domains": [ 00:16:37.523 { 00:16:37.523 "dma_device_id": "system", 00:16:37.523 "dma_device_type": 1 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.523 "dma_device_type": 2 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "dma_device_id": "system", 00:16:37.523 "dma_device_type": 1 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.523 "dma_device_type": 2 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "dma_device_id": "system", 00:16:37.523 "dma_device_type": 1 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.523 "dma_device_type": 2 00:16:37.523 } 00:16:37.523 ], 00:16:37.523 "driver_specific": { 00:16:37.523 "raid": { 00:16:37.523 "uuid": "0e7bc994-4537-4c48-babb-8da57e7d02d4", 00:16:37.523 "strip_size_kb": 64, 00:16:37.523 "state": "online", 00:16:37.523 "raid_level": "raid0", 00:16:37.523 "superblock": true, 00:16:37.523 "num_base_bdevs": 3, 00:16:37.523 "num_base_bdevs_discovered": 3, 00:16:37.523 "num_base_bdevs_operational": 3, 00:16:37.523 "base_bdevs_list": [ 00:16:37.523 { 00:16:37.523 "name": "BaseBdev1", 
00:16:37.523 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:37.523 "is_configured": true, 00:16:37.523 "data_offset": 2048, 00:16:37.523 "data_size": 63488 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "name": "BaseBdev2", 00:16:37.523 "uuid": "51417cf0-1b8c-4f1c-95f2-294e497bb30a", 00:16:37.523 "is_configured": true, 00:16:37.523 "data_offset": 2048, 00:16:37.523 "data_size": 63488 00:16:37.523 }, 00:16:37.523 { 00:16:37.523 "name": "BaseBdev3", 00:16:37.523 "uuid": "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be", 00:16:37.523 "is_configured": true, 00:16:37.523 "data_offset": 2048, 00:16:37.523 "data_size": 63488 00:16:37.523 } 00:16:37.523 ] 00:16:37.523 } 00:16:37.523 } 00:16:37.523 }' 00:16:37.523 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:37.781 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:37.781 BaseBdev2 00:16:37.781 BaseBdev3' 00:16:37.781 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.781 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:37.781 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.039 "name": "BaseBdev1", 00:16:38.039 "aliases": [ 00:16:38.039 "19c436e3-ca5a-42dd-83ce-d660ab68d4ef" 00:16:38.039 ], 00:16:38.039 "product_name": "Malloc disk", 00:16:38.039 "block_size": 512, 00:16:38.039 "num_blocks": 65536, 00:16:38.039 "uuid": "19c436e3-ca5a-42dd-83ce-d660ab68d4ef", 00:16:38.039 "assigned_rate_limits": { 00:16:38.039 "rw_ios_per_sec": 0, 00:16:38.039 "rw_mbytes_per_sec": 0, 00:16:38.039 "r_mbytes_per_sec": 0, 00:16:38.039 "w_mbytes_per_sec": 0 00:16:38.039 }, 00:16:38.039 "claimed": true, 00:16:38.039 "claim_type": "exclusive_write", 00:16:38.039 "zoned": false, 00:16:38.039 "supported_io_types": { 00:16:38.039 "read": true, 00:16:38.039 "write": true, 00:16:38.039 "unmap": true, 00:16:38.039 "flush": true, 00:16:38.039 "reset": true, 00:16:38.039 "nvme_admin": false, 00:16:38.039 "nvme_io": false, 00:16:38.039 "nvme_io_md": false, 00:16:38.039 "write_zeroes": true, 00:16:38.039 "zcopy": true, 00:16:38.039 "get_zone_info": false, 00:16:38.039 "zone_management": false, 00:16:38.039 "zone_append": false, 00:16:38.039 "compare": false, 00:16:38.039 "compare_and_write": false, 00:16:38.039 "abort": true, 00:16:38.039 "seek_hole": false, 00:16:38.039 "seek_data": false, 00:16:38.039 "copy": true, 00:16:38.039 "nvme_iov_md": false 00:16:38.039 }, 00:16:38.039 "memory_domains": [ 00:16:38.039 { 00:16:38.039 "dma_device_id": "system", 00:16:38.039 "dma_device_type": 1 00:16:38.039 }, 00:16:38.039 { 00:16:38.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.039 "dma_device_type": 2 00:16:38.039 } 00:16:38.039 ], 00:16:38.039 "driver_specific": {} 00:16:38.039 }' 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.039 06:32:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.039 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:38.297 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.555 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.555 "name": "BaseBdev2", 00:16:38.555 "aliases": [ 00:16:38.555 "51417cf0-1b8c-4f1c-95f2-294e497bb30a" 00:16:38.555 ], 00:16:38.555 "product_name": "Malloc disk", 00:16:38.555 "block_size": 512, 00:16:38.555 "num_blocks": 65536, 00:16:38.555 "uuid": "51417cf0-1b8c-4f1c-95f2-294e497bb30a", 00:16:38.555 "assigned_rate_limits": { 00:16:38.555 "rw_ios_per_sec": 0, 00:16:38.555 "rw_mbytes_per_sec": 0, 00:16:38.555 "r_mbytes_per_sec": 0, 00:16:38.555 "w_mbytes_per_sec": 0 00:16:38.555 }, 00:16:38.555 "claimed": true, 00:16:38.555 "claim_type": "exclusive_write", 00:16:38.555 "zoned": false, 00:16:38.555 "supported_io_types": { 00:16:38.555 "read": true, 00:16:38.555 "write": true, 00:16:38.555 "unmap": true, 00:16:38.555 "flush": true, 00:16:38.555 "reset": true, 00:16:38.555 "nvme_admin": false, 00:16:38.555 "nvme_io": false, 00:16:38.555 "nvme_io_md": false, 00:16:38.555 "write_zeroes": true, 00:16:38.555 "zcopy": true, 00:16:38.555 "get_zone_info": false, 00:16:38.555 "zone_management": false, 00:16:38.555 "zone_append": false, 00:16:38.555 "compare": false, 00:16:38.555 "compare_and_write": false, 00:16:38.555 "abort": true, 00:16:38.555 "seek_hole": false, 00:16:38.555 "seek_data": false, 00:16:38.555 "copy": true, 00:16:38.555 "nvme_iov_md": false 00:16:38.555 }, 00:16:38.555 "memory_domains": [ 00:16:38.555 { 00:16:38.555 "dma_device_id": "system", 00:16:38.555 "dma_device_type": 1 00:16:38.555 }, 00:16:38.555 { 00:16:38.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.555 "dma_device_type": 2 00:16:38.555 } 00:16:38.555 ], 00:16:38.555 "driver_specific": {} 00:16:38.555 }' 00:16:38.555 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.555 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.555 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.555 06:32:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.555 06:32:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.555 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.555 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.555 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.813 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.072 "name": "BaseBdev3", 00:16:39.072 "aliases": [ 00:16:39.072 "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be" 00:16:39.072 ], 00:16:39.072 "product_name": "Malloc disk", 00:16:39.072 "block_size": 512, 00:16:39.072 "num_blocks": 65536, 00:16:39.072 "uuid": "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be", 00:16:39.072 "assigned_rate_limits": { 00:16:39.072 "rw_ios_per_sec": 0, 00:16:39.072 "rw_mbytes_per_sec": 0, 00:16:39.072 "r_mbytes_per_sec": 0, 00:16:39.072 "w_mbytes_per_sec": 0 00:16:39.072 }, 00:16:39.072 "claimed": true, 00:16:39.072 "claim_type": "exclusive_write", 00:16:39.072 "zoned": false, 00:16:39.072 "supported_io_types": { 00:16:39.072 "read": true, 00:16:39.072 "write": true, 00:16:39.072 "unmap": true, 00:16:39.072 "flush": true, 00:16:39.072 "reset": true, 00:16:39.072 "nvme_admin": false, 00:16:39.072 "nvme_io": false, 00:16:39.072 "nvme_io_md": false, 00:16:39.072 "write_zeroes": true, 00:16:39.072 "zcopy": true, 00:16:39.072 "get_zone_info": false, 00:16:39.072 "zone_management": false, 00:16:39.072 "zone_append": false, 00:16:39.072 "compare": false, 00:16:39.072 "compare_and_write": false, 00:16:39.072 "abort": true, 00:16:39.072 "seek_hole": false, 00:16:39.072 "seek_data": false, 00:16:39.072 "copy": true, 00:16:39.072 "nvme_iov_md": false 00:16:39.072 }, 00:16:39.072 "memory_domains": [ 00:16:39.072 { 00:16:39.072 "dma_device_id": "system", 00:16:39.072 "dma_device_type": 1 00:16:39.072 }, 00:16:39.072 { 00:16:39.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.072 "dma_device_type": 2 00:16:39.072 } 00:16:39.072 ], 00:16:39.072 "driver_specific": {} 00:16:39.072 }' 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.072 06:32:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.072 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.330 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.330 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.330 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:39.588 [2024-07-25 06:32:52.894155] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:39.588 [2024-07-25 06:32:52.894180] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:39.588 [2024-07-25 06:32:52.894221] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.588 06:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.846 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.846 "name": "Existed_Raid", 00:16:39.846 "uuid": "0e7bc994-4537-4c48-babb-8da57e7d02d4", 
00:16:39.846 "strip_size_kb": 64, 00:16:39.846 "state": "offline", 00:16:39.846 "raid_level": "raid0", 00:16:39.846 "superblock": true, 00:16:39.846 "num_base_bdevs": 3, 00:16:39.846 "num_base_bdevs_discovered": 2, 00:16:39.846 "num_base_bdevs_operational": 2, 00:16:39.846 "base_bdevs_list": [ 00:16:39.846 { 00:16:39.846 "name": null, 00:16:39.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.846 "is_configured": false, 00:16:39.846 "data_offset": 2048, 00:16:39.846 "data_size": 63488 00:16:39.846 }, 00:16:39.846 { 00:16:39.846 "name": "BaseBdev2", 00:16:39.846 "uuid": "51417cf0-1b8c-4f1c-95f2-294e497bb30a", 00:16:39.846 "is_configured": true, 00:16:39.846 "data_offset": 2048, 00:16:39.846 "data_size": 63488 00:16:39.846 }, 00:16:39.846 { 00:16:39.846 "name": "BaseBdev3", 00:16:39.846 "uuid": "b8b0cee2-4aaa-435f-8cf3-9baf1640e1be", 00:16:39.846 "is_configured": true, 00:16:39.846 "data_offset": 2048, 00:16:39.846 "data_size": 63488 00:16:39.846 } 00:16:39.846 ] 00:16:39.846 }' 00:16:39.846 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.846 06:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.410 06:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:40.668 [2024-07-25 06:32:54.150535] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:40.668 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:40.668 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:40.668 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.668 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:40.925 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.925 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.925 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:41.183 [2024-07-25 06:32:54.614113] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:41.183 [2024-07-25 06:32:54.614162] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2512380 name Existed_Raid, state offline 00:16:41.183 
06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.183 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.183 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.183 06:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:41.750 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:41.750 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:41.750 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:41.750 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:41.750 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:41.750 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:42.008 BaseBdev2 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.008 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:42.266 [ 00:16:42.267 { 00:16:42.267 "name": "BaseBdev2", 00:16:42.267 "aliases": [ 00:16:42.267 "de4cfd22-ca60-4eea-a638-4a8c229c283b" 00:16:42.267 ], 00:16:42.267 "product_name": "Malloc disk", 00:16:42.267 "block_size": 512, 00:16:42.267 "num_blocks": 65536, 00:16:42.267 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:42.267 "assigned_rate_limits": { 00:16:42.267 "rw_ios_per_sec": 0, 00:16:42.267 "rw_mbytes_per_sec": 0, 00:16:42.267 "r_mbytes_per_sec": 0, 00:16:42.267 "w_mbytes_per_sec": 0 00:16:42.267 }, 00:16:42.267 "claimed": false, 00:16:42.267 "zoned": false, 00:16:42.267 "supported_io_types": { 00:16:42.267 "read": true, 00:16:42.267 "write": true, 00:16:42.267 "unmap": true, 00:16:42.267 "flush": true, 00:16:42.267 "reset": true, 00:16:42.267 "nvme_admin": false, 00:16:42.267 "nvme_io": false, 00:16:42.267 "nvme_io_md": false, 00:16:42.267 "write_zeroes": true, 00:16:42.267 "zcopy": true, 00:16:42.267 "get_zone_info": false, 00:16:42.267 "zone_management": false, 00:16:42.267 "zone_append": false, 00:16:42.267 "compare": false, 00:16:42.267 "compare_and_write": false, 00:16:42.267 "abort": true, 
00:16:42.267 "seek_hole": false, 00:16:42.267 "seek_data": false, 00:16:42.267 "copy": true, 00:16:42.267 "nvme_iov_md": false 00:16:42.267 }, 00:16:42.267 "memory_domains": [ 00:16:42.267 { 00:16:42.267 "dma_device_id": "system", 00:16:42.267 "dma_device_type": 1 00:16:42.267 }, 00:16:42.267 { 00:16:42.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.267 "dma_device_type": 2 00:16:42.267 } 00:16:42.267 ], 00:16:42.267 "driver_specific": {} 00:16:42.267 } 00:16:42.267 ] 00:16:42.267 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:42.267 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.267 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.267 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:42.524 BaseBdev3 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:42.525 06:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.782 06:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:42.782 [ 00:16:42.782 { 00:16:42.782 "name": "BaseBdev3", 00:16:42.782 "aliases": [ 00:16:42.782 "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2" 00:16:42.782 ], 00:16:42.782 "product_name": "Malloc disk", 00:16:42.782 "block_size": 512, 00:16:42.782 "num_blocks": 65536, 00:16:42.782 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:42.782 "assigned_rate_limits": { 00:16:42.782 "rw_ios_per_sec": 0, 00:16:42.782 "rw_mbytes_per_sec": 0, 00:16:42.782 "r_mbytes_per_sec": 0, 00:16:42.782 "w_mbytes_per_sec": 0 00:16:42.782 }, 00:16:42.782 "claimed": false, 00:16:42.782 "zoned": false, 00:16:42.782 "supported_io_types": { 00:16:42.782 "read": true, 00:16:42.782 "write": true, 00:16:42.782 "unmap": true, 00:16:42.782 "flush": true, 00:16:42.782 "reset": true, 00:16:42.782 "nvme_admin": false, 00:16:42.782 "nvme_io": false, 00:16:42.782 "nvme_io_md": false, 00:16:42.782 "write_zeroes": true, 00:16:42.782 "zcopy": true, 00:16:42.782 "get_zone_info": false, 00:16:42.782 "zone_management": false, 00:16:42.782 "zone_append": false, 00:16:42.782 "compare": false, 00:16:42.782 "compare_and_write": false, 00:16:42.782 "abort": true, 00:16:42.783 "seek_hole": false, 00:16:42.783 "seek_data": false, 00:16:42.783 "copy": true, 00:16:42.783 "nvme_iov_md": false 00:16:42.783 }, 00:16:42.783 "memory_domains": [ 00:16:42.783 { 00:16:42.783 "dma_device_id": "system", 00:16:42.783 
"dma_device_type": 1 00:16:42.783 }, 00:16:42.783 { 00:16:42.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.783 "dma_device_type": 2 00:16:42.783 } 00:16:42.783 ], 00:16:42.783 "driver_specific": {} 00:16:42.783 } 00:16:42.783 ] 00:16:42.783 06:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:42.783 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.783 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.783 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:43.041 [2024-07-25 06:32:56.517697] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:43.041 [2024-07-25 06:32:56.517733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:43.041 [2024-07-25 06:32:56.517750] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:43.041 [2024-07-25 06:32:56.518956] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.041 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.299 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.299 "name": "Existed_Raid", 00:16:43.299 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:43.299 "strip_size_kb": 64, 00:16:43.299 "state": "configuring", 00:16:43.299 "raid_level": "raid0", 00:16:43.299 "superblock": true, 00:16:43.299 "num_base_bdevs": 3, 00:16:43.299 "num_base_bdevs_discovered": 2, 00:16:43.299 "num_base_bdevs_operational": 3, 00:16:43.299 "base_bdevs_list": [ 00:16:43.299 { 00:16:43.299 "name": "BaseBdev1", 00:16:43.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.299 "is_configured": false, 00:16:43.299 "data_offset": 0, 
00:16:43.299 "data_size": 0 00:16:43.299 }, 00:16:43.299 { 00:16:43.299 "name": "BaseBdev2", 00:16:43.299 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:43.299 "is_configured": true, 00:16:43.299 "data_offset": 2048, 00:16:43.299 "data_size": 63488 00:16:43.299 }, 00:16:43.299 { 00:16:43.299 "name": "BaseBdev3", 00:16:43.299 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:43.299 "is_configured": true, 00:16:43.299 "data_offset": 2048, 00:16:43.299 "data_size": 63488 00:16:43.299 } 00:16:43.299 ] 00:16:43.299 }' 00:16:43.299 06:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.299 06:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.864 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:43.864 [2024-07-25 06:32:57.403997] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.123 "name": "Existed_Raid", 00:16:44.123 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:44.123 "strip_size_kb": 64, 00:16:44.123 "state": "configuring", 00:16:44.123 "raid_level": "raid0", 00:16:44.123 "superblock": true, 00:16:44.123 "num_base_bdevs": 3, 00:16:44.123 "num_base_bdevs_discovered": 1, 00:16:44.123 "num_base_bdevs_operational": 3, 00:16:44.123 "base_bdevs_list": [ 00:16:44.123 { 00:16:44.123 "name": "BaseBdev1", 00:16:44.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.123 "is_configured": false, 00:16:44.123 "data_offset": 0, 00:16:44.123 "data_size": 0 00:16:44.123 }, 00:16:44.123 { 00:16:44.123 "name": null, 00:16:44.123 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:44.123 "is_configured": false, 00:16:44.123 "data_offset": 2048, 00:16:44.123 "data_size": 63488 00:16:44.123 }, 00:16:44.123 { 
00:16:44.123 "name": "BaseBdev3", 00:16:44.123 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:44.123 "is_configured": true, 00:16:44.123 "data_offset": 2048, 00:16:44.123 "data_size": 63488 00:16:44.123 } 00:16:44.123 ] 00:16:44.123 }' 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.123 06:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.690 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.690 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:44.951 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:44.951 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:45.209 [2024-07-25 06:32:58.582198] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:45.209 BaseBdev1 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.209 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:45.466 [ 00:16:45.466 { 00:16:45.466 "name": "BaseBdev1", 00:16:45.466 "aliases": [ 00:16:45.466 "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e" 00:16:45.466 ], 00:16:45.466 "product_name": "Malloc disk", 00:16:45.466 "block_size": 512, 00:16:45.466 "num_blocks": 65536, 00:16:45.466 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:45.466 "assigned_rate_limits": { 00:16:45.466 "rw_ios_per_sec": 0, 00:16:45.466 "rw_mbytes_per_sec": 0, 00:16:45.466 "r_mbytes_per_sec": 0, 00:16:45.466 "w_mbytes_per_sec": 0 00:16:45.466 }, 00:16:45.466 "claimed": true, 00:16:45.466 "claim_type": "exclusive_write", 00:16:45.466 "zoned": false, 00:16:45.466 "supported_io_types": { 00:16:45.466 "read": true, 00:16:45.466 "write": true, 00:16:45.466 "unmap": true, 00:16:45.466 "flush": true, 00:16:45.466 "reset": true, 00:16:45.467 "nvme_admin": false, 00:16:45.467 "nvme_io": false, 00:16:45.467 "nvme_io_md": false, 00:16:45.467 "write_zeroes": true, 00:16:45.467 "zcopy": true, 00:16:45.467 "get_zone_info": false, 00:16:45.467 "zone_management": false, 00:16:45.467 "zone_append": false, 00:16:45.467 "compare": false, 00:16:45.467 "compare_and_write": false, 
00:16:45.467 "abort": true, 00:16:45.467 "seek_hole": false, 00:16:45.467 "seek_data": false, 00:16:45.467 "copy": true, 00:16:45.467 "nvme_iov_md": false 00:16:45.467 }, 00:16:45.467 "memory_domains": [ 00:16:45.467 { 00:16:45.467 "dma_device_id": "system", 00:16:45.467 "dma_device_type": 1 00:16:45.467 }, 00:16:45.467 { 00:16:45.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.467 "dma_device_type": 2 00:16:45.467 } 00:16:45.467 ], 00:16:45.467 "driver_specific": {} 00:16:45.467 } 00:16:45.467 ] 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.467 06:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.725 06:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.725 "name": "Existed_Raid", 00:16:45.725 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:45.725 "strip_size_kb": 64, 00:16:45.725 "state": "configuring", 00:16:45.725 "raid_level": "raid0", 00:16:45.725 "superblock": true, 00:16:45.725 "num_base_bdevs": 3, 00:16:45.725 "num_base_bdevs_discovered": 2, 00:16:45.725 "num_base_bdevs_operational": 3, 00:16:45.725 "base_bdevs_list": [ 00:16:45.725 { 00:16:45.725 "name": "BaseBdev1", 00:16:45.725 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:45.725 "is_configured": true, 00:16:45.725 "data_offset": 2048, 00:16:45.725 "data_size": 63488 00:16:45.725 }, 00:16:45.725 { 00:16:45.725 "name": null, 00:16:45.725 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:45.725 "is_configured": false, 00:16:45.725 "data_offset": 2048, 00:16:45.725 "data_size": 63488 00:16:45.725 }, 00:16:45.725 { 00:16:45.725 "name": "BaseBdev3", 00:16:45.725 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:45.725 "is_configured": true, 00:16:45.725 "data_offset": 2048, 00:16:45.725 "data_size": 63488 00:16:45.725 } 00:16:45.725 ] 00:16:45.725 }' 00:16:45.725 06:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.725 06:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:16:46.292 06:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.292 06:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:46.549 06:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:46.549 06:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:46.808 [2024-07-25 06:33:00.186434] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.808 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.066 06:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.066 "name": "Existed_Raid", 00:16:47.066 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:47.066 "strip_size_kb": 64, 00:16:47.066 "state": "configuring", 00:16:47.066 "raid_level": "raid0", 00:16:47.066 "superblock": true, 00:16:47.066 "num_base_bdevs": 3, 00:16:47.066 "num_base_bdevs_discovered": 1, 00:16:47.066 "num_base_bdevs_operational": 3, 00:16:47.066 "base_bdevs_list": [ 00:16:47.066 { 00:16:47.066 "name": "BaseBdev1", 00:16:47.066 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:47.066 "is_configured": true, 00:16:47.066 "data_offset": 2048, 00:16:47.066 "data_size": 63488 00:16:47.066 }, 00:16:47.066 { 00:16:47.066 "name": null, 00:16:47.066 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:47.066 "is_configured": false, 00:16:47.066 "data_offset": 2048, 00:16:47.066 "data_size": 63488 00:16:47.066 }, 00:16:47.066 { 00:16:47.066 "name": null, 00:16:47.066 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:47.066 "is_configured": false, 00:16:47.066 "data_offset": 2048, 00:16:47.066 "data_size": 63488 00:16:47.066 } 00:16:47.066 ] 00:16:47.066 }' 00:16:47.066 06:33:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.066 06:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.635 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.635 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:47.894 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:47.894 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:47.894 [2024-07-25 06:33:01.445756] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.153 "name": "Existed_Raid", 00:16:48.153 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:48.153 "strip_size_kb": 64, 00:16:48.153 "state": "configuring", 00:16:48.153 "raid_level": "raid0", 00:16:48.153 "superblock": true, 00:16:48.153 "num_base_bdevs": 3, 00:16:48.153 "num_base_bdevs_discovered": 2, 00:16:48.153 "num_base_bdevs_operational": 3, 00:16:48.153 "base_bdevs_list": [ 00:16:48.153 { 00:16:48.153 "name": "BaseBdev1", 00:16:48.153 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:48.153 "is_configured": true, 00:16:48.153 "data_offset": 2048, 00:16:48.153 "data_size": 63488 00:16:48.153 }, 00:16:48.153 { 00:16:48.153 "name": null, 00:16:48.153 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:48.153 "is_configured": false, 00:16:48.153 "data_offset": 2048, 00:16:48.153 "data_size": 63488 00:16:48.153 }, 00:16:48.153 { 00:16:48.153 "name": "BaseBdev3", 00:16:48.153 "uuid": 
"72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:48.153 "is_configured": true, 00:16:48.153 "data_offset": 2048, 00:16:48.153 "data_size": 63488 00:16:48.153 } 00:16:48.153 ] 00:16:48.153 }' 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.153 06:33:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:49.089 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.089 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:49.089 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:49.089 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:49.348 [2024-07-25 06:33:02.721161] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.348 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.607 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.607 "name": "Existed_Raid", 00:16:49.607 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:49.607 "strip_size_kb": 64, 00:16:49.607 "state": "configuring", 00:16:49.607 "raid_level": "raid0", 00:16:49.607 "superblock": true, 00:16:49.607 "num_base_bdevs": 3, 00:16:49.607 "num_base_bdevs_discovered": 1, 00:16:49.607 "num_base_bdevs_operational": 3, 00:16:49.607 "base_bdevs_list": [ 00:16:49.607 { 00:16:49.607 "name": null, 00:16:49.607 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:49.607 "is_configured": false, 00:16:49.607 "data_offset": 2048, 00:16:49.607 "data_size": 63488 00:16:49.607 }, 00:16:49.607 { 00:16:49.607 "name": null, 00:16:49.607 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:49.607 "is_configured": 
false, 00:16:49.607 "data_offset": 2048, 00:16:49.607 "data_size": 63488 00:16:49.607 }, 00:16:49.607 { 00:16:49.607 "name": "BaseBdev3", 00:16:49.607 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:49.607 "is_configured": true, 00:16:49.607 "data_offset": 2048, 00:16:49.607 "data_size": 63488 00:16:49.607 } 00:16:49.607 ] 00:16:49.607 }' 00:16:49.607 06:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.607 06:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.175 06:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.175 06:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:50.494 06:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:50.494 06:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:50.753 [2024-07-25 06:33:04.070698] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.753 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:51.010 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.010 "name": "Existed_Raid", 00:16:51.010 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:51.010 "strip_size_kb": 64, 00:16:51.010 "state": "configuring", 00:16:51.011 "raid_level": "raid0", 00:16:51.011 "superblock": true, 00:16:51.011 "num_base_bdevs": 3, 00:16:51.011 "num_base_bdevs_discovered": 2, 00:16:51.011 "num_base_bdevs_operational": 3, 00:16:51.011 "base_bdevs_list": [ 00:16:51.011 { 00:16:51.011 "name": null, 00:16:51.011 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:51.011 "is_configured": false, 00:16:51.011 
"data_offset": 2048, 00:16:51.011 "data_size": 63488 00:16:51.011 }, 00:16:51.011 { 00:16:51.011 "name": "BaseBdev2", 00:16:51.011 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:51.011 "is_configured": true, 00:16:51.011 "data_offset": 2048, 00:16:51.011 "data_size": 63488 00:16:51.011 }, 00:16:51.011 { 00:16:51.011 "name": "BaseBdev3", 00:16:51.011 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:51.011 "is_configured": true, 00:16:51.011 "data_offset": 2048, 00:16:51.011 "data_size": 63488 00:16:51.011 } 00:16:51.011 ] 00:16:51.011 }' 00:16:51.011 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.011 06:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.578 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.578 06:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:51.578 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:51.578 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.578 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:51.837 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e 00:16:52.096 [2024-07-25 06:33:05.505684] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:52.096 [2024-07-25 06:33:05.505820] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x250a310 00:16:52.096 [2024-07-25 06:33:05.505832] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:52.096 [2024-07-25 06:33:05.505989] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2509e10 00:16:52.096 [2024-07-25 06:33:05.506091] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250a310 00:16:52.096 [2024-07-25 06:33:05.506105] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250a310 00:16:52.096 [2024-07-25 06:33:05.506202] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:52.096 NewBaseBdev 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:52.096 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.354 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:52.613 [ 00:16:52.613 { 00:16:52.613 "name": "NewBaseBdev", 00:16:52.613 "aliases": [ 00:16:52.613 "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e" 00:16:52.613 ], 00:16:52.613 "product_name": "Malloc disk", 00:16:52.613 "block_size": 512, 00:16:52.613 "num_blocks": 65536, 00:16:52.613 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:52.613 "assigned_rate_limits": { 00:16:52.613 "rw_ios_per_sec": 0, 00:16:52.613 "rw_mbytes_per_sec": 0, 00:16:52.613 "r_mbytes_per_sec": 0, 00:16:52.613 "w_mbytes_per_sec": 0 00:16:52.613 }, 00:16:52.613 "claimed": true, 00:16:52.613 "claim_type": "exclusive_write", 00:16:52.613 "zoned": false, 00:16:52.613 "supported_io_types": { 00:16:52.613 "read": true, 00:16:52.613 "write": true, 00:16:52.613 "unmap": true, 00:16:52.613 "flush": true, 00:16:52.613 "reset": true, 00:16:52.613 "nvme_admin": false, 00:16:52.613 "nvme_io": false, 00:16:52.613 "nvme_io_md": false, 00:16:52.613 "write_zeroes": true, 00:16:52.613 "zcopy": true, 00:16:52.613 "get_zone_info": false, 00:16:52.613 "zone_management": false, 00:16:52.613 "zone_append": false, 00:16:52.613 "compare": false, 00:16:52.613 "compare_and_write": false, 00:16:52.613 "abort": true, 00:16:52.613 "seek_hole": false, 00:16:52.613 "seek_data": false, 00:16:52.613 "copy": true, 00:16:52.613 "nvme_iov_md": false 00:16:52.613 }, 00:16:52.613 "memory_domains": [ 00:16:52.613 { 00:16:52.613 "dma_device_id": "system", 00:16:52.613 "dma_device_type": 1 00:16:52.613 }, 00:16:52.613 { 00:16:52.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.613 "dma_device_type": 2 00:16:52.613 } 00:16:52.613 ], 00:16:52.613 "driver_specific": {} 00:16:52.613 } 00:16:52.613 ] 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.613 06:33:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.873 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.873 "name": "Existed_Raid", 00:16:52.873 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:52.873 "strip_size_kb": 64, 00:16:52.873 "state": "online", 00:16:52.873 "raid_level": "raid0", 00:16:52.873 "superblock": true, 00:16:52.873 "num_base_bdevs": 3, 00:16:52.873 "num_base_bdevs_discovered": 3, 00:16:52.873 "num_base_bdevs_operational": 3, 00:16:52.873 "base_bdevs_list": [ 00:16:52.873 { 00:16:52.873 "name": "NewBaseBdev", 00:16:52.873 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:52.873 "is_configured": true, 00:16:52.873 "data_offset": 2048, 00:16:52.873 "data_size": 63488 00:16:52.873 }, 00:16:52.873 { 00:16:52.873 "name": "BaseBdev2", 00:16:52.873 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:52.873 "is_configured": true, 00:16:52.873 "data_offset": 2048, 00:16:52.873 "data_size": 63488 00:16:52.873 }, 00:16:52.873 { 00:16:52.873 "name": "BaseBdev3", 00:16:52.873 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:52.873 "is_configured": true, 00:16:52.873 "data_offset": 2048, 00:16:52.873 "data_size": 63488 00:16:52.873 } 00:16:52.873 ] 00:16:52.873 }' 00:16:52.873 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.873 06:33:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:53.441 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:53.442 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:53.442 [2024-07-25 06:33:06.977868] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.701 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:53.701 "name": "Existed_Raid", 00:16:53.701 "aliases": [ 00:16:53.701 "ab96080d-d48d-4ebb-a645-1c01182e6afc" 00:16:53.701 ], 00:16:53.701 "product_name": "Raid Volume", 00:16:53.701 "block_size": 512, 00:16:53.701 "num_blocks": 190464, 00:16:53.701 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:53.701 "assigned_rate_limits": { 00:16:53.701 "rw_ios_per_sec": 0, 00:16:53.701 "rw_mbytes_per_sec": 0, 00:16:53.701 "r_mbytes_per_sec": 0, 00:16:53.701 "w_mbytes_per_sec": 0 00:16:53.701 }, 00:16:53.701 "claimed": false, 00:16:53.701 "zoned": false, 00:16:53.701 "supported_io_types": { 00:16:53.701 "read": true, 00:16:53.701 "write": true, 00:16:53.701 "unmap": true, 00:16:53.701 "flush": true, 00:16:53.701 "reset": true, 00:16:53.701 "nvme_admin": false, 00:16:53.701 "nvme_io": false, 00:16:53.701 "nvme_io_md": 
false, 00:16:53.701 "write_zeroes": true, 00:16:53.701 "zcopy": false, 00:16:53.701 "get_zone_info": false, 00:16:53.701 "zone_management": false, 00:16:53.701 "zone_append": false, 00:16:53.701 "compare": false, 00:16:53.701 "compare_and_write": false, 00:16:53.701 "abort": false, 00:16:53.701 "seek_hole": false, 00:16:53.701 "seek_data": false, 00:16:53.701 "copy": false, 00:16:53.701 "nvme_iov_md": false 00:16:53.701 }, 00:16:53.701 "memory_domains": [ 00:16:53.701 { 00:16:53.701 "dma_device_id": "system", 00:16:53.701 "dma_device_type": 1 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.701 "dma_device_type": 2 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "dma_device_id": "system", 00:16:53.701 "dma_device_type": 1 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.701 "dma_device_type": 2 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "dma_device_id": "system", 00:16:53.701 "dma_device_type": 1 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.701 "dma_device_type": 2 00:16:53.701 } 00:16:53.701 ], 00:16:53.701 "driver_specific": { 00:16:53.701 "raid": { 00:16:53.701 "uuid": "ab96080d-d48d-4ebb-a645-1c01182e6afc", 00:16:53.701 "strip_size_kb": 64, 00:16:53.701 "state": "online", 00:16:53.701 "raid_level": "raid0", 00:16:53.701 "superblock": true, 00:16:53.701 "num_base_bdevs": 3, 00:16:53.701 "num_base_bdevs_discovered": 3, 00:16:53.701 "num_base_bdevs_operational": 3, 00:16:53.701 "base_bdevs_list": [ 00:16:53.701 { 00:16:53.701 "name": "NewBaseBdev", 00:16:53.701 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:53.701 "is_configured": true, 00:16:53.701 "data_offset": 2048, 00:16:53.701 "data_size": 63488 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "name": "BaseBdev2", 00:16:53.701 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:53.701 "is_configured": true, 00:16:53.701 "data_offset": 2048, 00:16:53.701 "data_size": 63488 00:16:53.701 }, 00:16:53.701 { 00:16:53.701 "name": "BaseBdev3", 00:16:53.701 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:53.701 "is_configured": true, 00:16:53.701 "data_offset": 2048, 00:16:53.701 "data_size": 63488 00:16:53.701 } 00:16:53.701 ] 00:16:53.701 } 00:16:53.701 } 00:16:53.701 }' 00:16:53.701 06:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:53.701 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:53.701 BaseBdev2 00:16:53.701 BaseBdev3' 00:16:53.701 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.701 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:53.701 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.960 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.960 "name": "NewBaseBdev", 00:16:53.960 "aliases": [ 00:16:53.960 "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e" 00:16:53.960 ], 00:16:53.960 "product_name": "Malloc disk", 00:16:53.960 "block_size": 512, 00:16:53.960 "num_blocks": 65536, 00:16:53.960 "uuid": "1b64ec05-f2b3-4fb3-b579-e5750c3aaa0e", 00:16:53.960 "assigned_rate_limits": { 00:16:53.960 
"rw_ios_per_sec": 0, 00:16:53.960 "rw_mbytes_per_sec": 0, 00:16:53.960 "r_mbytes_per_sec": 0, 00:16:53.960 "w_mbytes_per_sec": 0 00:16:53.960 }, 00:16:53.960 "claimed": true, 00:16:53.960 "claim_type": "exclusive_write", 00:16:53.960 "zoned": false, 00:16:53.960 "supported_io_types": { 00:16:53.960 "read": true, 00:16:53.960 "write": true, 00:16:53.960 "unmap": true, 00:16:53.961 "flush": true, 00:16:53.961 "reset": true, 00:16:53.961 "nvme_admin": false, 00:16:53.961 "nvme_io": false, 00:16:53.961 "nvme_io_md": false, 00:16:53.961 "write_zeroes": true, 00:16:53.961 "zcopy": true, 00:16:53.961 "get_zone_info": false, 00:16:53.961 "zone_management": false, 00:16:53.961 "zone_append": false, 00:16:53.961 "compare": false, 00:16:53.961 "compare_and_write": false, 00:16:53.961 "abort": true, 00:16:53.961 "seek_hole": false, 00:16:53.961 "seek_data": false, 00:16:53.961 "copy": true, 00:16:53.961 "nvme_iov_md": false 00:16:53.961 }, 00:16:53.961 "memory_domains": [ 00:16:53.961 { 00:16:53.961 "dma_device_id": "system", 00:16:53.961 "dma_device_type": 1 00:16:53.961 }, 00:16:53.961 { 00:16:53.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.961 "dma_device_type": 2 00:16:53.961 } 00:16:53.961 ], 00:16:53.961 "driver_specific": {} 00:16:53.961 }' 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.961 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:54.220 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.479 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.479 "name": "BaseBdev2", 00:16:54.479 "aliases": [ 00:16:54.479 "de4cfd22-ca60-4eea-a638-4a8c229c283b" 00:16:54.479 ], 00:16:54.479 "product_name": "Malloc disk", 00:16:54.479 "block_size": 512, 00:16:54.479 "num_blocks": 65536, 00:16:54.479 "uuid": "de4cfd22-ca60-4eea-a638-4a8c229c283b", 00:16:54.479 "assigned_rate_limits": { 00:16:54.479 "rw_ios_per_sec": 0, 00:16:54.479 "rw_mbytes_per_sec": 0, 00:16:54.479 "r_mbytes_per_sec": 0, 00:16:54.479 "w_mbytes_per_sec": 0 
00:16:54.479 }, 00:16:54.479 "claimed": true, 00:16:54.479 "claim_type": "exclusive_write", 00:16:54.479 "zoned": false, 00:16:54.479 "supported_io_types": { 00:16:54.479 "read": true, 00:16:54.479 "write": true, 00:16:54.479 "unmap": true, 00:16:54.479 "flush": true, 00:16:54.479 "reset": true, 00:16:54.479 "nvme_admin": false, 00:16:54.479 "nvme_io": false, 00:16:54.479 "nvme_io_md": false, 00:16:54.479 "write_zeroes": true, 00:16:54.479 "zcopy": true, 00:16:54.479 "get_zone_info": false, 00:16:54.479 "zone_management": false, 00:16:54.479 "zone_append": false, 00:16:54.479 "compare": false, 00:16:54.479 "compare_and_write": false, 00:16:54.479 "abort": true, 00:16:54.479 "seek_hole": false, 00:16:54.479 "seek_data": false, 00:16:54.479 "copy": true, 00:16:54.479 "nvme_iov_md": false 00:16:54.479 }, 00:16:54.479 "memory_domains": [ 00:16:54.479 { 00:16:54.479 "dma_device_id": "system", 00:16:54.479 "dma_device_type": 1 00:16:54.479 }, 00:16:54.479 { 00:16:54.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.479 "dma_device_type": 2 00:16:54.479 } 00:16:54.479 ], 00:16:54.479 "driver_specific": {} 00:16:54.479 }' 00:16:54.479 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.479 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.479 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.479 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.479 06:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.479 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.479 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.738 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.738 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.738 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.738 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.738 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.739 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.739 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:54.739 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.998 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.998 "name": "BaseBdev3", 00:16:54.998 "aliases": [ 00:16:54.998 "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2" 00:16:54.998 ], 00:16:54.998 "product_name": "Malloc disk", 00:16:54.998 "block_size": 512, 00:16:54.998 "num_blocks": 65536, 00:16:54.998 "uuid": "72b4dab5-fbb2-4b2d-b040-82364d9b3ee2", 00:16:54.998 "assigned_rate_limits": { 00:16:54.998 "rw_ios_per_sec": 0, 00:16:54.998 "rw_mbytes_per_sec": 0, 00:16:54.998 "r_mbytes_per_sec": 0, 00:16:54.998 "w_mbytes_per_sec": 0 00:16:54.998 }, 00:16:54.998 "claimed": true, 00:16:54.998 "claim_type": "exclusive_write", 00:16:54.998 "zoned": false, 00:16:54.998 
"supported_io_types": { 00:16:54.998 "read": true, 00:16:54.998 "write": true, 00:16:54.998 "unmap": true, 00:16:54.998 "flush": true, 00:16:54.998 "reset": true, 00:16:54.998 "nvme_admin": false, 00:16:54.998 "nvme_io": false, 00:16:54.998 "nvme_io_md": false, 00:16:54.998 "write_zeroes": true, 00:16:54.998 "zcopy": true, 00:16:54.998 "get_zone_info": false, 00:16:54.998 "zone_management": false, 00:16:54.998 "zone_append": false, 00:16:54.998 "compare": false, 00:16:54.998 "compare_and_write": false, 00:16:54.998 "abort": true, 00:16:54.998 "seek_hole": false, 00:16:54.998 "seek_data": false, 00:16:54.998 "copy": true, 00:16:54.998 "nvme_iov_md": false 00:16:54.998 }, 00:16:54.998 "memory_domains": [ 00:16:54.998 { 00:16:54.998 "dma_device_id": "system", 00:16:54.998 "dma_device_type": 1 00:16:54.998 }, 00:16:54.998 { 00:16:54.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.998 "dma_device_type": 2 00:16:54.998 } 00:16:54.998 ], 00:16:54.998 "driver_specific": {} 00:16:54.998 }' 00:16:54.998 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.998 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.998 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.998 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.257 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.517 [2024-07-25 06:33:08.970863] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.517 [2024-07-25 06:33:08.970888] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.517 [2024-07-25 06:33:08.970943] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.517 [2024-07-25 06:33:08.970991] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.517 [2024-07-25 06:33:08.971001] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250a310 name Existed_Raid, state offline 00:16:55.517 06:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1124316 00:16:55.517 06:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1124316 ']' 00:16:55.517 06:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1124316 00:16:55.517 06:33:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:16:55.517 06:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:55.517 06:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1124316 00:16:55.517 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:55.517 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:55.517 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1124316' 00:16:55.517 killing process with pid 1124316 00:16:55.517 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1124316 00:16:55.517 [2024-07-25 06:33:09.043534] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:55.517 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1124316 00:16:55.517 [2024-07-25 06:33:09.067065] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.776 06:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.776 00:16:55.776 real 0m26.595s 00:16:55.776 user 0m48.806s 00:16:55.776 sys 0m4.813s 00:16:55.776 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:55.776 06:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.776 ************************************ 00:16:55.776 END TEST raid_state_function_test_sb 00:16:55.776 ************************************ 00:16:55.776 06:33:09 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:16:55.776 06:33:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:16:55.776 06:33:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:55.776 06:33:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:56.036 ************************************ 00:16:56.036 START TEST raid_superblock_test 00:16:56.036 ************************************ 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 3 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local 
strip_size_create_arg 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1129951 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1129951 /var/tmp/spdk-raid.sock 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1129951 ']' 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:56.036 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:56.036 06:33:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.036 [2024-07-25 06:33:09.396353] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:16:56.036 [2024-07-25 06:33:09.396413] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1129951 ] 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.0 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.1 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.2 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.3 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.4 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.5 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.6 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:01.7 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:02.0 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:02.1 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:02.2 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.036 EAL: Requested device 0000:3d:02.3 cannot be used 00:16:56.036 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3d:02.4 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3d:02.5 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3d:02.6 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3d:02.7 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.0 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.1 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.2 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.3 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.4 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.5 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.6 cannot be used 00:16:56.037 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:01.7 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.0 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.1 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.2 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.3 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.4 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.5 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.6 cannot be used 00:16:56.037 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:16:56.037 EAL: Requested device 0000:3f:02.7 cannot be used 00:16:56.037 [2024-07-25 06:33:09.535025] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:56.037 [2024-07-25 06:33:09.578017] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:56.296 [2024-07-25 06:33:09.638481] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.296 [2024-07-25 06:33:09.638507] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.864 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:57.123 malloc1 00:16:57.123 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:57.383 [2024-07-25 06:33:10.740417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:57.383 [2024-07-25 06:33:10.740464] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.383 [2024-07-25 06:33:10.740482] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230ad70 00:16:57.383 [2024-07-25 06:33:10.740493] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.383 [2024-07-25 06:33:10.741894] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.383 [2024-07-25 06:33:10.741920] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:57.383 pt1 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.383 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:57.642 malloc2 00:16:57.642 06:33:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:57.642 [2024-07-25 06:33:11.197823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:57.642 [2024-07-25 06:33:11.197861] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.642 [2024-07-25 06:33:11.197876] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2159790 00:16:57.642 [2024-07-25 06:33:11.197887] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.901 [2024-07-25 06:33:11.199113] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.901 [2024-07-25 06:33:11.199144] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:57.901 pt2 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:57.901 malloc3 00:16:57.901 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:58.160 [2024-07-25 06:33:11.655319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:58.160 [2024-07-25 06:33:11.655363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.160 [2024-07-25 06:33:11.655382] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22fe8c0 00:16:58.160 [2024-07-25 06:33:11.655394] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.160 [2024-07-25 06:33:11.656681] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.160 [2024-07-25 06:33:11.656708] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:58.160 pt3 00:16:58.160 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:58.160 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:58.160 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:58.419 [2024-07-25 06:33:11.883930] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.419 [2024-07-25 06:33:11.885015] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:58.419 [2024-07-25 06:33:11.885065] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:58.419 [2024-07-25 06:33:11.885217] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23000e0 00:16:58.419 [2024-07-25 06:33:11.885228] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:58.419 [2024-07-25 06:33:11.885404] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2150bd0 00:16:58.419 [2024-07-25 06:33:11.885529] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23000e0 00:16:58.419 [2024-07-25 06:33:11.885538] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23000e0 00:16:58.419 [2024-07-25 06:33:11.885620] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.419 06:33:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.420 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.420 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.420 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.420 06:33:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.679 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.679 "name": "raid_bdev1", 00:16:58.679 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:16:58.679 "strip_size_kb": 64, 00:16:58.679 "state": "online", 00:16:58.679 "raid_level": "raid0", 00:16:58.679 "superblock": true, 00:16:58.679 "num_base_bdevs": 3, 00:16:58.679 "num_base_bdevs_discovered": 3, 00:16:58.679 "num_base_bdevs_operational": 3, 00:16:58.679 "base_bdevs_list": [ 00:16:58.679 { 00:16:58.679 "name": "pt1", 00:16:58.679 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.679 "is_configured": true, 00:16:58.679 "data_offset": 2048, 00:16:58.679 "data_size": 63488 00:16:58.679 }, 00:16:58.679 { 00:16:58.679 "name": "pt2", 00:16:58.679 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.679 "is_configured": true, 00:16:58.679 "data_offset": 2048, 00:16:58.679 "data_size": 63488 00:16:58.679 }, 00:16:58.679 { 00:16:58.679 "name": "pt3", 00:16:58.679 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.679 "is_configured": true, 00:16:58.679 "data_offset": 2048, 00:16:58.679 "data_size": 63488 00:16:58.679 } 00:16:58.679 ] 00:16:58.679 }' 00:16:58.679 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.679 06:33:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:59.247 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:59.505 [2024-07-25 06:33:12.922899] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:59.505 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:59.505 "name": "raid_bdev1", 00:16:59.505 "aliases": [ 00:16:59.505 "43df3fca-a911-4495-a7f8-d9d4277653c6" 00:16:59.505 ], 00:16:59.505 "product_name": "Raid Volume", 00:16:59.505 "block_size": 512, 00:16:59.505 "num_blocks": 190464, 00:16:59.505 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:16:59.505 "assigned_rate_limits": { 00:16:59.505 "rw_ios_per_sec": 0, 00:16:59.505 "rw_mbytes_per_sec": 0, 00:16:59.505 
"r_mbytes_per_sec": 0, 00:16:59.505 "w_mbytes_per_sec": 0 00:16:59.505 }, 00:16:59.505 "claimed": false, 00:16:59.505 "zoned": false, 00:16:59.505 "supported_io_types": { 00:16:59.505 "read": true, 00:16:59.505 "write": true, 00:16:59.505 "unmap": true, 00:16:59.505 "flush": true, 00:16:59.505 "reset": true, 00:16:59.505 "nvme_admin": false, 00:16:59.505 "nvme_io": false, 00:16:59.505 "nvme_io_md": false, 00:16:59.505 "write_zeroes": true, 00:16:59.505 "zcopy": false, 00:16:59.505 "get_zone_info": false, 00:16:59.505 "zone_management": false, 00:16:59.505 "zone_append": false, 00:16:59.505 "compare": false, 00:16:59.505 "compare_and_write": false, 00:16:59.505 "abort": false, 00:16:59.505 "seek_hole": false, 00:16:59.505 "seek_data": false, 00:16:59.505 "copy": false, 00:16:59.505 "nvme_iov_md": false 00:16:59.505 }, 00:16:59.505 "memory_domains": [ 00:16:59.505 { 00:16:59.505 "dma_device_id": "system", 00:16:59.505 "dma_device_type": 1 00:16:59.505 }, 00:16:59.505 { 00:16:59.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.505 "dma_device_type": 2 00:16:59.506 }, 00:16:59.506 { 00:16:59.506 "dma_device_id": "system", 00:16:59.506 "dma_device_type": 1 00:16:59.506 }, 00:16:59.506 { 00:16:59.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.506 "dma_device_type": 2 00:16:59.506 }, 00:16:59.506 { 00:16:59.506 "dma_device_id": "system", 00:16:59.506 "dma_device_type": 1 00:16:59.506 }, 00:16:59.506 { 00:16:59.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.506 "dma_device_type": 2 00:16:59.506 } 00:16:59.506 ], 00:16:59.506 "driver_specific": { 00:16:59.506 "raid": { 00:16:59.506 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:16:59.506 "strip_size_kb": 64, 00:16:59.506 "state": "online", 00:16:59.506 "raid_level": "raid0", 00:16:59.506 "superblock": true, 00:16:59.506 "num_base_bdevs": 3, 00:16:59.506 "num_base_bdevs_discovered": 3, 00:16:59.506 "num_base_bdevs_operational": 3, 00:16:59.506 "base_bdevs_list": [ 00:16:59.506 { 00:16:59.506 "name": "pt1", 00:16:59.506 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.506 "is_configured": true, 00:16:59.506 "data_offset": 2048, 00:16:59.506 "data_size": 63488 00:16:59.506 }, 00:16:59.506 { 00:16:59.506 "name": "pt2", 00:16:59.506 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.506 "is_configured": true, 00:16:59.506 "data_offset": 2048, 00:16:59.506 "data_size": 63488 00:16:59.506 }, 00:16:59.506 { 00:16:59.506 "name": "pt3", 00:16:59.506 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.506 "is_configured": true, 00:16:59.506 "data_offset": 2048, 00:16:59.506 "data_size": 63488 00:16:59.506 } 00:16:59.506 ] 00:16:59.506 } 00:16:59.506 } 00:16:59.506 }' 00:16:59.506 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:59.506 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:59.506 pt2 00:16:59.506 pt3' 00:16:59.506 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.506 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:59.506 06:33:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:59.764 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:59.764 "name": "pt1", 00:16:59.764 "aliases": [ 
00:16:59.764 "00000000-0000-0000-0000-000000000001" 00:16:59.764 ], 00:16:59.764 "product_name": "passthru", 00:16:59.764 "block_size": 512, 00:16:59.764 "num_blocks": 65536, 00:16:59.764 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.764 "assigned_rate_limits": { 00:16:59.764 "rw_ios_per_sec": 0, 00:16:59.764 "rw_mbytes_per_sec": 0, 00:16:59.764 "r_mbytes_per_sec": 0, 00:16:59.764 "w_mbytes_per_sec": 0 00:16:59.764 }, 00:16:59.764 "claimed": true, 00:16:59.764 "claim_type": "exclusive_write", 00:16:59.764 "zoned": false, 00:16:59.764 "supported_io_types": { 00:16:59.764 "read": true, 00:16:59.764 "write": true, 00:16:59.764 "unmap": true, 00:16:59.764 "flush": true, 00:16:59.764 "reset": true, 00:16:59.764 "nvme_admin": false, 00:16:59.764 "nvme_io": false, 00:16:59.764 "nvme_io_md": false, 00:16:59.764 "write_zeroes": true, 00:16:59.764 "zcopy": true, 00:16:59.764 "get_zone_info": false, 00:16:59.764 "zone_management": false, 00:16:59.764 "zone_append": false, 00:16:59.764 "compare": false, 00:16:59.764 "compare_and_write": false, 00:16:59.764 "abort": true, 00:16:59.764 "seek_hole": false, 00:16:59.764 "seek_data": false, 00:16:59.764 "copy": true, 00:16:59.764 "nvme_iov_md": false 00:16:59.764 }, 00:16:59.764 "memory_domains": [ 00:16:59.764 { 00:16:59.764 "dma_device_id": "system", 00:16:59.764 "dma_device_type": 1 00:16:59.764 }, 00:16:59.764 { 00:16:59.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.764 "dma_device_type": 2 00:16:59.764 } 00:16:59.764 ], 00:16:59.764 "driver_specific": { 00:16:59.764 "passthru": { 00:16:59.764 "name": "pt1", 00:16:59.764 "base_bdev_name": "malloc1" 00:16:59.764 } 00:16:59.764 } 00:16:59.764 }' 00:16:59.764 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.764 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.764 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.764 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:00.023 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.282 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.282 "name": "pt2", 00:17:00.282 "aliases": [ 00:17:00.282 "00000000-0000-0000-0000-000000000002" 00:17:00.282 ], 00:17:00.282 "product_name": "passthru", 00:17:00.282 "block_size": 
512, 00:17:00.282 "num_blocks": 65536, 00:17:00.282 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.282 "assigned_rate_limits": { 00:17:00.282 "rw_ios_per_sec": 0, 00:17:00.282 "rw_mbytes_per_sec": 0, 00:17:00.282 "r_mbytes_per_sec": 0, 00:17:00.282 "w_mbytes_per_sec": 0 00:17:00.282 }, 00:17:00.282 "claimed": true, 00:17:00.282 "claim_type": "exclusive_write", 00:17:00.282 "zoned": false, 00:17:00.282 "supported_io_types": { 00:17:00.282 "read": true, 00:17:00.282 "write": true, 00:17:00.282 "unmap": true, 00:17:00.282 "flush": true, 00:17:00.282 "reset": true, 00:17:00.282 "nvme_admin": false, 00:17:00.282 "nvme_io": false, 00:17:00.282 "nvme_io_md": false, 00:17:00.282 "write_zeroes": true, 00:17:00.282 "zcopy": true, 00:17:00.282 "get_zone_info": false, 00:17:00.282 "zone_management": false, 00:17:00.282 "zone_append": false, 00:17:00.282 "compare": false, 00:17:00.282 "compare_and_write": false, 00:17:00.282 "abort": true, 00:17:00.282 "seek_hole": false, 00:17:00.282 "seek_data": false, 00:17:00.282 "copy": true, 00:17:00.282 "nvme_iov_md": false 00:17:00.282 }, 00:17:00.282 "memory_domains": [ 00:17:00.282 { 00:17:00.282 "dma_device_id": "system", 00:17:00.282 "dma_device_type": 1 00:17:00.282 }, 00:17:00.282 { 00:17:00.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.282 "dma_device_type": 2 00:17:00.282 } 00:17:00.282 ], 00:17:00.282 "driver_specific": { 00:17:00.282 "passthru": { 00:17:00.282 "name": "pt2", 00:17:00.282 "base_bdev_name": "malloc2" 00:17:00.282 } 00:17:00.282 } 00:17:00.282 }' 00:17:00.282 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.282 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.540 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.540 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.540 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.540 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.540 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.541 06:33:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.541 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.541 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.541 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.799 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.799 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.799 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:00.799 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.799 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.799 "name": "pt3", 00:17:00.799 "aliases": [ 00:17:00.799 "00000000-0000-0000-0000-000000000003" 00:17:00.799 ], 00:17:00.799 "product_name": "passthru", 00:17:00.799 "block_size": 512, 00:17:00.799 "num_blocks": 65536, 00:17:00.799 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.799 
"assigned_rate_limits": { 00:17:00.799 "rw_ios_per_sec": 0, 00:17:00.799 "rw_mbytes_per_sec": 0, 00:17:00.799 "r_mbytes_per_sec": 0, 00:17:00.799 "w_mbytes_per_sec": 0 00:17:00.799 }, 00:17:00.799 "claimed": true, 00:17:00.799 "claim_type": "exclusive_write", 00:17:00.799 "zoned": false, 00:17:00.799 "supported_io_types": { 00:17:00.799 "read": true, 00:17:00.799 "write": true, 00:17:00.799 "unmap": true, 00:17:00.799 "flush": true, 00:17:00.799 "reset": true, 00:17:00.799 "nvme_admin": false, 00:17:00.799 "nvme_io": false, 00:17:00.799 "nvme_io_md": false, 00:17:00.799 "write_zeroes": true, 00:17:00.799 "zcopy": true, 00:17:00.800 "get_zone_info": false, 00:17:00.800 "zone_management": false, 00:17:00.800 "zone_append": false, 00:17:00.800 "compare": false, 00:17:00.800 "compare_and_write": false, 00:17:00.800 "abort": true, 00:17:00.800 "seek_hole": false, 00:17:00.800 "seek_data": false, 00:17:00.800 "copy": true, 00:17:00.800 "nvme_iov_md": false 00:17:00.800 }, 00:17:00.800 "memory_domains": [ 00:17:00.800 { 00:17:00.800 "dma_device_id": "system", 00:17:00.800 "dma_device_type": 1 00:17:00.800 }, 00:17:00.800 { 00:17:00.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.800 "dma_device_type": 2 00:17:00.800 } 00:17:00.800 ], 00:17:00.800 "driver_specific": { 00:17:00.800 "passthru": { 00:17:00.800 "name": "pt3", 00:17:00.800 "base_bdev_name": "malloc3" 00:17:00.800 } 00:17:00.800 } 00:17:00.800 }' 00:17:00.800 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.059 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.318 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.318 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.318 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.318 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:17:01.577 [2024-07-25 06:33:14.912178] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.577 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=43df3fca-a911-4495-a7f8-d9d4277653c6 00:17:01.577 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 43df3fca-a911-4495-a7f8-d9d4277653c6 ']' 00:17:01.577 06:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:01.836 [2024-07-25 06:33:15.140501] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:01.836 [2024-07-25 06:33:15.140522] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:01.836 [2024-07-25 06:33:15.140571] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:01.836 [2024-07-25 06:33:15.140625] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:01.836 [2024-07-25 06:33:15.140636] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23000e0 name raid_bdev1, state offline 00:17:01.836 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.836 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:17:02.096 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:17:02.096 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:17:02.096 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.096 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:02.096 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.096 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:02.355 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.355 06:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:02.614 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:02.614 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
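The command being validated and then executed at this point is the negative case of the superblock logic: raid_bdev1 and its passthru bdevs were just deleted, but the underlying malloc bdevs still carry raid_bdev1's superblock, so creating a fresh raid0 volume directly on them must be rejected. A condensed sketch (reusing the $rpc shorthand from the sketch above; not part of the captured output):

  # NOT (from autotest_common.sh) inverts the exit status: this test step succeeds only if the RPC fails
  NOT $rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1
  # expected result, as recorded in the trace that follows: JSON-RPC error -17,
  # "Failed to create RAID bdev raid_bdev1: File exists"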
00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:02.874 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:03.138 [2024-07-25 06:33:16.516076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:03.138 [2024-07-25 06:33:16.517330] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:03.139 [2024-07-25 06:33:16.517372] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:03.139 [2024-07-25 06:33:16.517413] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:03.139 [2024-07-25 06:33:16.517451] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:03.139 [2024-07-25 06:33:16.517472] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:03.139 [2024-07-25 06:33:16.517489] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:03.139 [2024-07-25 06:33:16.517498] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x214fdc0 name raid_bdev1, state configuring 00:17:03.139 request: 00:17:03.139 { 00:17:03.139 "name": "raid_bdev1", 00:17:03.139 "raid_level": "raid0", 00:17:03.139 "base_bdevs": [ 00:17:03.139 "malloc1", 00:17:03.139 "malloc2", 00:17:03.139 "malloc3" 00:17:03.139 ], 00:17:03.139 "strip_size_kb": 64, 00:17:03.139 "superblock": false, 00:17:03.139 "method": "bdev_raid_create", 00:17:03.139 "req_id": 1 00:17:03.139 } 00:17:03.139 Got JSON-RPC error response 00:17:03.139 response: 00:17:03.139 { 00:17:03.139 "code": -17, 00:17:03.139 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:03.139 } 00:17:03.139 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:17:03.139 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:17:03.139 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:17:03.139 06:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:17:03.139 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.139 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:03.435 [2024-07-25 06:33:16.921074] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:03.435 [2024-07-25 06:33:16.921122] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:03.435 [2024-07-25 06:33:16.921148] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22fbf60 00:17:03.435 [2024-07-25 06:33:16.921162] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:03.435 [2024-07-25 06:33:16.922647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:03.435 [2024-07-25 06:33:16.922675] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:03.435 [2024-07-25 06:33:16.922746] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:03.435 [2024-07-25 06:33:16.922768] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:03.435 pt1 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:03.435 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.436 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.436 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.436 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.436 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.436 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.436 06:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:03.694 06:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.694 "name": "raid_bdev1", 00:17:03.694 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:17:03.694 "strip_size_kb": 64, 00:17:03.694 "state": "configuring", 00:17:03.694 "raid_level": "raid0", 00:17:03.694 "superblock": true, 00:17:03.694 "num_base_bdevs": 3, 00:17:03.694 "num_base_bdevs_discovered": 1, 00:17:03.694 "num_base_bdevs_operational": 3, 00:17:03.694 "base_bdevs_list": [ 00:17:03.694 { 00:17:03.694 "name": "pt1", 00:17:03.694 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:03.694 "is_configured": true, 00:17:03.694 "data_offset": 2048, 00:17:03.694 "data_size": 63488 00:17:03.694 }, 00:17:03.694 { 00:17:03.694 "name": null, 00:17:03.694 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:03.694 
"is_configured": false, 00:17:03.694 "data_offset": 2048, 00:17:03.694 "data_size": 63488 00:17:03.694 }, 00:17:03.694 { 00:17:03.694 "name": null, 00:17:03.694 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.694 "is_configured": false, 00:17:03.694 "data_offset": 2048, 00:17:03.694 "data_size": 63488 00:17:03.694 } 00:17:03.694 ] 00:17:03.694 }' 00:17:03.694 06:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.694 06:33:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.260 06:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:17:04.260 06:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:04.519 [2024-07-25 06:33:17.955817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:04.519 [2024-07-25 06:33:17.955868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:04.519 [2024-07-25 06:33:17.955887] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230a6b0 00:17:04.519 [2024-07-25 06:33:17.955898] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:04.519 [2024-07-25 06:33:17.956233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:04.519 [2024-07-25 06:33:17.956251] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:04.519 [2024-07-25 06:33:17.956312] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:04.519 [2024-07-25 06:33:17.956331] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:04.519 pt2 00:17:04.519 06:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:04.777 [2024-07-25 06:33:18.184439] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.777 06:33:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:05.035 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.035 "name": "raid_bdev1", 00:17:05.035 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:17:05.035 "strip_size_kb": 64, 00:17:05.035 "state": "configuring", 00:17:05.035 "raid_level": "raid0", 00:17:05.035 "superblock": true, 00:17:05.035 "num_base_bdevs": 3, 00:17:05.035 "num_base_bdevs_discovered": 1, 00:17:05.035 "num_base_bdevs_operational": 3, 00:17:05.035 "base_bdevs_list": [ 00:17:05.035 { 00:17:05.035 "name": "pt1", 00:17:05.035 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:05.035 "is_configured": true, 00:17:05.035 "data_offset": 2048, 00:17:05.035 "data_size": 63488 00:17:05.035 }, 00:17:05.035 { 00:17:05.035 "name": null, 00:17:05.036 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:05.036 "is_configured": false, 00:17:05.036 "data_offset": 2048, 00:17:05.036 "data_size": 63488 00:17:05.036 }, 00:17:05.036 { 00:17:05.036 "name": null, 00:17:05.036 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:05.036 "is_configured": false, 00:17:05.036 "data_offset": 2048, 00:17:05.036 "data_size": 63488 00:17:05.036 } 00:17:05.036 ] 00:17:05.036 }' 00:17:05.036 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.036 06:33:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.601 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:17:05.601 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:05.601 06:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:05.858 [2024-07-25 06:33:19.187057] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:05.858 [2024-07-25 06:33:19.187107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.858 [2024-07-25 06:33:19.187126] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22fd030 00:17:05.858 [2024-07-25 06:33:19.187146] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.858 [2024-07-25 06:33:19.187484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.858 [2024-07-25 06:33:19.187502] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:05.858 [2024-07-25 06:33:19.187559] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:05.858 [2024-07-25 06:33:19.187575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:05.858 pt2 00:17:05.858 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:05.858 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:05.858 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:06.117 [2024-07-25 06:33:19.415654] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:06.117 [2024-07-25 06:33:19.415686] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:17:06.117 [2024-07-25 06:33:19.415701] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x214f400 00:17:06.117 [2024-07-25 06:33:19.415712] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.117 [2024-07-25 06:33:19.415981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.117 [2024-07-25 06:33:19.415998] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:06.117 [2024-07-25 06:33:19.416049] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:06.117 [2024-07-25 06:33:19.416066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:06.117 [2024-07-25 06:33:19.416175] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x214f720 00:17:06.117 [2024-07-25 06:33:19.416185] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:06.117 [2024-07-25 06:33:19.416337] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2153680 00:17:06.117 [2024-07-25 06:33:19.416448] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x214f720 00:17:06.117 [2024-07-25 06:33:19.416457] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x214f720 00:17:06.117 [2024-07-25 06:33:19.416541] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.117 pt3 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.117 "name": "raid_bdev1", 00:17:06.117 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:17:06.117 "strip_size_kb": 64, 00:17:06.117 "state": "online", 00:17:06.117 "raid_level": "raid0", 00:17:06.117 "superblock": true, 00:17:06.117 "num_base_bdevs": 3, 00:17:06.117 
"num_base_bdevs_discovered": 3, 00:17:06.117 "num_base_bdevs_operational": 3, 00:17:06.117 "base_bdevs_list": [ 00:17:06.117 { 00:17:06.117 "name": "pt1", 00:17:06.117 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:06.117 "is_configured": true, 00:17:06.117 "data_offset": 2048, 00:17:06.117 "data_size": 63488 00:17:06.117 }, 00:17:06.117 { 00:17:06.117 "name": "pt2", 00:17:06.117 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.117 "is_configured": true, 00:17:06.117 "data_offset": 2048, 00:17:06.117 "data_size": 63488 00:17:06.117 }, 00:17:06.117 { 00:17:06.117 "name": "pt3", 00:17:06.117 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:06.117 "is_configured": true, 00:17:06.117 "data_offset": 2048, 00:17:06.117 "data_size": 63488 00:17:06.117 } 00:17:06.117 ] 00:17:06.117 }' 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.117 06:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:07.051 [2024-07-25 06:33:20.454729] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:07.051 "name": "raid_bdev1", 00:17:07.051 "aliases": [ 00:17:07.051 "43df3fca-a911-4495-a7f8-d9d4277653c6" 00:17:07.051 ], 00:17:07.051 "product_name": "Raid Volume", 00:17:07.051 "block_size": 512, 00:17:07.051 "num_blocks": 190464, 00:17:07.051 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:17:07.051 "assigned_rate_limits": { 00:17:07.051 "rw_ios_per_sec": 0, 00:17:07.051 "rw_mbytes_per_sec": 0, 00:17:07.051 "r_mbytes_per_sec": 0, 00:17:07.051 "w_mbytes_per_sec": 0 00:17:07.051 }, 00:17:07.051 "claimed": false, 00:17:07.051 "zoned": false, 00:17:07.051 "supported_io_types": { 00:17:07.051 "read": true, 00:17:07.051 "write": true, 00:17:07.051 "unmap": true, 00:17:07.051 "flush": true, 00:17:07.051 "reset": true, 00:17:07.051 "nvme_admin": false, 00:17:07.051 "nvme_io": false, 00:17:07.051 "nvme_io_md": false, 00:17:07.051 "write_zeroes": true, 00:17:07.051 "zcopy": false, 00:17:07.051 "get_zone_info": false, 00:17:07.051 "zone_management": false, 00:17:07.051 "zone_append": false, 00:17:07.051 "compare": false, 00:17:07.051 "compare_and_write": false, 00:17:07.051 "abort": false, 00:17:07.051 "seek_hole": false, 00:17:07.051 "seek_data": false, 00:17:07.051 "copy": false, 00:17:07.051 "nvme_iov_md": false 00:17:07.051 }, 00:17:07.051 "memory_domains": [ 00:17:07.051 { 00:17:07.051 "dma_device_id": "system", 00:17:07.051 "dma_device_type": 1 00:17:07.051 }, 
00:17:07.051 { 00:17:07.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.051 "dma_device_type": 2 00:17:07.051 }, 00:17:07.051 { 00:17:07.051 "dma_device_id": "system", 00:17:07.051 "dma_device_type": 1 00:17:07.051 }, 00:17:07.051 { 00:17:07.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.051 "dma_device_type": 2 00:17:07.051 }, 00:17:07.051 { 00:17:07.051 "dma_device_id": "system", 00:17:07.051 "dma_device_type": 1 00:17:07.051 }, 00:17:07.051 { 00:17:07.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.051 "dma_device_type": 2 00:17:07.051 } 00:17:07.051 ], 00:17:07.051 "driver_specific": { 00:17:07.051 "raid": { 00:17:07.051 "uuid": "43df3fca-a911-4495-a7f8-d9d4277653c6", 00:17:07.051 "strip_size_kb": 64, 00:17:07.051 "state": "online", 00:17:07.051 "raid_level": "raid0", 00:17:07.051 "superblock": true, 00:17:07.051 "num_base_bdevs": 3, 00:17:07.051 "num_base_bdevs_discovered": 3, 00:17:07.051 "num_base_bdevs_operational": 3, 00:17:07.051 "base_bdevs_list": [ 00:17:07.051 { 00:17:07.051 "name": "pt1", 00:17:07.051 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.051 "is_configured": true, 00:17:07.051 "data_offset": 2048, 00:17:07.051 "data_size": 63488 00:17:07.051 }, 00:17:07.051 { 00:17:07.051 "name": "pt2", 00:17:07.051 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.051 "is_configured": true, 00:17:07.051 "data_offset": 2048, 00:17:07.051 "data_size": 63488 00:17:07.051 }, 00:17:07.051 { 00:17:07.051 "name": "pt3", 00:17:07.051 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.051 "is_configured": true, 00:17:07.051 "data_offset": 2048, 00:17:07.051 "data_size": 63488 00:17:07.051 } 00:17:07.051 ] 00:17:07.051 } 00:17:07.051 } 00:17:07.051 }' 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:07.051 pt2 00:17:07.051 pt3' 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:07.051 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.309 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.309 "name": "pt1", 00:17:07.309 "aliases": [ 00:17:07.309 "00000000-0000-0000-0000-000000000001" 00:17:07.309 ], 00:17:07.309 "product_name": "passthru", 00:17:07.309 "block_size": 512, 00:17:07.309 "num_blocks": 65536, 00:17:07.309 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.309 "assigned_rate_limits": { 00:17:07.309 "rw_ios_per_sec": 0, 00:17:07.309 "rw_mbytes_per_sec": 0, 00:17:07.309 "r_mbytes_per_sec": 0, 00:17:07.309 "w_mbytes_per_sec": 0 00:17:07.309 }, 00:17:07.309 "claimed": true, 00:17:07.309 "claim_type": "exclusive_write", 00:17:07.309 "zoned": false, 00:17:07.309 "supported_io_types": { 00:17:07.309 "read": true, 00:17:07.309 "write": true, 00:17:07.309 "unmap": true, 00:17:07.309 "flush": true, 00:17:07.309 "reset": true, 00:17:07.309 "nvme_admin": false, 00:17:07.309 "nvme_io": false, 00:17:07.309 "nvme_io_md": false, 00:17:07.309 "write_zeroes": true, 00:17:07.309 "zcopy": true, 00:17:07.309 "get_zone_info": false, 00:17:07.309 "zone_management": false, 00:17:07.309 
"zone_append": false, 00:17:07.309 "compare": false, 00:17:07.309 "compare_and_write": false, 00:17:07.309 "abort": true, 00:17:07.309 "seek_hole": false, 00:17:07.309 "seek_data": false, 00:17:07.309 "copy": true, 00:17:07.309 "nvme_iov_md": false 00:17:07.309 }, 00:17:07.309 "memory_domains": [ 00:17:07.309 { 00:17:07.309 "dma_device_id": "system", 00:17:07.309 "dma_device_type": 1 00:17:07.309 }, 00:17:07.309 { 00:17:07.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.309 "dma_device_type": 2 00:17:07.309 } 00:17:07.309 ], 00:17:07.309 "driver_specific": { 00:17:07.309 "passthru": { 00:17:07.309 "name": "pt1", 00:17:07.310 "base_bdev_name": "malloc1" 00:17:07.310 } 00:17:07.310 } 00:17:07.310 }' 00:17:07.310 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.310 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.310 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.310 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.567 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.567 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.567 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.567 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.567 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.567 06:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.567 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.567 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.567 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.567 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:07.567 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.825 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.825 "name": "pt2", 00:17:07.825 "aliases": [ 00:17:07.825 "00000000-0000-0000-0000-000000000002" 00:17:07.825 ], 00:17:07.825 "product_name": "passthru", 00:17:07.825 "block_size": 512, 00:17:07.825 "num_blocks": 65536, 00:17:07.825 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.825 "assigned_rate_limits": { 00:17:07.825 "rw_ios_per_sec": 0, 00:17:07.825 "rw_mbytes_per_sec": 0, 00:17:07.825 "r_mbytes_per_sec": 0, 00:17:07.825 "w_mbytes_per_sec": 0 00:17:07.825 }, 00:17:07.825 "claimed": true, 00:17:07.825 "claim_type": "exclusive_write", 00:17:07.825 "zoned": false, 00:17:07.825 "supported_io_types": { 00:17:07.825 "read": true, 00:17:07.825 "write": true, 00:17:07.825 "unmap": true, 00:17:07.825 "flush": true, 00:17:07.825 "reset": true, 00:17:07.825 "nvme_admin": false, 00:17:07.825 "nvme_io": false, 00:17:07.825 "nvme_io_md": false, 00:17:07.825 "write_zeroes": true, 00:17:07.825 "zcopy": true, 00:17:07.825 "get_zone_info": false, 00:17:07.825 "zone_management": false, 00:17:07.825 "zone_append": false, 00:17:07.825 "compare": false, 00:17:07.825 "compare_and_write": false, 00:17:07.825 "abort": true, 00:17:07.825 
"seek_hole": false, 00:17:07.825 "seek_data": false, 00:17:07.825 "copy": true, 00:17:07.825 "nvme_iov_md": false 00:17:07.825 }, 00:17:07.825 "memory_domains": [ 00:17:07.825 { 00:17:07.825 "dma_device_id": "system", 00:17:07.825 "dma_device_type": 1 00:17:07.825 }, 00:17:07.825 { 00:17:07.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.825 "dma_device_type": 2 00:17:07.825 } 00:17:07.825 ], 00:17:07.825 "driver_specific": { 00:17:07.825 "passthru": { 00:17:07.825 "name": "pt2", 00:17:07.825 "base_bdev_name": "malloc2" 00:17:07.825 } 00:17:07.825 } 00:17:07.825 }' 00:17:07.825 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.825 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.083 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.341 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.341 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.341 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:08.341 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.341 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.341 "name": "pt3", 00:17:08.341 "aliases": [ 00:17:08.341 "00000000-0000-0000-0000-000000000003" 00:17:08.341 ], 00:17:08.341 "product_name": "passthru", 00:17:08.341 "block_size": 512, 00:17:08.341 "num_blocks": 65536, 00:17:08.341 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:08.341 "assigned_rate_limits": { 00:17:08.341 "rw_ios_per_sec": 0, 00:17:08.341 "rw_mbytes_per_sec": 0, 00:17:08.341 "r_mbytes_per_sec": 0, 00:17:08.341 "w_mbytes_per_sec": 0 00:17:08.341 }, 00:17:08.341 "claimed": true, 00:17:08.341 "claim_type": "exclusive_write", 00:17:08.341 "zoned": false, 00:17:08.341 "supported_io_types": { 00:17:08.341 "read": true, 00:17:08.341 "write": true, 00:17:08.341 "unmap": true, 00:17:08.341 "flush": true, 00:17:08.341 "reset": true, 00:17:08.341 "nvme_admin": false, 00:17:08.341 "nvme_io": false, 00:17:08.341 "nvme_io_md": false, 00:17:08.341 "write_zeroes": true, 00:17:08.341 "zcopy": true, 00:17:08.341 "get_zone_info": false, 00:17:08.341 "zone_management": false, 00:17:08.341 "zone_append": false, 00:17:08.341 "compare": false, 00:17:08.341 "compare_and_write": false, 00:17:08.341 "abort": true, 00:17:08.341 "seek_hole": false, 00:17:08.341 "seek_data": false, 00:17:08.341 "copy": true, 00:17:08.341 "nvme_iov_md": false 00:17:08.341 }, 
00:17:08.341 "memory_domains": [ 00:17:08.341 { 00:17:08.341 "dma_device_id": "system", 00:17:08.341 "dma_device_type": 1 00:17:08.341 }, 00:17:08.341 { 00:17:08.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.341 "dma_device_type": 2 00:17:08.341 } 00:17:08.341 ], 00:17:08.341 "driver_specific": { 00:17:08.341 "passthru": { 00:17:08.341 "name": "pt3", 00:17:08.341 "base_bdev_name": "malloc3" 00:17:08.341 } 00:17:08.341 } 00:17:08.341 }' 00:17:08.341 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.598 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.598 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.598 06:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.598 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.598 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.598 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.598 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.598 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.598 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.856 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.856 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.856 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:08.856 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:17:09.114 [2024-07-25 06:33:22.451972] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 43df3fca-a911-4495-a7f8-d9d4277653c6 '!=' 43df3fca-a911-4495-a7f8-d9d4277653c6 ']' 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1129951 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1129951 ']' 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1129951 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1129951 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1129951' 00:17:09.114 
killing process with pid 1129951 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1129951 00:17:09.114 [2024-07-25 06:33:22.531730] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:09.114 [2024-07-25 06:33:22.531788] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.114 [2024-07-25 06:33:22.531836] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.114 [2024-07-25 06:33:22.531846] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x214f720 name raid_bdev1, state offline 00:17:09.114 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1129951 00:17:09.114 [2024-07-25 06:33:22.555613] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:09.372 06:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:17:09.372 00:17:09.372 real 0m13.399s 00:17:09.372 user 0m24.062s 00:17:09.372 sys 0m2.507s 00:17:09.372 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:09.372 06:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.372 ************************************ 00:17:09.372 END TEST raid_superblock_test 00:17:09.372 ************************************ 00:17:09.372 06:33:22 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:17:09.372 06:33:22 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:09.372 06:33:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:09.372 06:33:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:09.372 ************************************ 00:17:09.372 START TEST raid_read_error_test 00:17:09.372 ************************************ 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 read 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:17:09.372 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.RS8ZzxfNJ3 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1132609 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1132609 /var/tmp/spdk-raid.sock 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1132609 ']' 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:09.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:09.373 06:33:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.373 [2024-07-25 06:33:22.898326] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
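(Note: bdevperf comes up first; the base bdev stack that raid_read_error_test then assembles over the RPC socket is easier to see collapsed into a short sketch. The commands below are the ones that appear in the trace that follows; the loop and the $rpc shorthand are illustrative shorthand, not the literal script text.)

  # Sketch of the stack built for raid_read_error_test, assuming the rpc.py path and socket from this run:
  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in 1 2 3; do
      $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc                  # 32 MiB malloc backing bdev, 512-byte blocks
      $rpc bdev_error_create BaseBdev${i}_malloc                             # error-injection wrapper (exposed as EE_BaseBdev${i}_malloc)
      $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}    # passthru bdev handed to the raid module
  done
  # Assemble raid0 over the three passthru bdevs with a 64 KiB strip size and an on-disk superblock (-s):
  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s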
00:17:09.373 [2024-07-25 06:33:22.898384] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132609 ] 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:09.631 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:09.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.631 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:09.632 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:09.632 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:09.632 [2024-07-25 06:33:23.035120] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:09.632 [2024-07-25 06:33:23.079954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:09.632 [2024-07-25 06:33:23.137747] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:09.632 [2024-07-25 06:33:23.137781] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.564 06:33:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:10.564 06:33:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:10.564 06:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:10.564 06:33:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:10.564 BaseBdev1_malloc 00:17:10.564 06:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:10.821 true 00:17:10.821 06:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:11.080 [2024-07-25 06:33:24.489602] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:11.080 [2024-07-25 06:33:24.489641] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.080 [2024-07-25 06:33:24.489659] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ecca60 00:17:11.080 [2024-07-25 06:33:24.489671] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.080 [2024-07-25 06:33:24.491115] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.080 [2024-07-25 06:33:24.491150] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:11.080 BaseBdev1 00:17:11.080 06:33:24 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:11.080 06:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:11.338 BaseBdev2_malloc 00:17:11.338 06:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:11.596 true 00:17:11.596 06:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:11.854 [2024-07-25 06:33:25.159673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:11.854 [2024-07-25 06:33:25.159711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.854 [2024-07-25 06:33:25.159731] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ed1dc0 00:17:11.854 [2024-07-25 06:33:25.159743] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.854 [2024-07-25 06:33:25.161058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.854 [2024-07-25 06:33:25.161084] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:11.854 BaseBdev2 00:17:11.854 06:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:11.854 06:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:11.854 BaseBdev3_malloc 00:17:11.854 06:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:12.112 true 00:17:12.113 06:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:12.370 [2024-07-25 06:33:25.825613] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:12.371 [2024-07-25 06:33:25.825651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.371 [2024-07-25 06:33:25.825669] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ed2420 00:17:12.371 [2024-07-25 06:33:25.825681] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.371 [2024-07-25 06:33:25.826988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.371 [2024-07-25 06:33:25.827016] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:12.371 BaseBdev3 00:17:12.371 06:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:12.629 [2024-07-25 06:33:26.050238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:12.629 [2024-07-25 06:33:26.051397] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:12.629 [2024-07-25 06:33:26.051462] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:12.629 [2024-07-25 06:33:26.051645] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ed51d0 00:17:12.629 [2024-07-25 06:33:26.051655] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:12.629 [2024-07-25 06:33:26.051829] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d283e0 00:17:12.629 [2024-07-25 06:33:26.051966] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ed51d0 00:17:12.629 [2024-07-25 06:33:26.051975] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ed51d0 00:17:12.629 [2024-07-25 06:33:26.052067] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.629 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:12.887 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.887 "name": "raid_bdev1", 00:17:12.887 "uuid": "19ffaa0b-a587-496f-bad0-5539ae047a03", 00:17:12.887 "strip_size_kb": 64, 00:17:12.887 "state": "online", 00:17:12.887 "raid_level": "raid0", 00:17:12.887 "superblock": true, 00:17:12.887 "num_base_bdevs": 3, 00:17:12.887 "num_base_bdevs_discovered": 3, 00:17:12.887 "num_base_bdevs_operational": 3, 00:17:12.887 "base_bdevs_list": [ 00:17:12.887 { 00:17:12.887 "name": "BaseBdev1", 00:17:12.887 "uuid": "eae10701-5fe0-57df-b37a-9383049d2d56", 00:17:12.887 "is_configured": true, 00:17:12.887 "data_offset": 2048, 00:17:12.887 "data_size": 63488 00:17:12.887 }, 00:17:12.887 { 00:17:12.887 "name": "BaseBdev2", 00:17:12.887 "uuid": "5f27ae46-6250-53c7-90f4-ceecb796bed6", 00:17:12.887 "is_configured": true, 00:17:12.887 "data_offset": 2048, 00:17:12.887 "data_size": 63488 00:17:12.887 }, 00:17:12.887 { 00:17:12.887 "name": "BaseBdev3", 00:17:12.887 "uuid": "b7b30926-1c84-577a-a120-91c855339726", 00:17:12.887 "is_configured": true, 00:17:12.887 "data_offset": 2048, 00:17:12.887 "data_size": 63488 
00:17:12.887 } 00:17:12.887 ] 00:17:12.887 }' 00:17:12.888 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.888 06:33:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.454 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:17:13.454 06:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:13.454 [2024-07-25 06:33:26.960845] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ed1650 00:17:14.387 06:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.646 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:14.904 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.904 "name": "raid_bdev1", 00:17:14.904 "uuid": "19ffaa0b-a587-496f-bad0-5539ae047a03", 00:17:14.904 "strip_size_kb": 64, 00:17:14.904 "state": "online", 00:17:14.904 "raid_level": "raid0", 00:17:14.904 "superblock": true, 00:17:14.904 "num_base_bdevs": 3, 00:17:14.904 "num_base_bdevs_discovered": 3, 00:17:14.904 "num_base_bdevs_operational": 3, 00:17:14.904 "base_bdevs_list": [ 00:17:14.904 { 00:17:14.904 "name": "BaseBdev1", 00:17:14.904 "uuid": "eae10701-5fe0-57df-b37a-9383049d2d56", 00:17:14.904 "is_configured": true, 00:17:14.904 "data_offset": 2048, 00:17:14.904 "data_size": 63488 00:17:14.904 }, 00:17:14.904 { 00:17:14.904 "name": "BaseBdev2", 00:17:14.904 "uuid": "5f27ae46-6250-53c7-90f4-ceecb796bed6", 00:17:14.904 "is_configured": true, 00:17:14.904 "data_offset": 2048, 
00:17:14.904 "data_size": 63488 00:17:14.904 }, 00:17:14.904 { 00:17:14.905 "name": "BaseBdev3", 00:17:14.905 "uuid": "b7b30926-1c84-577a-a120-91c855339726", 00:17:14.905 "is_configured": true, 00:17:14.905 "data_offset": 2048, 00:17:14.905 "data_size": 63488 00:17:14.905 } 00:17:14.905 ] 00:17:14.905 }' 00:17:14.905 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.905 06:33:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.471 06:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:15.729 [2024-07-25 06:33:29.116472] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:15.729 [2024-07-25 06:33:29.116503] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:15.729 [2024-07-25 06:33:29.119394] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:15.729 [2024-07-25 06:33:29.119424] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:15.729 [2024-07-25 06:33:29.119454] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:15.729 [2024-07-25 06:33:29.119464] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ed51d0 name raid_bdev1, state offline 00:17:15.729 0 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1132609 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1132609 ']' 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1132609 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1132609 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1132609' 00:17:15.729 killing process with pid 1132609 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1132609 00:17:15.729 [2024-07-25 06:33:29.193927] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:15.729 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1132609 00:17:15.729 [2024-07-25 06:33:29.212212] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:15.987 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.RS8ZzxfNJ3 00:17:15.987 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:17:15.987 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:17:15.987 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:17:15.987 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:17:15.987 06:33:29 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@213 -- # case $1 in 00:17:15.988 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:15.988 06:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:17:15.988 00:17:15.988 real 0m6.584s 00:17:15.988 user 0m10.414s 00:17:15.988 sys 0m1.138s 00:17:15.988 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:15.988 06:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.988 ************************************ 00:17:15.988 END TEST raid_read_error_test 00:17:15.988 ************************************ 00:17:15.988 06:33:29 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:17:15.988 06:33:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:15.988 06:33:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:15.988 06:33:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:15.988 ************************************ 00:17:15.988 START TEST raid_write_error_test 00:17:15.988 ************************************ 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 write 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:17:15.988 06:33:29 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.0119rf85n5 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1133776 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1133776 /var/tmp/spdk-raid.sock 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1133776 ']' 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:15.988 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:15.988 06:33:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.246 [2024-07-25 06:33:29.565603] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
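(Note: the write-error variant launched here mirrors the read case above: it builds the same three-way raid0 stack, then injects a write failure while bdevperf I/O is in flight and requires a non-zero failure rate in the bdevperf log. A minimal sketch of that injection-and-check step, using only the commands, log path, and result visible in this run; the variable names and the background '&' are inferred from the interleaved trace and are illustrative.)

  # While the bdevperf job is running randrw I/O against raid_bdev1, inject one write failure
  # into the error bdev that wraps BaseBdev1:
  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
  sleep 1
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure
  # raid0 has no redundancy, so the injected error must surface as a non-zero per-bdev failure rate
  # in the bdevperf log written to the mktemp file for this run:
  fail_per_s=$(grep -v Job /raidtest/tmp.0119rf85n5 | grep raid_bdev1 | awk '{print $6}')
  [[ $fail_per_s != "0.00" ]]    # this run reports 0.46 failures/s, so the error was observed as expected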
00:17:16.246 [2024-07-25 06:33:29.565659] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133776 ] 00:17:16.246 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.246 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:16.246 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.246 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:16.246 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.246 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:16.246 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.246 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:16.246 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.246 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:16.246 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:16.247 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:16.247 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:16.247 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:16.247 [2024-07-25 06:33:29.690503] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:16.247 [2024-07-25 06:33:29.735318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:16.247 [2024-07-25 06:33:29.793548] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:16.247 [2024-07-25 06:33:29.793585] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.238 06:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:17.238 06:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:17.238 06:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:17.238 06:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:17.238 BaseBdev1_malloc 00:17:17.238 06:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:17.495 true 00:17:17.495 06:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:17.752 [2024-07-25 06:33:31.128719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:17.752 [2024-07-25 06:33:31.128758] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:17.752 [2024-07-25 06:33:31.128777] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa40a60 00:17:17.752 [2024-07-25 06:33:31.128788] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:17.752 [2024-07-25 06:33:31.130251] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:17.752 [2024-07-25 06:33:31.130278] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:17.752 BaseBdev1 00:17:17.753 06:33:31 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:17.753 06:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:18.010 BaseBdev2_malloc 00:17:18.010 06:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:18.267 true 00:17:18.267 06:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:18.267 [2024-07-25 06:33:31.778582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:18.267 [2024-07-25 06:33:31.778620] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.267 [2024-07-25 06:33:31.778639] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa45dc0 00:17:18.267 [2024-07-25 06:33:31.778651] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.267 [2024-07-25 06:33:31.779961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.267 [2024-07-25 06:33:31.779987] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:18.267 BaseBdev2 00:17:18.267 06:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:18.267 06:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:18.524 BaseBdev3_malloc 00:17:18.524 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:18.781 true 00:17:18.781 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:19.038 [2024-07-25 06:33:32.468615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:19.038 [2024-07-25 06:33:32.468652] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.038 [2024-07-25 06:33:32.468669] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa46420 00:17:19.038 [2024-07-25 06:33:32.468681] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.038 [2024-07-25 06:33:32.470024] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.038 [2024-07-25 06:33:32.470050] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:19.038 BaseBdev3 00:17:19.038 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:19.296 [2024-07-25 06:33:32.697247] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:19.296 [2024-07-25 06:33:32.698409] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:19.296 [2024-07-25 06:33:32.698473] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:19.296 [2024-07-25 06:33:32.698656] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa491d0 00:17:19.296 [2024-07-25 06:33:32.698667] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:19.296 [2024-07-25 06:33:32.698847] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x89c3e0 00:17:19.296 [2024-07-25 06:33:32.698984] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa491d0 00:17:19.296 [2024-07-25 06:33:32.698993] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa491d0 00:17:19.296 [2024-07-25 06:33:32.699085] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.296 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:19.553 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.554 "name": "raid_bdev1", 00:17:19.554 "uuid": "d8d56f37-420c-4dc2-a972-252a70a9f33d", 00:17:19.554 "strip_size_kb": 64, 00:17:19.554 "state": "online", 00:17:19.554 "raid_level": "raid0", 00:17:19.554 "superblock": true, 00:17:19.554 "num_base_bdevs": 3, 00:17:19.554 "num_base_bdevs_discovered": 3, 00:17:19.554 "num_base_bdevs_operational": 3, 00:17:19.554 "base_bdevs_list": [ 00:17:19.554 { 00:17:19.554 "name": "BaseBdev1", 00:17:19.554 "uuid": "81570709-0b07-5140-8c07-d802d47cb955", 00:17:19.554 "is_configured": true, 00:17:19.554 "data_offset": 2048, 00:17:19.554 "data_size": 63488 00:17:19.554 }, 00:17:19.554 { 00:17:19.554 "name": "BaseBdev2", 00:17:19.554 "uuid": "47514c37-ad52-5a54-927b-c80cf88a6477", 00:17:19.554 "is_configured": true, 00:17:19.554 "data_offset": 2048, 00:17:19.554 "data_size": 63488 00:17:19.554 }, 00:17:19.554 { 00:17:19.554 "name": "BaseBdev3", 00:17:19.554 "uuid": "13a3c86e-5995-50f6-a9e5-79e41a809f8f", 00:17:19.554 "is_configured": true, 00:17:19.554 "data_offset": 2048, 00:17:19.554 "data_size": 
63488 00:17:19.554 } 00:17:19.554 ] 00:17:19.554 }' 00:17:19.554 06:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.554 06:33:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.117 06:33:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:20.117 06:33:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:17:20.117 [2024-07-25 06:33:33.611860] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa45650 00:17:21.049 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.306 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:21.563 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.563 "name": "raid_bdev1", 00:17:21.563 "uuid": "d8d56f37-420c-4dc2-a972-252a70a9f33d", 00:17:21.563 "strip_size_kb": 64, 00:17:21.563 "state": "online", 00:17:21.563 "raid_level": "raid0", 00:17:21.563 "superblock": true, 00:17:21.563 "num_base_bdevs": 3, 00:17:21.563 "num_base_bdevs_discovered": 3, 00:17:21.563 "num_base_bdevs_operational": 3, 00:17:21.563 "base_bdevs_list": [ 00:17:21.563 { 00:17:21.563 "name": "BaseBdev1", 00:17:21.563 "uuid": "81570709-0b07-5140-8c07-d802d47cb955", 00:17:21.563 "is_configured": true, 00:17:21.563 "data_offset": 2048, 00:17:21.563 "data_size": 63488 00:17:21.563 }, 00:17:21.563 { 00:17:21.563 "name": "BaseBdev2", 00:17:21.563 "uuid": "47514c37-ad52-5a54-927b-c80cf88a6477", 00:17:21.563 "is_configured": true, 00:17:21.563 
"data_offset": 2048, 00:17:21.563 "data_size": 63488 00:17:21.563 }, 00:17:21.563 { 00:17:21.563 "name": "BaseBdev3", 00:17:21.563 "uuid": "13a3c86e-5995-50f6-a9e5-79e41a809f8f", 00:17:21.563 "is_configured": true, 00:17:21.563 "data_offset": 2048, 00:17:21.563 "data_size": 63488 00:17:21.563 } 00:17:21.563 ] 00:17:21.563 }' 00:17:21.563 06:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.563 06:33:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.127 06:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:22.384 [2024-07-25 06:33:35.786859] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:22.384 [2024-07-25 06:33:35.786904] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.384 [2024-07-25 06:33:35.789819] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.384 [2024-07-25 06:33:35.789851] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.384 [2024-07-25 06:33:35.789881] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.384 [2024-07-25 06:33:35.789891] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa491d0 name raid_bdev1, state offline 00:17:22.384 0 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1133776 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1133776 ']' 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1133776 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1133776 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1133776' 00:17:22.384 killing process with pid 1133776 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1133776 00:17:22.384 [2024-07-25 06:33:35.865708] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:22.384 06:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1133776 00:17:22.384 [2024-07-25 06:33:35.884330] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.0119rf85n5 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:17:22.642 
06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:17:22.642 00:17:22.642 real 0m6.587s 00:17:22.642 user 0m10.383s 00:17:22.642 sys 0m1.157s 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:22.642 06:33:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.642 ************************************ 00:17:22.642 END TEST raid_write_error_test 00:17:22.642 ************************************ 00:17:22.642 06:33:36 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:17:22.642 06:33:36 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:17:22.642 06:33:36 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:22.642 06:33:36 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:22.642 06:33:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:22.642 ************************************ 00:17:22.642 START TEST raid_state_function_test 00:17:22.642 ************************************ 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 false 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:22.642 
06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1134939 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1134939' 00:17:22.642 Process raid pid: 1134939 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1134939 /var/tmp/spdk-raid.sock 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1134939 ']' 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:22.642 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:22.642 06:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.900 [2024-07-25 06:33:36.231766] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:17:22.900 [2024-07-25 06:33:36.231828] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:22.900 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:22.900 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.900 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:22.901 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:22.901 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:22.901 [2024-07-25 06:33:36.369720] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:22.901 [2024-07-25 06:33:36.412542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:23.157 [2024-07-25 06:33:36.471622] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.157 [2024-07-25 06:33:36.471649] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.720 06:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:23.720 06:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:17:23.720 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:23.977 [2024-07-25 06:33:37.343716] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:23.977 [2024-07-25 06:33:37.343752] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:23.977 [2024-07-25 06:33:37.343762] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:23.977 [2024-07-25 06:33:37.343773] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:23.977 [2024-07-25 06:33:37.343781] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:23.977 [2024-07-25 06:33:37.343791] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.977 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.235 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.235 "name": "Existed_Raid", 00:17:24.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.235 "strip_size_kb": 64, 00:17:24.235 "state": "configuring", 00:17:24.235 "raid_level": "concat", 00:17:24.235 "superblock": false, 00:17:24.235 "num_base_bdevs": 3, 00:17:24.235 "num_base_bdevs_discovered": 0, 00:17:24.235 "num_base_bdevs_operational": 3, 00:17:24.235 "base_bdevs_list": [ 00:17:24.235 { 00:17:24.235 "name": "BaseBdev1", 00:17:24.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.235 "is_configured": false, 00:17:24.235 "data_offset": 0, 00:17:24.235 "data_size": 0 00:17:24.235 }, 00:17:24.235 { 00:17:24.235 "name": "BaseBdev2", 00:17:24.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.235 "is_configured": false, 00:17:24.235 "data_offset": 0, 00:17:24.235 "data_size": 0 00:17:24.235 }, 00:17:24.235 { 00:17:24.235 "name": "BaseBdev3", 00:17:24.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.235 "is_configured": false, 00:17:24.235 "data_offset": 0, 00:17:24.235 "data_size": 0 00:17:24.235 } 00:17:24.235 ] 00:17:24.235 }' 00:17:24.235 06:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.235 06:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.800 06:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:25.058 [2024-07-25 06:33:38.362288] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:25.058 [2024-07-25 06:33:38.362312] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x266b470 name Existed_Raid, state configuring 00:17:25.058 06:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:25.058 [2024-07-25 06:33:38.590899] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:25.058 [2024-07-25 06:33:38.590928] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:25.058 [2024-07-25 06:33:38.590937] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:25.058 [2024-07-25 06:33:38.590947] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:17:25.058 [2024-07-25 06:33:38.590955] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:25.058 [2024-07-25 06:33:38.590965] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:25.058 06:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:25.316 [2024-07-25 06:33:38.824873] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:25.316 BaseBdev1 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:25.316 06:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.574 06:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:25.832 [ 00:17:25.832 { 00:17:25.832 "name": "BaseBdev1", 00:17:25.832 "aliases": [ 00:17:25.832 "0eb202be-03ce-4674-8029-9c34839a2570" 00:17:25.832 ], 00:17:25.832 "product_name": "Malloc disk", 00:17:25.832 "block_size": 512, 00:17:25.832 "num_blocks": 65536, 00:17:25.832 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:25.832 "assigned_rate_limits": { 00:17:25.832 "rw_ios_per_sec": 0, 00:17:25.832 "rw_mbytes_per_sec": 0, 00:17:25.832 "r_mbytes_per_sec": 0, 00:17:25.832 "w_mbytes_per_sec": 0 00:17:25.832 }, 00:17:25.832 "claimed": true, 00:17:25.832 "claim_type": "exclusive_write", 00:17:25.832 "zoned": false, 00:17:25.832 "supported_io_types": { 00:17:25.832 "read": true, 00:17:25.832 "write": true, 00:17:25.832 "unmap": true, 00:17:25.832 "flush": true, 00:17:25.832 "reset": true, 00:17:25.832 "nvme_admin": false, 00:17:25.832 "nvme_io": false, 00:17:25.832 "nvme_io_md": false, 00:17:25.832 "write_zeroes": true, 00:17:25.832 "zcopy": true, 00:17:25.832 "get_zone_info": false, 00:17:25.832 "zone_management": false, 00:17:25.832 "zone_append": false, 00:17:25.832 "compare": false, 00:17:25.832 "compare_and_write": false, 00:17:25.832 "abort": true, 00:17:25.832 "seek_hole": false, 00:17:25.832 "seek_data": false, 00:17:25.832 "copy": true, 00:17:25.832 "nvme_iov_md": false 00:17:25.832 }, 00:17:25.832 "memory_domains": [ 00:17:25.832 { 00:17:25.832 "dma_device_id": "system", 00:17:25.832 "dma_device_type": 1 00:17:25.832 }, 00:17:25.832 { 00:17:25.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.832 "dma_device_type": 2 00:17:25.832 } 00:17:25.832 ], 00:17:25.832 "driver_specific": {} 00:17:25.832 } 00:17:25.832 ] 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:25.832 06:33:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.832 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.091 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.091 "name": "Existed_Raid", 00:17:26.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.091 "strip_size_kb": 64, 00:17:26.091 "state": "configuring", 00:17:26.091 "raid_level": "concat", 00:17:26.091 "superblock": false, 00:17:26.091 "num_base_bdevs": 3, 00:17:26.091 "num_base_bdevs_discovered": 1, 00:17:26.091 "num_base_bdevs_operational": 3, 00:17:26.091 "base_bdevs_list": [ 00:17:26.091 { 00:17:26.091 "name": "BaseBdev1", 00:17:26.091 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:26.091 "is_configured": true, 00:17:26.091 "data_offset": 0, 00:17:26.091 "data_size": 65536 00:17:26.091 }, 00:17:26.091 { 00:17:26.091 "name": "BaseBdev2", 00:17:26.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.091 "is_configured": false, 00:17:26.091 "data_offset": 0, 00:17:26.091 "data_size": 0 00:17:26.091 }, 00:17:26.091 { 00:17:26.091 "name": "BaseBdev3", 00:17:26.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.091 "is_configured": false, 00:17:26.091 "data_offset": 0, 00:17:26.091 "data_size": 0 00:17:26.091 } 00:17:26.091 ] 00:17:26.091 }' 00:17:26.091 06:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.091 06:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.657 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:26.915 [2024-07-25 06:33:40.284704] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:26.915 [2024-07-25 06:33:40.284745] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x266ace0 name Existed_Raid, state configuring 00:17:26.915 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 
-r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:27.173 [2024-07-25 06:33:40.513330] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:27.173 [2024-07-25 06:33:40.514722] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:27.173 [2024-07-25 06:33:40.514755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:27.173 [2024-07-25 06:33:40.514765] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:27.173 [2024-07-25 06:33:40.514776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:27.173 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.174 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.432 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.432 "name": "Existed_Raid", 00:17:27.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.432 "strip_size_kb": 64, 00:17:27.432 "state": "configuring", 00:17:27.432 "raid_level": "concat", 00:17:27.432 "superblock": false, 00:17:27.432 "num_base_bdevs": 3, 00:17:27.432 "num_base_bdevs_discovered": 1, 00:17:27.432 "num_base_bdevs_operational": 3, 00:17:27.432 "base_bdevs_list": [ 00:17:27.432 { 00:17:27.432 "name": "BaseBdev1", 00:17:27.432 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:27.432 "is_configured": true, 00:17:27.432 "data_offset": 0, 00:17:27.432 "data_size": 65536 00:17:27.432 }, 00:17:27.432 { 00:17:27.432 "name": "BaseBdev2", 00:17:27.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.432 "is_configured": false, 00:17:27.432 "data_offset": 0, 00:17:27.432 "data_size": 0 00:17:27.432 }, 00:17:27.432 { 00:17:27.432 "name": "BaseBdev3", 00:17:27.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.432 "is_configured": false, 00:17:27.432 "data_offset": 0, 
00:17:27.432 "data_size": 0 00:17:27.432 } 00:17:27.432 ] 00:17:27.432 }' 00:17:27.432 06:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.432 06:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.999 06:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:28.257 [2024-07-25 06:33:41.563305] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:28.257 BaseBdev2 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.257 06:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:28.516 [ 00:17:28.516 { 00:17:28.516 "name": "BaseBdev2", 00:17:28.516 "aliases": [ 00:17:28.516 "adc78484-6f29-485d-b1ef-d63c9eae7d17" 00:17:28.516 ], 00:17:28.516 "product_name": "Malloc disk", 00:17:28.516 "block_size": 512, 00:17:28.516 "num_blocks": 65536, 00:17:28.516 "uuid": "adc78484-6f29-485d-b1ef-d63c9eae7d17", 00:17:28.516 "assigned_rate_limits": { 00:17:28.516 "rw_ios_per_sec": 0, 00:17:28.516 "rw_mbytes_per_sec": 0, 00:17:28.516 "r_mbytes_per_sec": 0, 00:17:28.516 "w_mbytes_per_sec": 0 00:17:28.516 }, 00:17:28.516 "claimed": true, 00:17:28.516 "claim_type": "exclusive_write", 00:17:28.516 "zoned": false, 00:17:28.516 "supported_io_types": { 00:17:28.516 "read": true, 00:17:28.516 "write": true, 00:17:28.516 "unmap": true, 00:17:28.516 "flush": true, 00:17:28.516 "reset": true, 00:17:28.516 "nvme_admin": false, 00:17:28.516 "nvme_io": false, 00:17:28.516 "nvme_io_md": false, 00:17:28.516 "write_zeroes": true, 00:17:28.516 "zcopy": true, 00:17:28.516 "get_zone_info": false, 00:17:28.516 "zone_management": false, 00:17:28.516 "zone_append": false, 00:17:28.516 "compare": false, 00:17:28.516 "compare_and_write": false, 00:17:28.516 "abort": true, 00:17:28.516 "seek_hole": false, 00:17:28.516 "seek_data": false, 00:17:28.516 "copy": true, 00:17:28.516 "nvme_iov_md": false 00:17:28.516 }, 00:17:28.516 "memory_domains": [ 00:17:28.516 { 00:17:28.516 "dma_device_id": "system", 00:17:28.516 "dma_device_type": 1 00:17:28.516 }, 00:17:28.516 { 00:17:28.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.516 "dma_device_type": 2 00:17:28.516 } 00:17:28.516 ], 00:17:28.516 "driver_specific": {} 00:17:28.516 } 00:17:28.516 ] 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:28.516 06:33:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.516 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.774 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.774 "name": "Existed_Raid", 00:17:28.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.774 "strip_size_kb": 64, 00:17:28.774 "state": "configuring", 00:17:28.774 "raid_level": "concat", 00:17:28.774 "superblock": false, 00:17:28.774 "num_base_bdevs": 3, 00:17:28.774 "num_base_bdevs_discovered": 2, 00:17:28.774 "num_base_bdevs_operational": 3, 00:17:28.774 "base_bdevs_list": [ 00:17:28.774 { 00:17:28.774 "name": "BaseBdev1", 00:17:28.774 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:28.774 "is_configured": true, 00:17:28.774 "data_offset": 0, 00:17:28.774 "data_size": 65536 00:17:28.774 }, 00:17:28.774 { 00:17:28.774 "name": "BaseBdev2", 00:17:28.774 "uuid": "adc78484-6f29-485d-b1ef-d63c9eae7d17", 00:17:28.774 "is_configured": true, 00:17:28.774 "data_offset": 0, 00:17:28.774 "data_size": 65536 00:17:28.774 }, 00:17:28.774 { 00:17:28.774 "name": "BaseBdev3", 00:17:28.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.774 "is_configured": false, 00:17:28.774 "data_offset": 0, 00:17:28.774 "data_size": 0 00:17:28.774 } 00:17:28.774 ] 00:17:28.774 }' 00:17:28.774 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.774 06:33:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.340 06:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:29.616 [2024-07-25 06:33:43.066477] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:29.616 [2024-07-25 06:33:43.066510] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x281e380 
00:17:29.616 [2024-07-25 06:33:43.066518] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:17:29.616 [2024-07-25 06:33:43.066701] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2817290 00:17:29.616 [2024-07-25 06:33:43.066817] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x281e380 00:17:29.616 [2024-07-25 06:33:43.066827] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x281e380 00:17:29.616 [2024-07-25 06:33:43.066976] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.616 BaseBdev3 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:29.616 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.889 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:30.147 [ 00:17:30.147 { 00:17:30.147 "name": "BaseBdev3", 00:17:30.147 "aliases": [ 00:17:30.147 "961444b4-5a06-4c7f-8901-ddb9a16b792e" 00:17:30.147 ], 00:17:30.147 "product_name": "Malloc disk", 00:17:30.147 "block_size": 512, 00:17:30.147 "num_blocks": 65536, 00:17:30.147 "uuid": "961444b4-5a06-4c7f-8901-ddb9a16b792e", 00:17:30.147 "assigned_rate_limits": { 00:17:30.147 "rw_ios_per_sec": 0, 00:17:30.147 "rw_mbytes_per_sec": 0, 00:17:30.147 "r_mbytes_per_sec": 0, 00:17:30.147 "w_mbytes_per_sec": 0 00:17:30.147 }, 00:17:30.147 "claimed": true, 00:17:30.147 "claim_type": "exclusive_write", 00:17:30.147 "zoned": false, 00:17:30.147 "supported_io_types": { 00:17:30.147 "read": true, 00:17:30.147 "write": true, 00:17:30.147 "unmap": true, 00:17:30.147 "flush": true, 00:17:30.147 "reset": true, 00:17:30.147 "nvme_admin": false, 00:17:30.147 "nvme_io": false, 00:17:30.147 "nvme_io_md": false, 00:17:30.147 "write_zeroes": true, 00:17:30.147 "zcopy": true, 00:17:30.147 "get_zone_info": false, 00:17:30.147 "zone_management": false, 00:17:30.147 "zone_append": false, 00:17:30.147 "compare": false, 00:17:30.147 "compare_and_write": false, 00:17:30.147 "abort": true, 00:17:30.147 "seek_hole": false, 00:17:30.147 "seek_data": false, 00:17:30.147 "copy": true, 00:17:30.147 "nvme_iov_md": false 00:17:30.147 }, 00:17:30.147 "memory_domains": [ 00:17:30.147 { 00:17:30.147 "dma_device_id": "system", 00:17:30.147 "dma_device_type": 1 00:17:30.147 }, 00:17:30.147 { 00:17:30.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.147 "dma_device_type": 2 00:17:30.147 } 00:17:30.147 ], 00:17:30.147 "driver_specific": {} 00:17:30.147 } 00:17:30.147 ] 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:30.147 06:33:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.147 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.404 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.404 "name": "Existed_Raid", 00:17:30.404 "uuid": "4014ae6b-4a5d-4f30-a26b-7321184c2d7e", 00:17:30.404 "strip_size_kb": 64, 00:17:30.404 "state": "online", 00:17:30.404 "raid_level": "concat", 00:17:30.404 "superblock": false, 00:17:30.404 "num_base_bdevs": 3, 00:17:30.404 "num_base_bdevs_discovered": 3, 00:17:30.405 "num_base_bdevs_operational": 3, 00:17:30.405 "base_bdevs_list": [ 00:17:30.405 { 00:17:30.405 "name": "BaseBdev1", 00:17:30.405 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:30.405 "is_configured": true, 00:17:30.405 "data_offset": 0, 00:17:30.405 "data_size": 65536 00:17:30.405 }, 00:17:30.405 { 00:17:30.405 "name": "BaseBdev2", 00:17:30.405 "uuid": "adc78484-6f29-485d-b1ef-d63c9eae7d17", 00:17:30.405 "is_configured": true, 00:17:30.405 "data_offset": 0, 00:17:30.405 "data_size": 65536 00:17:30.405 }, 00:17:30.405 { 00:17:30.405 "name": "BaseBdev3", 00:17:30.405 "uuid": "961444b4-5a06-4c7f-8901-ddb9a16b792e", 00:17:30.405 "is_configured": true, 00:17:30.405 "data_offset": 0, 00:17:30.405 "data_size": 65536 00:17:30.405 } 00:17:30.405 ] 00:17:30.405 }' 00:17:30.405 06:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.405 06:33:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:30.969 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:31.227 [2024-07-25 06:33:44.562701] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:31.227 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:31.227 "name": "Existed_Raid", 00:17:31.227 "aliases": [ 00:17:31.227 "4014ae6b-4a5d-4f30-a26b-7321184c2d7e" 00:17:31.227 ], 00:17:31.227 "product_name": "Raid Volume", 00:17:31.227 "block_size": 512, 00:17:31.227 "num_blocks": 196608, 00:17:31.227 "uuid": "4014ae6b-4a5d-4f30-a26b-7321184c2d7e", 00:17:31.227 "assigned_rate_limits": { 00:17:31.227 "rw_ios_per_sec": 0, 00:17:31.227 "rw_mbytes_per_sec": 0, 00:17:31.227 "r_mbytes_per_sec": 0, 00:17:31.227 "w_mbytes_per_sec": 0 00:17:31.227 }, 00:17:31.227 "claimed": false, 00:17:31.227 "zoned": false, 00:17:31.227 "supported_io_types": { 00:17:31.227 "read": true, 00:17:31.227 "write": true, 00:17:31.227 "unmap": true, 00:17:31.227 "flush": true, 00:17:31.227 "reset": true, 00:17:31.227 "nvme_admin": false, 00:17:31.227 "nvme_io": false, 00:17:31.227 "nvme_io_md": false, 00:17:31.227 "write_zeroes": true, 00:17:31.227 "zcopy": false, 00:17:31.227 "get_zone_info": false, 00:17:31.227 "zone_management": false, 00:17:31.227 "zone_append": false, 00:17:31.227 "compare": false, 00:17:31.227 "compare_and_write": false, 00:17:31.227 "abort": false, 00:17:31.227 "seek_hole": false, 00:17:31.227 "seek_data": false, 00:17:31.227 "copy": false, 00:17:31.227 "nvme_iov_md": false 00:17:31.227 }, 00:17:31.227 "memory_domains": [ 00:17:31.227 { 00:17:31.227 "dma_device_id": "system", 00:17:31.227 "dma_device_type": 1 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.227 "dma_device_type": 2 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "dma_device_id": "system", 00:17:31.227 "dma_device_type": 1 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.227 "dma_device_type": 2 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "dma_device_id": "system", 00:17:31.227 "dma_device_type": 1 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.227 "dma_device_type": 2 00:17:31.227 } 00:17:31.227 ], 00:17:31.227 "driver_specific": { 00:17:31.227 "raid": { 00:17:31.227 "uuid": "4014ae6b-4a5d-4f30-a26b-7321184c2d7e", 00:17:31.227 "strip_size_kb": 64, 00:17:31.227 "state": "online", 00:17:31.227 "raid_level": "concat", 00:17:31.227 "superblock": false, 00:17:31.227 "num_base_bdevs": 3, 00:17:31.227 "num_base_bdevs_discovered": 3, 00:17:31.227 "num_base_bdevs_operational": 3, 00:17:31.227 "base_bdevs_list": [ 00:17:31.227 { 00:17:31.227 "name": "BaseBdev1", 00:17:31.227 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:31.227 "is_configured": true, 00:17:31.227 "data_offset": 0, 00:17:31.227 "data_size": 65536 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "name": "BaseBdev2", 00:17:31.227 "uuid": "adc78484-6f29-485d-b1ef-d63c9eae7d17", 00:17:31.227 "is_configured": true, 00:17:31.227 "data_offset": 0, 00:17:31.227 
"data_size": 65536 00:17:31.227 }, 00:17:31.227 { 00:17:31.227 "name": "BaseBdev3", 00:17:31.227 "uuid": "961444b4-5a06-4c7f-8901-ddb9a16b792e", 00:17:31.227 "is_configured": true, 00:17:31.227 "data_offset": 0, 00:17:31.227 "data_size": 65536 00:17:31.227 } 00:17:31.227 ] 00:17:31.227 } 00:17:31.227 } 00:17:31.227 }' 00:17:31.227 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:31.227 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:31.227 BaseBdev2 00:17:31.227 BaseBdev3' 00:17:31.227 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.227 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:31.227 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.483 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.483 "name": "BaseBdev1", 00:17:31.483 "aliases": [ 00:17:31.483 "0eb202be-03ce-4674-8029-9c34839a2570" 00:17:31.483 ], 00:17:31.483 "product_name": "Malloc disk", 00:17:31.483 "block_size": 512, 00:17:31.483 "num_blocks": 65536, 00:17:31.483 "uuid": "0eb202be-03ce-4674-8029-9c34839a2570", 00:17:31.483 "assigned_rate_limits": { 00:17:31.483 "rw_ios_per_sec": 0, 00:17:31.483 "rw_mbytes_per_sec": 0, 00:17:31.483 "r_mbytes_per_sec": 0, 00:17:31.483 "w_mbytes_per_sec": 0 00:17:31.483 }, 00:17:31.483 "claimed": true, 00:17:31.483 "claim_type": "exclusive_write", 00:17:31.483 "zoned": false, 00:17:31.483 "supported_io_types": { 00:17:31.483 "read": true, 00:17:31.483 "write": true, 00:17:31.483 "unmap": true, 00:17:31.483 "flush": true, 00:17:31.483 "reset": true, 00:17:31.483 "nvme_admin": false, 00:17:31.484 "nvme_io": false, 00:17:31.484 "nvme_io_md": false, 00:17:31.484 "write_zeroes": true, 00:17:31.484 "zcopy": true, 00:17:31.484 "get_zone_info": false, 00:17:31.484 "zone_management": false, 00:17:31.484 "zone_append": false, 00:17:31.484 "compare": false, 00:17:31.484 "compare_and_write": false, 00:17:31.484 "abort": true, 00:17:31.484 "seek_hole": false, 00:17:31.484 "seek_data": false, 00:17:31.484 "copy": true, 00:17:31.484 "nvme_iov_md": false 00:17:31.484 }, 00:17:31.484 "memory_domains": [ 00:17:31.484 { 00:17:31.484 "dma_device_id": "system", 00:17:31.484 "dma_device_type": 1 00:17:31.484 }, 00:17:31.484 { 00:17:31.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.484 "dma_device_type": 2 00:17:31.484 } 00:17:31.484 ], 00:17:31.484 "driver_specific": {} 00:17:31.484 }' 00:17:31.484 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.484 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.484 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.484 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.484 06:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.484 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.484 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.741 06:33:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:31.741 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.998 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.998 "name": "BaseBdev2", 00:17:31.998 "aliases": [ 00:17:31.998 "adc78484-6f29-485d-b1ef-d63c9eae7d17" 00:17:31.998 ], 00:17:31.998 "product_name": "Malloc disk", 00:17:31.998 "block_size": 512, 00:17:31.998 "num_blocks": 65536, 00:17:31.998 "uuid": "adc78484-6f29-485d-b1ef-d63c9eae7d17", 00:17:31.998 "assigned_rate_limits": { 00:17:31.998 "rw_ios_per_sec": 0, 00:17:31.998 "rw_mbytes_per_sec": 0, 00:17:31.998 "r_mbytes_per_sec": 0, 00:17:31.998 "w_mbytes_per_sec": 0 00:17:31.998 }, 00:17:31.998 "claimed": true, 00:17:31.998 "claim_type": "exclusive_write", 00:17:31.998 "zoned": false, 00:17:31.998 "supported_io_types": { 00:17:31.998 "read": true, 00:17:31.998 "write": true, 00:17:31.998 "unmap": true, 00:17:31.998 "flush": true, 00:17:31.998 "reset": true, 00:17:31.998 "nvme_admin": false, 00:17:31.998 "nvme_io": false, 00:17:31.998 "nvme_io_md": false, 00:17:31.998 "write_zeroes": true, 00:17:31.998 "zcopy": true, 00:17:31.998 "get_zone_info": false, 00:17:31.998 "zone_management": false, 00:17:31.998 "zone_append": false, 00:17:31.998 "compare": false, 00:17:31.998 "compare_and_write": false, 00:17:31.998 "abort": true, 00:17:31.998 "seek_hole": false, 00:17:31.998 "seek_data": false, 00:17:31.998 "copy": true, 00:17:31.998 "nvme_iov_md": false 00:17:31.998 }, 00:17:31.998 "memory_domains": [ 00:17:31.998 { 00:17:31.998 "dma_device_id": "system", 00:17:31.998 "dma_device_type": 1 00:17:31.998 }, 00:17:31.998 { 00:17:31.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.998 "dma_device_type": 2 00:17:31.998 } 00:17:31.998 ], 00:17:31.998 "driver_specific": {} 00:17:31.998 }' 00:17:31.998 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.998 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.998 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.998 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.255 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:32.256 06:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.513 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.513 "name": "BaseBdev3", 00:17:32.513 "aliases": [ 00:17:32.513 "961444b4-5a06-4c7f-8901-ddb9a16b792e" 00:17:32.513 ], 00:17:32.513 "product_name": "Malloc disk", 00:17:32.513 "block_size": 512, 00:17:32.513 "num_blocks": 65536, 00:17:32.513 "uuid": "961444b4-5a06-4c7f-8901-ddb9a16b792e", 00:17:32.513 "assigned_rate_limits": { 00:17:32.513 "rw_ios_per_sec": 0, 00:17:32.513 "rw_mbytes_per_sec": 0, 00:17:32.513 "r_mbytes_per_sec": 0, 00:17:32.513 "w_mbytes_per_sec": 0 00:17:32.513 }, 00:17:32.513 "claimed": true, 00:17:32.513 "claim_type": "exclusive_write", 00:17:32.513 "zoned": false, 00:17:32.513 "supported_io_types": { 00:17:32.513 "read": true, 00:17:32.513 "write": true, 00:17:32.513 "unmap": true, 00:17:32.513 "flush": true, 00:17:32.513 "reset": true, 00:17:32.513 "nvme_admin": false, 00:17:32.513 "nvme_io": false, 00:17:32.513 "nvme_io_md": false, 00:17:32.513 "write_zeroes": true, 00:17:32.513 "zcopy": true, 00:17:32.513 "get_zone_info": false, 00:17:32.513 "zone_management": false, 00:17:32.513 "zone_append": false, 00:17:32.513 "compare": false, 00:17:32.513 "compare_and_write": false, 00:17:32.513 "abort": true, 00:17:32.513 "seek_hole": false, 00:17:32.513 "seek_data": false, 00:17:32.513 "copy": true, 00:17:32.513 "nvme_iov_md": false 00:17:32.513 }, 00:17:32.513 "memory_domains": [ 00:17:32.513 { 00:17:32.513 "dma_device_id": "system", 00:17:32.513 "dma_device_type": 1 00:17:32.513 }, 00:17:32.513 { 00:17:32.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.513 "dma_device_type": 2 00:17:32.513 } 00:17:32.513 ], 00:17:32.513 "driver_specific": {} 00:17:32.513 }' 00:17:32.513 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.513 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.771 06:33:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.029 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.029 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:33.029 [2024-07-25 06:33:46.575931] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:33.029 [2024-07-25 06:33:46.575956] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:33.029 [2024-07-25 06:33:46.575996] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.287 "name": "Existed_Raid", 00:17:33.287 "uuid": "4014ae6b-4a5d-4f30-a26b-7321184c2d7e", 00:17:33.287 "strip_size_kb": 64, 00:17:33.287 "state": "offline", 00:17:33.287 "raid_level": "concat", 00:17:33.287 "superblock": false, 00:17:33.287 "num_base_bdevs": 3, 00:17:33.287 "num_base_bdevs_discovered": 2, 00:17:33.287 "num_base_bdevs_operational": 2, 00:17:33.287 "base_bdevs_list": [ 00:17:33.287 { 00:17:33.287 "name": null, 00:17:33.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.287 "is_configured": false, 00:17:33.287 "data_offset": 0, 00:17:33.287 "data_size": 65536 00:17:33.287 }, 00:17:33.287 { 00:17:33.287 "name": "BaseBdev2", 00:17:33.287 "uuid": "adc78484-6f29-485d-b1ef-d63c9eae7d17", 00:17:33.287 
"is_configured": true, 00:17:33.287 "data_offset": 0, 00:17:33.287 "data_size": 65536 00:17:33.287 }, 00:17:33.287 { 00:17:33.287 "name": "BaseBdev3", 00:17:33.287 "uuid": "961444b4-5a06-4c7f-8901-ddb9a16b792e", 00:17:33.287 "is_configured": true, 00:17:33.287 "data_offset": 0, 00:17:33.287 "data_size": 65536 00:17:33.287 } 00:17:33.287 ] 00:17:33.287 }' 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.287 06:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.853 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:33.853 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:33.853 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.853 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.110 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.110 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:34.110 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:34.368 [2024-07-25 06:33:47.848267] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:34.368 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.368 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.368 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.368 06:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.627 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.627 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:34.627 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:34.885 [2024-07-25 06:33:48.311441] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:34.885 [2024-07-25 06:33:48.311484] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x281e380 name Existed_Raid, state offline 00:17:34.885 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.885 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.885 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.885 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:35.143 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:35.143 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 
-- # '[' -n '' ']' 00:17:35.143 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:35.143 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:35.143 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.143 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:35.400 BaseBdev2 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:35.401 06:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:35.659 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:35.917 [ 00:17:35.917 { 00:17:35.917 "name": "BaseBdev2", 00:17:35.917 "aliases": [ 00:17:35.917 "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492" 00:17:35.917 ], 00:17:35.917 "product_name": "Malloc disk", 00:17:35.917 "block_size": 512, 00:17:35.917 "num_blocks": 65536, 00:17:35.917 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:35.917 "assigned_rate_limits": { 00:17:35.917 "rw_ios_per_sec": 0, 00:17:35.917 "rw_mbytes_per_sec": 0, 00:17:35.917 "r_mbytes_per_sec": 0, 00:17:35.917 "w_mbytes_per_sec": 0 00:17:35.917 }, 00:17:35.917 "claimed": false, 00:17:35.917 "zoned": false, 00:17:35.917 "supported_io_types": { 00:17:35.917 "read": true, 00:17:35.917 "write": true, 00:17:35.917 "unmap": true, 00:17:35.917 "flush": true, 00:17:35.917 "reset": true, 00:17:35.917 "nvme_admin": false, 00:17:35.917 "nvme_io": false, 00:17:35.917 "nvme_io_md": false, 00:17:35.917 "write_zeroes": true, 00:17:35.917 "zcopy": true, 00:17:35.917 "get_zone_info": false, 00:17:35.917 "zone_management": false, 00:17:35.917 "zone_append": false, 00:17:35.917 "compare": false, 00:17:35.917 "compare_and_write": false, 00:17:35.917 "abort": true, 00:17:35.917 "seek_hole": false, 00:17:35.917 "seek_data": false, 00:17:35.917 "copy": true, 00:17:35.917 "nvme_iov_md": false 00:17:35.917 }, 00:17:35.917 "memory_domains": [ 00:17:35.917 { 00:17:35.917 "dma_device_id": "system", 00:17:35.917 "dma_device_type": 1 00:17:35.917 }, 00:17:35.917 { 00:17:35.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.917 "dma_device_type": 2 00:17:35.917 } 00:17:35.917 ], 00:17:35.917 "driver_specific": {} 00:17:35.917 } 00:17:35.917 ] 00:17:35.917 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:35.917 06:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:35.917 06:33:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.917 06:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:35.917 BaseBdev3 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.175 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:36.433 [ 00:17:36.433 { 00:17:36.433 "name": "BaseBdev3", 00:17:36.433 "aliases": [ 00:17:36.434 "61fd2fb9-6810-43bf-9347-b98f775d5c9f" 00:17:36.434 ], 00:17:36.434 "product_name": "Malloc disk", 00:17:36.434 "block_size": 512, 00:17:36.434 "num_blocks": 65536, 00:17:36.434 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:36.434 "assigned_rate_limits": { 00:17:36.434 "rw_ios_per_sec": 0, 00:17:36.434 "rw_mbytes_per_sec": 0, 00:17:36.434 "r_mbytes_per_sec": 0, 00:17:36.434 "w_mbytes_per_sec": 0 00:17:36.434 }, 00:17:36.434 "claimed": false, 00:17:36.434 "zoned": false, 00:17:36.434 "supported_io_types": { 00:17:36.434 "read": true, 00:17:36.434 "write": true, 00:17:36.434 "unmap": true, 00:17:36.434 "flush": true, 00:17:36.434 "reset": true, 00:17:36.434 "nvme_admin": false, 00:17:36.434 "nvme_io": false, 00:17:36.434 "nvme_io_md": false, 00:17:36.434 "write_zeroes": true, 00:17:36.434 "zcopy": true, 00:17:36.434 "get_zone_info": false, 00:17:36.434 "zone_management": false, 00:17:36.434 "zone_append": false, 00:17:36.434 "compare": false, 00:17:36.434 "compare_and_write": false, 00:17:36.434 "abort": true, 00:17:36.434 "seek_hole": false, 00:17:36.434 "seek_data": false, 00:17:36.434 "copy": true, 00:17:36.434 "nvme_iov_md": false 00:17:36.434 }, 00:17:36.434 "memory_domains": [ 00:17:36.434 { 00:17:36.434 "dma_device_id": "system", 00:17:36.434 "dma_device_type": 1 00:17:36.434 }, 00:17:36.434 { 00:17:36.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.434 "dma_device_type": 2 00:17:36.434 } 00:17:36.434 ], 00:17:36.434 "driver_specific": {} 00:17:36.434 } 00:17:36.434 ] 00:17:36.434 06:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:36.434 06:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:36.434 06:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:36.434 06:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 
BaseBdev3' -n Existed_Raid 00:17:36.693 [2024-07-25 06:33:50.134770] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:36.693 [2024-07-25 06:33:50.134812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:36.693 [2024-07-25 06:33:50.134832] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:36.693 [2024-07-25 06:33:50.136055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.693 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.951 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.951 "name": "Existed_Raid", 00:17:36.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.951 "strip_size_kb": 64, 00:17:36.951 "state": "configuring", 00:17:36.951 "raid_level": "concat", 00:17:36.951 "superblock": false, 00:17:36.951 "num_base_bdevs": 3, 00:17:36.951 "num_base_bdevs_discovered": 2, 00:17:36.951 "num_base_bdevs_operational": 3, 00:17:36.951 "base_bdevs_list": [ 00:17:36.951 { 00:17:36.951 "name": "BaseBdev1", 00:17:36.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.951 "is_configured": false, 00:17:36.951 "data_offset": 0, 00:17:36.951 "data_size": 0 00:17:36.951 }, 00:17:36.951 { 00:17:36.951 "name": "BaseBdev2", 00:17:36.951 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:36.951 "is_configured": true, 00:17:36.951 "data_offset": 0, 00:17:36.951 "data_size": 65536 00:17:36.951 }, 00:17:36.951 { 00:17:36.951 "name": "BaseBdev3", 00:17:36.951 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:36.951 "is_configured": true, 00:17:36.951 "data_offset": 0, 00:17:36.951 "data_size": 65536 00:17:36.951 } 00:17:36.951 ] 00:17:36.951 }' 00:17:36.951 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.951 06:33:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.518 06:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:37.776 [2024-07-25 06:33:51.157458] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.776 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.035 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.035 "name": "Existed_Raid", 00:17:38.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.035 "strip_size_kb": 64, 00:17:38.035 "state": "configuring", 00:17:38.035 "raid_level": "concat", 00:17:38.035 "superblock": false, 00:17:38.035 "num_base_bdevs": 3, 00:17:38.035 "num_base_bdevs_discovered": 1, 00:17:38.035 "num_base_bdevs_operational": 3, 00:17:38.035 "base_bdevs_list": [ 00:17:38.035 { 00:17:38.035 "name": "BaseBdev1", 00:17:38.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.035 "is_configured": false, 00:17:38.035 "data_offset": 0, 00:17:38.035 "data_size": 0 00:17:38.035 }, 00:17:38.035 { 00:17:38.035 "name": null, 00:17:38.035 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:38.035 "is_configured": false, 00:17:38.035 "data_offset": 0, 00:17:38.035 "data_size": 65536 00:17:38.035 }, 00:17:38.035 { 00:17:38.035 "name": "BaseBdev3", 00:17:38.035 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:38.035 "is_configured": true, 00:17:38.035 "data_offset": 0, 00:17:38.035 "data_size": 65536 00:17:38.035 } 00:17:38.035 ] 00:17:38.035 }' 00:17:38.035 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.035 06:33:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.602 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.602 06:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:38.861 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ 
false == \f\a\l\s\e ]] 00:17:38.861 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:39.119 [2024-07-25 06:33:52.428130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:39.119 BaseBdev1 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.119 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:39.378 [ 00:17:39.378 { 00:17:39.378 "name": "BaseBdev1", 00:17:39.378 "aliases": [ 00:17:39.378 "65f5f507-ce44-43d1-a89b-22998b0c259e" 00:17:39.378 ], 00:17:39.378 "product_name": "Malloc disk", 00:17:39.378 "block_size": 512, 00:17:39.378 "num_blocks": 65536, 00:17:39.378 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:39.378 "assigned_rate_limits": { 00:17:39.378 "rw_ios_per_sec": 0, 00:17:39.378 "rw_mbytes_per_sec": 0, 00:17:39.378 "r_mbytes_per_sec": 0, 00:17:39.378 "w_mbytes_per_sec": 0 00:17:39.378 }, 00:17:39.378 "claimed": true, 00:17:39.378 "claim_type": "exclusive_write", 00:17:39.378 "zoned": false, 00:17:39.378 "supported_io_types": { 00:17:39.378 "read": true, 00:17:39.378 "write": true, 00:17:39.378 "unmap": true, 00:17:39.378 "flush": true, 00:17:39.378 "reset": true, 00:17:39.378 "nvme_admin": false, 00:17:39.378 "nvme_io": false, 00:17:39.378 "nvme_io_md": false, 00:17:39.378 "write_zeroes": true, 00:17:39.378 "zcopy": true, 00:17:39.378 "get_zone_info": false, 00:17:39.378 "zone_management": false, 00:17:39.378 "zone_append": false, 00:17:39.378 "compare": false, 00:17:39.378 "compare_and_write": false, 00:17:39.378 "abort": true, 00:17:39.378 "seek_hole": false, 00:17:39.378 "seek_data": false, 00:17:39.378 "copy": true, 00:17:39.378 "nvme_iov_md": false 00:17:39.378 }, 00:17:39.378 "memory_domains": [ 00:17:39.378 { 00:17:39.378 "dma_device_id": "system", 00:17:39.378 "dma_device_type": 1 00:17:39.378 }, 00:17:39.378 { 00:17:39.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.378 "dma_device_type": 2 00:17:39.378 } 00:17:39.378 ], 00:17:39.378 "driver_specific": {} 00:17:39.378 } 00:17:39.378 ] 00:17:39.378 06:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:39.378 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.379 06:33:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.379 06:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.637 06:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.637 "name": "Existed_Raid", 00:17:39.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.637 "strip_size_kb": 64, 00:17:39.637 "state": "configuring", 00:17:39.637 "raid_level": "concat", 00:17:39.637 "superblock": false, 00:17:39.637 "num_base_bdevs": 3, 00:17:39.637 "num_base_bdevs_discovered": 2, 00:17:39.637 "num_base_bdevs_operational": 3, 00:17:39.637 "base_bdevs_list": [ 00:17:39.637 { 00:17:39.637 "name": "BaseBdev1", 00:17:39.637 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:39.637 "is_configured": true, 00:17:39.637 "data_offset": 0, 00:17:39.637 "data_size": 65536 00:17:39.637 }, 00:17:39.637 { 00:17:39.637 "name": null, 00:17:39.637 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:39.637 "is_configured": false, 00:17:39.637 "data_offset": 0, 00:17:39.637 "data_size": 65536 00:17:39.637 }, 00:17:39.637 { 00:17:39.637 "name": "BaseBdev3", 00:17:39.637 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:39.637 "is_configured": true, 00:17:39.637 "data_offset": 0, 00:17:39.637 "data_size": 65536 00:17:39.637 } 00:17:39.637 ] 00:17:39.637 }' 00:17:39.637 06:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.637 06:33:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.204 06:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.204 06:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:40.463 06:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:40.463 06:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:40.721 [2024-07-25 06:33:54.100569] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:40.721 06:33:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.721 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.980 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.980 "name": "Existed_Raid", 00:17:40.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.980 "strip_size_kb": 64, 00:17:40.980 "state": "configuring", 00:17:40.980 "raid_level": "concat", 00:17:40.980 "superblock": false, 00:17:40.980 "num_base_bdevs": 3, 00:17:40.980 "num_base_bdevs_discovered": 1, 00:17:40.980 "num_base_bdevs_operational": 3, 00:17:40.980 "base_bdevs_list": [ 00:17:40.980 { 00:17:40.980 "name": "BaseBdev1", 00:17:40.980 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:40.980 "is_configured": true, 00:17:40.980 "data_offset": 0, 00:17:40.980 "data_size": 65536 00:17:40.980 }, 00:17:40.980 { 00:17:40.980 "name": null, 00:17:40.980 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:40.980 "is_configured": false, 00:17:40.980 "data_offset": 0, 00:17:40.980 "data_size": 65536 00:17:40.980 }, 00:17:40.980 { 00:17:40.980 "name": null, 00:17:40.980 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:40.980 "is_configured": false, 00:17:40.980 "data_offset": 0, 00:17:40.980 "data_size": 65536 00:17:40.980 } 00:17:40.980 ] 00:17:40.980 }' 00:17:40.980 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.980 06:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.547 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.547 06:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:41.806 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:41.806 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:41.806 [2024-07-25 06:33:55.351882] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:42.064 06:33:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.064 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.064 "name": "Existed_Raid", 00:17:42.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.064 "strip_size_kb": 64, 00:17:42.064 "state": "configuring", 00:17:42.064 "raid_level": "concat", 00:17:42.064 "superblock": false, 00:17:42.064 "num_base_bdevs": 3, 00:17:42.064 "num_base_bdevs_discovered": 2, 00:17:42.064 "num_base_bdevs_operational": 3, 00:17:42.064 "base_bdevs_list": [ 00:17:42.064 { 00:17:42.065 "name": "BaseBdev1", 00:17:42.065 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:42.065 "is_configured": true, 00:17:42.065 "data_offset": 0, 00:17:42.065 "data_size": 65536 00:17:42.065 }, 00:17:42.065 { 00:17:42.065 "name": null, 00:17:42.065 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:42.065 "is_configured": false, 00:17:42.065 "data_offset": 0, 00:17:42.065 "data_size": 65536 00:17:42.065 }, 00:17:42.065 { 00:17:42.065 "name": "BaseBdev3", 00:17:42.065 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:42.065 "is_configured": true, 00:17:42.065 "data_offset": 0, 00:17:42.065 "data_size": 65536 00:17:42.065 } 00:17:42.065 ] 00:17:42.065 }' 00:17:42.065 06:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.065 06:33:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.648 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.649 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:42.918 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:42.918 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:43.177 [2024-07-25 
06:33:56.599189] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.177 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:43.435 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.435 "name": "Existed_Raid", 00:17:43.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:43.435 "strip_size_kb": 64, 00:17:43.435 "state": "configuring", 00:17:43.435 "raid_level": "concat", 00:17:43.435 "superblock": false, 00:17:43.435 "num_base_bdevs": 3, 00:17:43.435 "num_base_bdevs_discovered": 1, 00:17:43.435 "num_base_bdevs_operational": 3, 00:17:43.435 "base_bdevs_list": [ 00:17:43.435 { 00:17:43.435 "name": null, 00:17:43.435 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:43.435 "is_configured": false, 00:17:43.435 "data_offset": 0, 00:17:43.435 "data_size": 65536 00:17:43.435 }, 00:17:43.435 { 00:17:43.435 "name": null, 00:17:43.435 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:43.435 "is_configured": false, 00:17:43.435 "data_offset": 0, 00:17:43.435 "data_size": 65536 00:17:43.435 }, 00:17:43.435 { 00:17:43.435 "name": "BaseBdev3", 00:17:43.435 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:43.435 "is_configured": true, 00:17:43.435 "data_offset": 0, 00:17:43.435 "data_size": 65536 00:17:43.435 } 00:17:43.435 ] 00:17:43.435 }' 00:17:43.435 06:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.435 06:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.002 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.002 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:44.261 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:44.261 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:44.519 [2024-07-25 06:33:57.860528] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.519 06:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.778 06:33:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.778 "name": "Existed_Raid", 00:17:44.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.778 "strip_size_kb": 64, 00:17:44.778 "state": "configuring", 00:17:44.778 "raid_level": "concat", 00:17:44.778 "superblock": false, 00:17:44.778 "num_base_bdevs": 3, 00:17:44.778 "num_base_bdevs_discovered": 2, 00:17:44.778 "num_base_bdevs_operational": 3, 00:17:44.778 "base_bdevs_list": [ 00:17:44.778 { 00:17:44.778 "name": null, 00:17:44.778 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:44.778 "is_configured": false, 00:17:44.778 "data_offset": 0, 00:17:44.778 "data_size": 65536 00:17:44.778 }, 00:17:44.778 { 00:17:44.778 "name": "BaseBdev2", 00:17:44.778 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:44.778 "is_configured": true, 00:17:44.778 "data_offset": 0, 00:17:44.778 "data_size": 65536 00:17:44.778 }, 00:17:44.778 { 00:17:44.778 "name": "BaseBdev3", 00:17:44.778 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:44.778 "is_configured": true, 00:17:44.778 "data_offset": 0, 00:17:44.778 "data_size": 65536 00:17:44.778 } 00:17:44.778 ] 00:17:44.778 }' 00:17:44.778 06:33:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.778 06:33:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.344 06:33:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.344 06:33:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:45.344 06:33:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:45.603 06:33:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.603 06:33:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:45.603 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 65f5f507-ce44-43d1-a89b-22998b0c259e 00:17:45.861 [2024-07-25 06:33:59.359728] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:45.861 [2024-07-25 06:33:59.359760] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28178c0 00:17:45.861 [2024-07-25 06:33:59.359768] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:17:45.861 [2024-07-25 06:33:59.359950] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2817190 00:17:45.861 [2024-07-25 06:33:59.360061] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28178c0 00:17:45.861 [2024-07-25 06:33:59.360070] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x28178c0 00:17:45.861 [2024-07-25 06:33:59.360228] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:45.861 NewBaseBdev 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:45.861 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:46.118 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:46.376 [ 00:17:46.376 { 00:17:46.376 "name": "NewBaseBdev", 00:17:46.376 "aliases": [ 00:17:46.376 "65f5f507-ce44-43d1-a89b-22998b0c259e" 00:17:46.376 ], 00:17:46.376 "product_name": "Malloc disk", 00:17:46.376 "block_size": 512, 00:17:46.376 "num_blocks": 65536, 00:17:46.376 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:46.376 "assigned_rate_limits": { 00:17:46.376 "rw_ios_per_sec": 0, 00:17:46.376 "rw_mbytes_per_sec": 0, 00:17:46.376 "r_mbytes_per_sec": 0, 00:17:46.376 "w_mbytes_per_sec": 0 00:17:46.376 }, 00:17:46.376 "claimed": true, 00:17:46.376 "claim_type": "exclusive_write", 00:17:46.376 "zoned": false, 00:17:46.376 "supported_io_types": { 00:17:46.376 "read": true, 00:17:46.376 "write": true, 00:17:46.376 "unmap": true, 00:17:46.376 "flush": true, 00:17:46.376 "reset": true, 00:17:46.376 "nvme_admin": false, 00:17:46.376 "nvme_io": false, 00:17:46.376 "nvme_io_md": false, 
00:17:46.376 "write_zeroes": true, 00:17:46.376 "zcopy": true, 00:17:46.376 "get_zone_info": false, 00:17:46.376 "zone_management": false, 00:17:46.376 "zone_append": false, 00:17:46.376 "compare": false, 00:17:46.377 "compare_and_write": false, 00:17:46.377 "abort": true, 00:17:46.377 "seek_hole": false, 00:17:46.377 "seek_data": false, 00:17:46.377 "copy": true, 00:17:46.377 "nvme_iov_md": false 00:17:46.377 }, 00:17:46.377 "memory_domains": [ 00:17:46.377 { 00:17:46.377 "dma_device_id": "system", 00:17:46.377 "dma_device_type": 1 00:17:46.377 }, 00:17:46.377 { 00:17:46.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.377 "dma_device_type": 2 00:17:46.377 } 00:17:46.377 ], 00:17:46.377 "driver_specific": {} 00:17:46.377 } 00:17:46.377 ] 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.377 06:33:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:46.635 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.635 "name": "Existed_Raid", 00:17:46.635 "uuid": "6177d3c7-1967-4624-a86a-cc0f6f3cd780", 00:17:46.635 "strip_size_kb": 64, 00:17:46.635 "state": "online", 00:17:46.635 "raid_level": "concat", 00:17:46.635 "superblock": false, 00:17:46.635 "num_base_bdevs": 3, 00:17:46.635 "num_base_bdevs_discovered": 3, 00:17:46.635 "num_base_bdevs_operational": 3, 00:17:46.635 "base_bdevs_list": [ 00:17:46.635 { 00:17:46.635 "name": "NewBaseBdev", 00:17:46.635 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:46.635 "is_configured": true, 00:17:46.635 "data_offset": 0, 00:17:46.635 "data_size": 65536 00:17:46.635 }, 00:17:46.635 { 00:17:46.635 "name": "BaseBdev2", 00:17:46.635 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:46.635 "is_configured": true, 00:17:46.635 "data_offset": 0, 00:17:46.635 "data_size": 65536 00:17:46.635 }, 00:17:46.635 { 00:17:46.635 "name": "BaseBdev3", 00:17:46.635 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:46.635 "is_configured": true, 00:17:46.635 "data_offset": 0, 00:17:46.635 "data_size": 65536 00:17:46.635 } 00:17:46.635 ] 00:17:46.635 }' 
00:17:46.635 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.635 06:34:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:47.202 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:47.461 [2024-07-25 06:34:00.815854] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:47.461 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:47.461 "name": "Existed_Raid", 00:17:47.461 "aliases": [ 00:17:47.461 "6177d3c7-1967-4624-a86a-cc0f6f3cd780" 00:17:47.461 ], 00:17:47.461 "product_name": "Raid Volume", 00:17:47.461 "block_size": 512, 00:17:47.461 "num_blocks": 196608, 00:17:47.461 "uuid": "6177d3c7-1967-4624-a86a-cc0f6f3cd780", 00:17:47.461 "assigned_rate_limits": { 00:17:47.461 "rw_ios_per_sec": 0, 00:17:47.461 "rw_mbytes_per_sec": 0, 00:17:47.461 "r_mbytes_per_sec": 0, 00:17:47.461 "w_mbytes_per_sec": 0 00:17:47.461 }, 00:17:47.461 "claimed": false, 00:17:47.461 "zoned": false, 00:17:47.461 "supported_io_types": { 00:17:47.461 "read": true, 00:17:47.461 "write": true, 00:17:47.461 "unmap": true, 00:17:47.461 "flush": true, 00:17:47.461 "reset": true, 00:17:47.461 "nvme_admin": false, 00:17:47.461 "nvme_io": false, 00:17:47.461 "nvme_io_md": false, 00:17:47.461 "write_zeroes": true, 00:17:47.461 "zcopy": false, 00:17:47.461 "get_zone_info": false, 00:17:47.461 "zone_management": false, 00:17:47.461 "zone_append": false, 00:17:47.461 "compare": false, 00:17:47.461 "compare_and_write": false, 00:17:47.461 "abort": false, 00:17:47.461 "seek_hole": false, 00:17:47.461 "seek_data": false, 00:17:47.461 "copy": false, 00:17:47.461 "nvme_iov_md": false 00:17:47.461 }, 00:17:47.461 "memory_domains": [ 00:17:47.461 { 00:17:47.461 "dma_device_id": "system", 00:17:47.461 "dma_device_type": 1 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.461 "dma_device_type": 2 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "dma_device_id": "system", 00:17:47.461 "dma_device_type": 1 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.461 "dma_device_type": 2 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "dma_device_id": "system", 00:17:47.461 "dma_device_type": 1 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.461 "dma_device_type": 2 00:17:47.461 } 00:17:47.461 ], 00:17:47.461 "driver_specific": { 00:17:47.461 "raid": { 00:17:47.461 "uuid": "6177d3c7-1967-4624-a86a-cc0f6f3cd780", 00:17:47.461 "strip_size_kb": 64, 00:17:47.461 
"state": "online", 00:17:47.461 "raid_level": "concat", 00:17:47.461 "superblock": false, 00:17:47.461 "num_base_bdevs": 3, 00:17:47.461 "num_base_bdevs_discovered": 3, 00:17:47.461 "num_base_bdevs_operational": 3, 00:17:47.461 "base_bdevs_list": [ 00:17:47.461 { 00:17:47.461 "name": "NewBaseBdev", 00:17:47.461 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:47.461 "is_configured": true, 00:17:47.461 "data_offset": 0, 00:17:47.461 "data_size": 65536 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "name": "BaseBdev2", 00:17:47.461 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:47.461 "is_configured": true, 00:17:47.461 "data_offset": 0, 00:17:47.461 "data_size": 65536 00:17:47.461 }, 00:17:47.461 { 00:17:47.461 "name": "BaseBdev3", 00:17:47.461 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:47.461 "is_configured": true, 00:17:47.461 "data_offset": 0, 00:17:47.461 "data_size": 65536 00:17:47.461 } 00:17:47.461 ] 00:17:47.461 } 00:17:47.461 } 00:17:47.462 }' 00:17:47.462 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:47.462 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:47.462 BaseBdev2 00:17:47.462 BaseBdev3' 00:17:47.462 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.462 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:47.462 06:34:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.720 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.720 "name": "NewBaseBdev", 00:17:47.720 "aliases": [ 00:17:47.720 "65f5f507-ce44-43d1-a89b-22998b0c259e" 00:17:47.720 ], 00:17:47.720 "product_name": "Malloc disk", 00:17:47.720 "block_size": 512, 00:17:47.720 "num_blocks": 65536, 00:17:47.720 "uuid": "65f5f507-ce44-43d1-a89b-22998b0c259e", 00:17:47.720 "assigned_rate_limits": { 00:17:47.720 "rw_ios_per_sec": 0, 00:17:47.720 "rw_mbytes_per_sec": 0, 00:17:47.720 "r_mbytes_per_sec": 0, 00:17:47.720 "w_mbytes_per_sec": 0 00:17:47.720 }, 00:17:47.720 "claimed": true, 00:17:47.720 "claim_type": "exclusive_write", 00:17:47.720 "zoned": false, 00:17:47.720 "supported_io_types": { 00:17:47.720 "read": true, 00:17:47.720 "write": true, 00:17:47.720 "unmap": true, 00:17:47.720 "flush": true, 00:17:47.720 "reset": true, 00:17:47.720 "nvme_admin": false, 00:17:47.720 "nvme_io": false, 00:17:47.720 "nvme_io_md": false, 00:17:47.720 "write_zeroes": true, 00:17:47.720 "zcopy": true, 00:17:47.720 "get_zone_info": false, 00:17:47.720 "zone_management": false, 00:17:47.720 "zone_append": false, 00:17:47.720 "compare": false, 00:17:47.720 "compare_and_write": false, 00:17:47.720 "abort": true, 00:17:47.720 "seek_hole": false, 00:17:47.720 "seek_data": false, 00:17:47.720 "copy": true, 00:17:47.720 "nvme_iov_md": false 00:17:47.720 }, 00:17:47.720 "memory_domains": [ 00:17:47.720 { 00:17:47.720 "dma_device_id": "system", 00:17:47.720 "dma_device_type": 1 00:17:47.720 }, 00:17:47.720 { 00:17:47.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.720 "dma_device_type": 2 00:17:47.720 } 00:17:47.720 ], 00:17:47.720 "driver_specific": {} 00:17:47.720 }' 00:17:47.720 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:17:47.720 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.720 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.720 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.720 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:47.978 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.237 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.237 "name": "BaseBdev2", 00:17:48.237 "aliases": [ 00:17:48.237 "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492" 00:17:48.237 ], 00:17:48.237 "product_name": "Malloc disk", 00:17:48.237 "block_size": 512, 00:17:48.237 "num_blocks": 65536, 00:17:48.237 "uuid": "5aeffd71-bbc1-4cb2-9a27-fbecdf2c7492", 00:17:48.237 "assigned_rate_limits": { 00:17:48.237 "rw_ios_per_sec": 0, 00:17:48.237 "rw_mbytes_per_sec": 0, 00:17:48.237 "r_mbytes_per_sec": 0, 00:17:48.237 "w_mbytes_per_sec": 0 00:17:48.237 }, 00:17:48.237 "claimed": true, 00:17:48.237 "claim_type": "exclusive_write", 00:17:48.237 "zoned": false, 00:17:48.237 "supported_io_types": { 00:17:48.237 "read": true, 00:17:48.237 "write": true, 00:17:48.237 "unmap": true, 00:17:48.237 "flush": true, 00:17:48.237 "reset": true, 00:17:48.237 "nvme_admin": false, 00:17:48.237 "nvme_io": false, 00:17:48.237 "nvme_io_md": false, 00:17:48.237 "write_zeroes": true, 00:17:48.237 "zcopy": true, 00:17:48.237 "get_zone_info": false, 00:17:48.237 "zone_management": false, 00:17:48.237 "zone_append": false, 00:17:48.237 "compare": false, 00:17:48.237 "compare_and_write": false, 00:17:48.237 "abort": true, 00:17:48.237 "seek_hole": false, 00:17:48.237 "seek_data": false, 00:17:48.237 "copy": true, 00:17:48.237 "nvme_iov_md": false 00:17:48.237 }, 00:17:48.237 "memory_domains": [ 00:17:48.237 { 00:17:48.237 "dma_device_id": "system", 00:17:48.237 "dma_device_type": 1 00:17:48.237 }, 00:17:48.237 { 00:17:48.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.237 "dma_device_type": 2 00:17:48.237 } 00:17:48.237 ], 00:17:48.237 "driver_specific": {} 00:17:48.237 }' 00:17:48.237 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.237 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.237 06:34:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.237 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:48.494 06:34:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.751 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.751 "name": "BaseBdev3", 00:17:48.751 "aliases": [ 00:17:48.751 "61fd2fb9-6810-43bf-9347-b98f775d5c9f" 00:17:48.751 ], 00:17:48.751 "product_name": "Malloc disk", 00:17:48.751 "block_size": 512, 00:17:48.751 "num_blocks": 65536, 00:17:48.751 "uuid": "61fd2fb9-6810-43bf-9347-b98f775d5c9f", 00:17:48.751 "assigned_rate_limits": { 00:17:48.751 "rw_ios_per_sec": 0, 00:17:48.751 "rw_mbytes_per_sec": 0, 00:17:48.751 "r_mbytes_per_sec": 0, 00:17:48.751 "w_mbytes_per_sec": 0 00:17:48.751 }, 00:17:48.751 "claimed": true, 00:17:48.751 "claim_type": "exclusive_write", 00:17:48.751 "zoned": false, 00:17:48.751 "supported_io_types": { 00:17:48.751 "read": true, 00:17:48.751 "write": true, 00:17:48.751 "unmap": true, 00:17:48.751 "flush": true, 00:17:48.751 "reset": true, 00:17:48.751 "nvme_admin": false, 00:17:48.751 "nvme_io": false, 00:17:48.751 "nvme_io_md": false, 00:17:48.751 "write_zeroes": true, 00:17:48.751 "zcopy": true, 00:17:48.751 "get_zone_info": false, 00:17:48.751 "zone_management": false, 00:17:48.751 "zone_append": false, 00:17:48.751 "compare": false, 00:17:48.751 "compare_and_write": false, 00:17:48.751 "abort": true, 00:17:48.751 "seek_hole": false, 00:17:48.751 "seek_data": false, 00:17:48.751 "copy": true, 00:17:48.751 "nvme_iov_md": false 00:17:48.751 }, 00:17:48.751 "memory_domains": [ 00:17:48.751 { 00:17:48.751 "dma_device_id": "system", 00:17:48.751 "dma_device_type": 1 00:17:48.751 }, 00:17:48.751 { 00:17:48.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.751 "dma_device_type": 2 00:17:48.751 } 00:17:48.751 ], 00:17:48.751 "driver_specific": {} 00:17:48.751 }' 00:17:48.751 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.751 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.751 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.752 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.009 06:34:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.010 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:49.268 [2024-07-25 06:34:02.768727] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:49.268 [2024-07-25 06:34:02.768752] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:49.268 [2024-07-25 06:34:02.768806] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:49.268 [2024-07-25 06:34:02.768855] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:49.268 [2024-07-25 06:34:02.768866] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28178c0 name Existed_Raid, state offline 00:17:49.268 06:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1134939 00:17:49.268 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1134939 ']' 00:17:49.268 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1134939 00:17:49.268 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:17:49.268 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:49.268 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1134939 00:17:49.526 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:49.526 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:49.527 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1134939' 00:17:49.527 killing process with pid 1134939 00:17:49.527 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1134939 00:17:49.527 [2024-07-25 06:34:02.847725] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:49.527 06:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1134939 00:17:49.527 [2024-07-25 06:34:02.872277] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:49.527 06:34:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:49.527 00:17:49.527 real 0m26.887s 00:17:49.527 user 0m49.286s 00:17:49.527 sys 0m4.875s 00:17:49.527 06:34:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:49.527 06:34:03 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.527 ************************************ 00:17:49.527 END TEST raid_state_function_test 00:17:49.527 ************************************ 00:17:49.785 06:34:03 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:17:49.785 06:34:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:49.786 06:34:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:49.786 06:34:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:49.786 ************************************ 00:17:49.786 START TEST raid_state_function_test_sb 00:17:49.786 ************************************ 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 true 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # 
strip_size_create_arg='-z 64' 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1140112 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1140112' 00:17:49.786 Process raid pid: 1140112 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1140112 /var/tmp/spdk-raid.sock 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1140112 ']' 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:49.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:49.786 06:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.786 [2024-07-25 06:34:03.196656] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
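For reference, the RPC flow the raid_state_function_test_sb harness drives here can be replayed by hand against the same bdev_svc instance. This is a minimal sketch, assuming bdev_svc is already listening on /var/tmp/spdk-raid.sock (as started by bdev_raid.sh@243 above) and re-using only commands that appear verbatim in this trace; the ordering is simplified, since the test registers the array first and lets the base bdevs be claimed as they are created, whereas here the malloc base bdevs are created before the array.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Three 32 MB malloc bdevs with 512-byte blocks (65536 blocks each),
# matching the "Malloc disk" base bdevs dumped earlier in the log.
for b in BaseBdev1 BaseBdev2 BaseBdev3; do
    $rpc -s $sock bdev_malloc_create 32 512 -b $b
done

# Concat array with a 64 KiB strip size and on-disk superblock, matching
# strip_size_create_arg='-z 64' and superblock_create_arg=-s above.
$rpc -s $sock bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# Same query verify_raid_bdev_state uses to check state, level and strip size.
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# Per-base-bdev property check, as in bdev_raid.sh@204-208 above.
$rpc -s $sock bdev_get_bdevs -b BaseBdev1 | jq '.[].block_size'

# Tear the array down again, as bdev_raid.sh@260/@338 do.
$rpc -s $sock bdev_raid_delete Existed_Raid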
00:17:49.786 [2024-07-25 06:34:03.196713] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.0 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.1 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.2 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.3 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.4 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.5 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.6 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:01.7 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.0 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.1 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.2 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.3 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.4 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.5 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.6 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3d:02.7 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.0 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.1 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.2 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.3 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.4 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.5 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.6 cannot be used 00:17:49.786 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:01.7 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.0 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.1 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.2 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.3 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.4 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.5 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.6 cannot be used 00:17:49.786 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:17:49.786 EAL: Requested device 0000:3f:02.7 cannot be used 00:17:49.786 [2024-07-25 06:34:03.333469] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:50.043 [2024-07-25 06:34:03.378371] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:50.043 [2024-07-25 06:34:03.444173] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.043 [2024-07-25 06:34:03.444198] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.608 06:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:50.608 06:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:17:50.608 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:50.866 [2024-07-25 06:34:04.303769] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:50.866 [2024-07-25 06:34:04.303805] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:50.866 [2024-07-25 06:34:04.303815] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:50.866 [2024-07-25 06:34:04.303825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:50.866 [2024-07-25 06:34:04.303833] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:50.866 [2024-07-25 06:34:04.303843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.866 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.124 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.124 "name": "Existed_Raid", 00:17:51.124 "uuid": "866a958f-dc02-4bb6-898c-058ab1145c63", 00:17:51.124 "strip_size_kb": 64, 00:17:51.124 "state": "configuring", 00:17:51.124 "raid_level": "concat", 00:17:51.124 "superblock": true, 00:17:51.124 "num_base_bdevs": 3, 00:17:51.124 "num_base_bdevs_discovered": 0, 00:17:51.124 "num_base_bdevs_operational": 3, 00:17:51.124 "base_bdevs_list": [ 00:17:51.124 { 00:17:51.124 "name": "BaseBdev1", 00:17:51.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.124 "is_configured": false, 00:17:51.124 "data_offset": 0, 00:17:51.124 "data_size": 0 00:17:51.125 }, 00:17:51.125 { 00:17:51.125 "name": "BaseBdev2", 00:17:51.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.125 "is_configured": false, 00:17:51.125 "data_offset": 0, 00:17:51.125 "data_size": 0 00:17:51.125 }, 00:17:51.125 { 00:17:51.125 "name": "BaseBdev3", 00:17:51.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.125 "is_configured": false, 00:17:51.125 "data_offset": 0, 00:17:51.125 "data_size": 0 00:17:51.125 } 00:17:51.125 ] 00:17:51.125 }' 00:17:51.125 06:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.125 06:34:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:51.702 06:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:51.959 [2024-07-25 06:34:05.282219] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:51.959 [2024-07-25 06:34:05.282243] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabd470 name Existed_Raid, state configuring 00:17:51.959 06:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:51.959 [2024-07-25 06:34:05.506835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:51.959 [2024-07-25 06:34:05.506862] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:51.959 [2024-07-25 06:34:05.506870] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:51.959 [2024-07-25 06:34:05.506880] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:51.959 [2024-07-25 06:34:05.506888] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:51.959 [2024-07-25 06:34:05.506898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:52.217 [2024-07-25 06:34:05.740886] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:52.217 BaseBdev1 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:52.217 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.474 06:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:52.733 [ 00:17:52.733 { 00:17:52.733 "name": "BaseBdev1", 00:17:52.733 "aliases": [ 00:17:52.733 "fc548c5e-24d3-4572-b213-133a68d68c39" 00:17:52.733 ], 00:17:52.733 "product_name": "Malloc disk", 00:17:52.733 "block_size": 512, 00:17:52.733 "num_blocks": 65536, 00:17:52.733 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:52.733 "assigned_rate_limits": { 00:17:52.733 "rw_ios_per_sec": 0, 00:17:52.733 "rw_mbytes_per_sec": 0, 00:17:52.733 "r_mbytes_per_sec": 0, 00:17:52.733 "w_mbytes_per_sec": 0 00:17:52.733 }, 00:17:52.733 "claimed": true, 00:17:52.733 "claim_type": "exclusive_write", 00:17:52.733 "zoned": false, 00:17:52.733 "supported_io_types": { 00:17:52.733 "read": true, 00:17:52.733 "write": true, 00:17:52.733 "unmap": true, 00:17:52.733 "flush": true, 00:17:52.733 "reset": true, 00:17:52.733 "nvme_admin": false, 00:17:52.733 "nvme_io": false, 00:17:52.733 "nvme_io_md": false, 00:17:52.733 "write_zeroes": true, 00:17:52.733 "zcopy": true, 00:17:52.733 "get_zone_info": false, 00:17:52.733 "zone_management": false, 00:17:52.733 "zone_append": false, 00:17:52.733 "compare": false, 00:17:52.733 "compare_and_write": false, 00:17:52.733 "abort": true, 00:17:52.733 "seek_hole": false, 00:17:52.733 "seek_data": false, 00:17:52.733 "copy": true, 00:17:52.733 "nvme_iov_md": false 00:17:52.733 }, 00:17:52.733 "memory_domains": [ 00:17:52.733 { 00:17:52.733 "dma_device_id": "system", 00:17:52.733 "dma_device_type": 1 00:17:52.733 }, 00:17:52.733 { 00:17:52.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.733 "dma_device_type": 2 00:17:52.733 } 00:17:52.733 ], 00:17:52.733 "driver_specific": {} 00:17:52.733 } 00:17:52.733 ] 00:17:52.733 06:34:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.733 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.990 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.990 "name": "Existed_Raid", 00:17:52.990 "uuid": "8e76e881-55a8-4f3a-871b-68827955323c", 00:17:52.990 "strip_size_kb": 64, 00:17:52.990 "state": "configuring", 00:17:52.990 "raid_level": "concat", 00:17:52.990 "superblock": true, 00:17:52.990 "num_base_bdevs": 3, 00:17:52.990 "num_base_bdevs_discovered": 1, 00:17:52.990 "num_base_bdevs_operational": 3, 00:17:52.990 "base_bdevs_list": [ 00:17:52.990 { 00:17:52.990 "name": "BaseBdev1", 00:17:52.990 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:52.990 "is_configured": true, 00:17:52.990 "data_offset": 2048, 00:17:52.990 "data_size": 63488 00:17:52.990 }, 00:17:52.990 { 00:17:52.990 "name": "BaseBdev2", 00:17:52.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.990 "is_configured": false, 00:17:52.990 "data_offset": 0, 00:17:52.990 "data_size": 0 00:17:52.990 }, 00:17:52.990 { 00:17:52.990 "name": "BaseBdev3", 00:17:52.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.990 "is_configured": false, 00:17:52.990 "data_offset": 0, 00:17:52.990 "data_size": 0 00:17:52.990 } 00:17:52.990 ] 00:17:52.990 }' 00:17:52.990 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.990 06:34:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:53.556 06:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:53.814 [2024-07-25 06:34:07.164610] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:53.814 [2024-07-25 06:34:07.164646] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabcce0 name Existed_Raid, state configuring 00:17:53.814 06:34:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:54.072 [2024-07-25 06:34:07.389248] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:54.072 [2024-07-25 06:34:07.390638] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:54.072 [2024-07-25 06:34:07.390670] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:54.072 [2024-07-25 06:34:07.390679] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:54.072 [2024-07-25 06:34:07.390689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:54.072 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:54.072 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:54.072 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:54.072 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.072 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.073 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.331 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.331 "name": "Existed_Raid", 00:17:54.331 "uuid": "c20eae1a-745a-4a7c-848f-9da2f68098df", 00:17:54.331 "strip_size_kb": 64, 00:17:54.331 "state": "configuring", 00:17:54.331 "raid_level": "concat", 00:17:54.331 "superblock": true, 00:17:54.331 "num_base_bdevs": 3, 00:17:54.331 "num_base_bdevs_discovered": 1, 00:17:54.331 "num_base_bdevs_operational": 3, 00:17:54.331 "base_bdevs_list": [ 00:17:54.331 { 00:17:54.331 "name": "BaseBdev1", 00:17:54.331 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:54.331 "is_configured": true, 00:17:54.331 "data_offset": 2048, 00:17:54.331 "data_size": 63488 00:17:54.331 }, 00:17:54.331 { 00:17:54.331 "name": "BaseBdev2", 00:17:54.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.331 "is_configured": false, 00:17:54.331 "data_offset": 0, 
00:17:54.331 "data_size": 0 00:17:54.331 }, 00:17:54.331 { 00:17:54.331 "name": "BaseBdev3", 00:17:54.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.331 "is_configured": false, 00:17:54.331 "data_offset": 0, 00:17:54.331 "data_size": 0 00:17:54.331 } 00:17:54.331 ] 00:17:54.331 }' 00:17:54.331 06:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.331 06:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:54.898 [2024-07-25 06:34:08.374893] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:54.898 BaseBdev2 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:54.898 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.157 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:55.157 [ 00:17:55.157 { 00:17:55.157 "name": "BaseBdev2", 00:17:55.157 "aliases": [ 00:17:55.157 "89f04fab-19f8-4e36-a0fe-eef5f81126de" 00:17:55.157 ], 00:17:55.157 "product_name": "Malloc disk", 00:17:55.157 "block_size": 512, 00:17:55.157 "num_blocks": 65536, 00:17:55.157 "uuid": "89f04fab-19f8-4e36-a0fe-eef5f81126de", 00:17:55.157 "assigned_rate_limits": { 00:17:55.157 "rw_ios_per_sec": 0, 00:17:55.157 "rw_mbytes_per_sec": 0, 00:17:55.157 "r_mbytes_per_sec": 0, 00:17:55.157 "w_mbytes_per_sec": 0 00:17:55.157 }, 00:17:55.157 "claimed": true, 00:17:55.157 "claim_type": "exclusive_write", 00:17:55.157 "zoned": false, 00:17:55.157 "supported_io_types": { 00:17:55.157 "read": true, 00:17:55.157 "write": true, 00:17:55.157 "unmap": true, 00:17:55.157 "flush": true, 00:17:55.157 "reset": true, 00:17:55.157 "nvme_admin": false, 00:17:55.157 "nvme_io": false, 00:17:55.157 "nvme_io_md": false, 00:17:55.157 "write_zeroes": true, 00:17:55.157 "zcopy": true, 00:17:55.157 "get_zone_info": false, 00:17:55.157 "zone_management": false, 00:17:55.157 "zone_append": false, 00:17:55.157 "compare": false, 00:17:55.157 "compare_and_write": false, 00:17:55.157 "abort": true, 00:17:55.157 "seek_hole": false, 00:17:55.157 "seek_data": false, 00:17:55.157 "copy": true, 00:17:55.157 "nvme_iov_md": false 00:17:55.157 }, 00:17:55.157 "memory_domains": [ 00:17:55.157 { 00:17:55.157 "dma_device_id": "system", 00:17:55.157 "dma_device_type": 1 00:17:55.157 }, 00:17:55.157 { 00:17:55.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:55.157 "dma_device_type": 
2 00:17:55.157 } 00:17:55.157 ], 00:17:55.157 "driver_specific": {} 00:17:55.157 } 00:17:55.157 ] 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.435 "name": "Existed_Raid", 00:17:55.435 "uuid": "c20eae1a-745a-4a7c-848f-9da2f68098df", 00:17:55.435 "strip_size_kb": 64, 00:17:55.435 "state": "configuring", 00:17:55.435 "raid_level": "concat", 00:17:55.435 "superblock": true, 00:17:55.435 "num_base_bdevs": 3, 00:17:55.435 "num_base_bdevs_discovered": 2, 00:17:55.435 "num_base_bdevs_operational": 3, 00:17:55.435 "base_bdevs_list": [ 00:17:55.435 { 00:17:55.435 "name": "BaseBdev1", 00:17:55.435 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:55.435 "is_configured": true, 00:17:55.435 "data_offset": 2048, 00:17:55.435 "data_size": 63488 00:17:55.435 }, 00:17:55.435 { 00:17:55.435 "name": "BaseBdev2", 00:17:55.435 "uuid": "89f04fab-19f8-4e36-a0fe-eef5f81126de", 00:17:55.435 "is_configured": true, 00:17:55.435 "data_offset": 2048, 00:17:55.435 "data_size": 63488 00:17:55.435 }, 00:17:55.435 { 00:17:55.435 "name": "BaseBdev3", 00:17:55.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.435 "is_configured": false, 00:17:55.435 "data_offset": 0, 00:17:55.435 "data_size": 0 00:17:55.435 } 00:17:55.435 ] 00:17:55.435 }' 00:17:55.435 06:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.436 06:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.020 06:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev3 00:17:56.280 [2024-07-25 06:34:09.709603] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:56.280 [2024-07-25 06:34:09.709741] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc70380 00:17:56.280 [2024-07-25 06:34:09.709753] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:56.280 [2024-07-25 06:34:09.709911] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc67550 00:17:56.280 [2024-07-25 06:34:09.710021] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc70380 00:17:56.280 [2024-07-25 06:34:09.710030] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc70380 00:17:56.280 [2024-07-25 06:34:09.710112] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:56.280 BaseBdev3 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:56.280 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:56.539 06:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:56.799 [ 00:17:56.799 { 00:17:56.799 "name": "BaseBdev3", 00:17:56.799 "aliases": [ 00:17:56.799 "b52e1b46-0e86-454c-b18d-55b98cf5e2ee" 00:17:56.799 ], 00:17:56.799 "product_name": "Malloc disk", 00:17:56.799 "block_size": 512, 00:17:56.799 "num_blocks": 65536, 00:17:56.799 "uuid": "b52e1b46-0e86-454c-b18d-55b98cf5e2ee", 00:17:56.799 "assigned_rate_limits": { 00:17:56.799 "rw_ios_per_sec": 0, 00:17:56.799 "rw_mbytes_per_sec": 0, 00:17:56.799 "r_mbytes_per_sec": 0, 00:17:56.799 "w_mbytes_per_sec": 0 00:17:56.799 }, 00:17:56.799 "claimed": true, 00:17:56.799 "claim_type": "exclusive_write", 00:17:56.799 "zoned": false, 00:17:56.799 "supported_io_types": { 00:17:56.799 "read": true, 00:17:56.799 "write": true, 00:17:56.799 "unmap": true, 00:17:56.799 "flush": true, 00:17:56.799 "reset": true, 00:17:56.799 "nvme_admin": false, 00:17:56.799 "nvme_io": false, 00:17:56.799 "nvme_io_md": false, 00:17:56.799 "write_zeroes": true, 00:17:56.799 "zcopy": true, 00:17:56.799 "get_zone_info": false, 00:17:56.799 "zone_management": false, 00:17:56.799 "zone_append": false, 00:17:56.799 "compare": false, 00:17:56.799 "compare_and_write": false, 00:17:56.799 "abort": true, 00:17:56.799 "seek_hole": false, 00:17:56.799 "seek_data": false, 00:17:56.799 "copy": true, 00:17:56.799 "nvme_iov_md": false 00:17:56.799 }, 00:17:56.799 "memory_domains": [ 00:17:56.799 { 00:17:56.799 "dma_device_id": "system", 00:17:56.799 "dma_device_type": 1 00:17:56.799 }, 00:17:56.799 { 00:17:56.799 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.799 "dma_device_type": 2 00:17:56.799 } 00:17:56.799 ], 00:17:56.799 "driver_specific": {} 00:17:56.799 } 00:17:56.799 ] 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.799 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.058 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.058 "name": "Existed_Raid", 00:17:57.058 "uuid": "c20eae1a-745a-4a7c-848f-9da2f68098df", 00:17:57.058 "strip_size_kb": 64, 00:17:57.058 "state": "online", 00:17:57.059 "raid_level": "concat", 00:17:57.059 "superblock": true, 00:17:57.059 "num_base_bdevs": 3, 00:17:57.059 "num_base_bdevs_discovered": 3, 00:17:57.059 "num_base_bdevs_operational": 3, 00:17:57.059 "base_bdevs_list": [ 00:17:57.059 { 00:17:57.059 "name": "BaseBdev1", 00:17:57.059 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:57.059 "is_configured": true, 00:17:57.059 "data_offset": 2048, 00:17:57.059 "data_size": 63488 00:17:57.059 }, 00:17:57.059 { 00:17:57.059 "name": "BaseBdev2", 00:17:57.059 "uuid": "89f04fab-19f8-4e36-a0fe-eef5f81126de", 00:17:57.059 "is_configured": true, 00:17:57.059 "data_offset": 2048, 00:17:57.059 "data_size": 63488 00:17:57.059 }, 00:17:57.059 { 00:17:57.059 "name": "BaseBdev3", 00:17:57.059 "uuid": "b52e1b46-0e86-454c-b18d-55b98cf5e2ee", 00:17:57.059 "is_configured": true, 00:17:57.059 "data_offset": 2048, 00:17:57.059 "data_size": 63488 00:17:57.059 } 00:17:57.059 ] 00:17:57.059 }' 00:17:57.059 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.059 06:34:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:57.627 06:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:57.627 [2024-07-25 06:34:11.141668] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:57.627 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:57.627 "name": "Existed_Raid", 00:17:57.627 "aliases": [ 00:17:57.627 "c20eae1a-745a-4a7c-848f-9da2f68098df" 00:17:57.627 ], 00:17:57.627 "product_name": "Raid Volume", 00:17:57.627 "block_size": 512, 00:17:57.627 "num_blocks": 190464, 00:17:57.627 "uuid": "c20eae1a-745a-4a7c-848f-9da2f68098df", 00:17:57.627 "assigned_rate_limits": { 00:17:57.627 "rw_ios_per_sec": 0, 00:17:57.627 "rw_mbytes_per_sec": 0, 00:17:57.627 "r_mbytes_per_sec": 0, 00:17:57.627 "w_mbytes_per_sec": 0 00:17:57.627 }, 00:17:57.627 "claimed": false, 00:17:57.627 "zoned": false, 00:17:57.627 "supported_io_types": { 00:17:57.627 "read": true, 00:17:57.627 "write": true, 00:17:57.627 "unmap": true, 00:17:57.627 "flush": true, 00:17:57.627 "reset": true, 00:17:57.627 "nvme_admin": false, 00:17:57.627 "nvme_io": false, 00:17:57.627 "nvme_io_md": false, 00:17:57.627 "write_zeroes": true, 00:17:57.627 "zcopy": false, 00:17:57.627 "get_zone_info": false, 00:17:57.627 "zone_management": false, 00:17:57.627 "zone_append": false, 00:17:57.627 "compare": false, 00:17:57.627 "compare_and_write": false, 00:17:57.627 "abort": false, 00:17:57.627 "seek_hole": false, 00:17:57.627 "seek_data": false, 00:17:57.627 "copy": false, 00:17:57.627 "nvme_iov_md": false 00:17:57.627 }, 00:17:57.627 "memory_domains": [ 00:17:57.627 { 00:17:57.627 "dma_device_id": "system", 00:17:57.627 "dma_device_type": 1 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.627 "dma_device_type": 2 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "dma_device_id": "system", 00:17:57.627 "dma_device_type": 1 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.627 "dma_device_type": 2 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "dma_device_id": "system", 00:17:57.627 "dma_device_type": 1 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.627 "dma_device_type": 2 00:17:57.627 } 00:17:57.627 ], 00:17:57.627 "driver_specific": { 00:17:57.627 "raid": { 00:17:57.627 "uuid": "c20eae1a-745a-4a7c-848f-9da2f68098df", 00:17:57.627 "strip_size_kb": 64, 00:17:57.627 "state": "online", 00:17:57.627 "raid_level": "concat", 00:17:57.627 "superblock": true, 00:17:57.627 "num_base_bdevs": 3, 00:17:57.627 "num_base_bdevs_discovered": 3, 00:17:57.627 "num_base_bdevs_operational": 3, 00:17:57.627 "base_bdevs_list": [ 00:17:57.627 { 
00:17:57.627 "name": "BaseBdev1", 00:17:57.627 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:57.627 "is_configured": true, 00:17:57.627 "data_offset": 2048, 00:17:57.627 "data_size": 63488 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "name": "BaseBdev2", 00:17:57.627 "uuid": "89f04fab-19f8-4e36-a0fe-eef5f81126de", 00:17:57.627 "is_configured": true, 00:17:57.627 "data_offset": 2048, 00:17:57.627 "data_size": 63488 00:17:57.627 }, 00:17:57.627 { 00:17:57.627 "name": "BaseBdev3", 00:17:57.627 "uuid": "b52e1b46-0e86-454c-b18d-55b98cf5e2ee", 00:17:57.627 "is_configured": true, 00:17:57.627 "data_offset": 2048, 00:17:57.627 "data_size": 63488 00:17:57.627 } 00:17:57.627 ] 00:17:57.627 } 00:17:57.627 } 00:17:57.627 }' 00:17:57.627 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:57.887 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:57.887 BaseBdev2 00:17:57.887 BaseBdev3' 00:17:57.887 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.887 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:57.887 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.887 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.887 "name": "BaseBdev1", 00:17:57.887 "aliases": [ 00:17:57.887 "fc548c5e-24d3-4572-b213-133a68d68c39" 00:17:57.887 ], 00:17:57.887 "product_name": "Malloc disk", 00:17:57.887 "block_size": 512, 00:17:57.887 "num_blocks": 65536, 00:17:57.887 "uuid": "fc548c5e-24d3-4572-b213-133a68d68c39", 00:17:57.887 "assigned_rate_limits": { 00:17:57.887 "rw_ios_per_sec": 0, 00:17:57.887 "rw_mbytes_per_sec": 0, 00:17:57.887 "r_mbytes_per_sec": 0, 00:17:57.887 "w_mbytes_per_sec": 0 00:17:57.887 }, 00:17:57.887 "claimed": true, 00:17:57.887 "claim_type": "exclusive_write", 00:17:57.887 "zoned": false, 00:17:57.887 "supported_io_types": { 00:17:57.887 "read": true, 00:17:57.887 "write": true, 00:17:57.887 "unmap": true, 00:17:57.887 "flush": true, 00:17:57.887 "reset": true, 00:17:57.887 "nvme_admin": false, 00:17:57.887 "nvme_io": false, 00:17:57.887 "nvme_io_md": false, 00:17:57.887 "write_zeroes": true, 00:17:57.887 "zcopy": true, 00:17:57.887 "get_zone_info": false, 00:17:57.887 "zone_management": false, 00:17:57.887 "zone_append": false, 00:17:57.887 "compare": false, 00:17:57.887 "compare_and_write": false, 00:17:57.887 "abort": true, 00:17:57.887 "seek_hole": false, 00:17:57.887 "seek_data": false, 00:17:57.887 "copy": true, 00:17:57.887 "nvme_iov_md": false 00:17:57.887 }, 00:17:57.887 "memory_domains": [ 00:17:57.887 { 00:17:57.887 "dma_device_id": "system", 00:17:57.887 "dma_device_type": 1 00:17:57.887 }, 00:17:57.887 { 00:17:57.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.887 "dma_device_type": 2 00:17:57.887 } 00:17:57.887 ], 00:17:57.887 "driver_specific": {} 00:17:57.887 }' 00:17:57.887 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.146 
06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:58.146 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.406 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.406 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.406 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.406 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:58.406 06:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.665 "name": "BaseBdev2", 00:17:58.665 "aliases": [ 00:17:58.665 "89f04fab-19f8-4e36-a0fe-eef5f81126de" 00:17:58.665 ], 00:17:58.665 "product_name": "Malloc disk", 00:17:58.665 "block_size": 512, 00:17:58.665 "num_blocks": 65536, 00:17:58.665 "uuid": "89f04fab-19f8-4e36-a0fe-eef5f81126de", 00:17:58.665 "assigned_rate_limits": { 00:17:58.665 "rw_ios_per_sec": 0, 00:17:58.665 "rw_mbytes_per_sec": 0, 00:17:58.665 "r_mbytes_per_sec": 0, 00:17:58.665 "w_mbytes_per_sec": 0 00:17:58.665 }, 00:17:58.665 "claimed": true, 00:17:58.665 "claim_type": "exclusive_write", 00:17:58.665 "zoned": false, 00:17:58.665 "supported_io_types": { 00:17:58.665 "read": true, 00:17:58.665 "write": true, 00:17:58.665 "unmap": true, 00:17:58.665 "flush": true, 00:17:58.665 "reset": true, 00:17:58.665 "nvme_admin": false, 00:17:58.665 "nvme_io": false, 00:17:58.665 "nvme_io_md": false, 00:17:58.665 "write_zeroes": true, 00:17:58.665 "zcopy": true, 00:17:58.665 "get_zone_info": false, 00:17:58.665 "zone_management": false, 00:17:58.665 "zone_append": false, 00:17:58.665 "compare": false, 00:17:58.665 "compare_and_write": false, 00:17:58.665 "abort": true, 00:17:58.665 "seek_hole": false, 00:17:58.665 "seek_data": false, 00:17:58.665 "copy": true, 00:17:58.665 "nvme_iov_md": false 00:17:58.665 }, 00:17:58.665 "memory_domains": [ 00:17:58.665 { 00:17:58.665 "dma_device_id": "system", 00:17:58.665 "dma_device_type": 1 00:17:58.665 }, 00:17:58.665 { 00:17:58.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.665 "dma_device_type": 2 00:17:58.665 } 00:17:58.665 ], 00:17:58.665 "driver_specific": {} 00:17:58.665 }' 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.665 06:34:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.665 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:58.923 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.182 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.182 "name": "BaseBdev3", 00:17:59.182 "aliases": [ 00:17:59.182 "b52e1b46-0e86-454c-b18d-55b98cf5e2ee" 00:17:59.183 ], 00:17:59.183 "product_name": "Malloc disk", 00:17:59.183 "block_size": 512, 00:17:59.183 "num_blocks": 65536, 00:17:59.183 "uuid": "b52e1b46-0e86-454c-b18d-55b98cf5e2ee", 00:17:59.183 "assigned_rate_limits": { 00:17:59.183 "rw_ios_per_sec": 0, 00:17:59.183 "rw_mbytes_per_sec": 0, 00:17:59.183 "r_mbytes_per_sec": 0, 00:17:59.183 "w_mbytes_per_sec": 0 00:17:59.183 }, 00:17:59.183 "claimed": true, 00:17:59.183 "claim_type": "exclusive_write", 00:17:59.183 "zoned": false, 00:17:59.183 "supported_io_types": { 00:17:59.183 "read": true, 00:17:59.183 "write": true, 00:17:59.183 "unmap": true, 00:17:59.183 "flush": true, 00:17:59.183 "reset": true, 00:17:59.183 "nvme_admin": false, 00:17:59.183 "nvme_io": false, 00:17:59.183 "nvme_io_md": false, 00:17:59.183 "write_zeroes": true, 00:17:59.183 "zcopy": true, 00:17:59.183 "get_zone_info": false, 00:17:59.183 "zone_management": false, 00:17:59.183 "zone_append": false, 00:17:59.183 "compare": false, 00:17:59.183 "compare_and_write": false, 00:17:59.183 "abort": true, 00:17:59.183 "seek_hole": false, 00:17:59.183 "seek_data": false, 00:17:59.183 "copy": true, 00:17:59.183 "nvme_iov_md": false 00:17:59.183 }, 00:17:59.183 "memory_domains": [ 00:17:59.183 { 00:17:59.183 "dma_device_id": "system", 00:17:59.183 "dma_device_type": 1 00:17:59.183 }, 00:17:59.183 { 00:17:59.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.183 "dma_device_type": 2 00:17:59.183 } 00:17:59.183 ], 00:17:59.183 "driver_specific": {} 00:17:59.183 }' 00:17:59.183 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.183 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.183 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:59.183 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.183 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.441 06:34:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.441 06:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:59.701 [2024-07-25 06:34:13.114653] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:59.701 [2024-07-25 06:34:13.114676] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:59.701 [2024-07-25 06:34:13.114714] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.701 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.958 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.958 "name": "Existed_Raid", 00:17:59.958 "uuid": "c20eae1a-745a-4a7c-848f-9da2f68098df", 
00:17:59.958 "strip_size_kb": 64, 00:17:59.958 "state": "offline", 00:17:59.959 "raid_level": "concat", 00:17:59.959 "superblock": true, 00:17:59.959 "num_base_bdevs": 3, 00:17:59.959 "num_base_bdevs_discovered": 2, 00:17:59.959 "num_base_bdevs_operational": 2, 00:17:59.959 "base_bdevs_list": [ 00:17:59.959 { 00:17:59.959 "name": null, 00:17:59.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.959 "is_configured": false, 00:17:59.959 "data_offset": 2048, 00:17:59.959 "data_size": 63488 00:17:59.959 }, 00:17:59.959 { 00:17:59.959 "name": "BaseBdev2", 00:17:59.959 "uuid": "89f04fab-19f8-4e36-a0fe-eef5f81126de", 00:17:59.959 "is_configured": true, 00:17:59.959 "data_offset": 2048, 00:17:59.959 "data_size": 63488 00:17:59.959 }, 00:17:59.959 { 00:17:59.959 "name": "BaseBdev3", 00:17:59.959 "uuid": "b52e1b46-0e86-454c-b18d-55b98cf5e2ee", 00:17:59.959 "is_configured": true, 00:17:59.959 "data_offset": 2048, 00:17:59.959 "data_size": 63488 00:17:59.959 } 00:17:59.959 ] 00:17:59.959 }' 00:17:59.959 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.959 06:34:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:00.523 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:00.523 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:00.523 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.523 06:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:00.780 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:00.780 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:00.780 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:01.037 [2024-07-25 06:34:14.383096] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:01.037 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:01.037 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:01.037 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.037 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:01.295 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:01.295 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:01.295 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:01.552 [2024-07-25 06:34:14.854645] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:01.552 [2024-07-25 06:34:14.854684] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc70380 name Existed_Raid, state offline 00:18:01.552 
06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:01.552 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:01.552 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.552 06:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:01.809 BaseBdev2 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:01.809 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.066 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:02.323 [ 00:18:02.323 { 00:18:02.323 "name": "BaseBdev2", 00:18:02.323 "aliases": [ 00:18:02.323 "e4709eaf-6662-4b5e-9ec9-8ee05a94891c" 00:18:02.323 ], 00:18:02.323 "product_name": "Malloc disk", 00:18:02.323 "block_size": 512, 00:18:02.323 "num_blocks": 65536, 00:18:02.323 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:02.323 "assigned_rate_limits": { 00:18:02.323 "rw_ios_per_sec": 0, 00:18:02.323 "rw_mbytes_per_sec": 0, 00:18:02.323 "r_mbytes_per_sec": 0, 00:18:02.323 "w_mbytes_per_sec": 0 00:18:02.323 }, 00:18:02.323 "claimed": false, 00:18:02.323 "zoned": false, 00:18:02.323 "supported_io_types": { 00:18:02.323 "read": true, 00:18:02.323 "write": true, 00:18:02.323 "unmap": true, 00:18:02.323 "flush": true, 00:18:02.323 "reset": true, 00:18:02.323 "nvme_admin": false, 00:18:02.323 "nvme_io": false, 00:18:02.323 "nvme_io_md": false, 00:18:02.323 "write_zeroes": true, 00:18:02.323 "zcopy": true, 00:18:02.323 "get_zone_info": false, 00:18:02.323 "zone_management": false, 00:18:02.323 "zone_append": false, 00:18:02.323 "compare": false, 00:18:02.323 "compare_and_write": false, 00:18:02.323 "abort": true, 
00:18:02.323 "seek_hole": false, 00:18:02.323 "seek_data": false, 00:18:02.323 "copy": true, 00:18:02.323 "nvme_iov_md": false 00:18:02.323 }, 00:18:02.323 "memory_domains": [ 00:18:02.323 { 00:18:02.323 "dma_device_id": "system", 00:18:02.323 "dma_device_type": 1 00:18:02.323 }, 00:18:02.323 { 00:18:02.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.323 "dma_device_type": 2 00:18:02.323 } 00:18:02.323 ], 00:18:02.323 "driver_specific": {} 00:18:02.323 } 00:18:02.323 ] 00:18:02.323 06:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:02.323 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:02.323 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:02.323 06:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:02.580 BaseBdev3 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:02.580 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.837 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:03.094 [ 00:18:03.094 { 00:18:03.094 "name": "BaseBdev3", 00:18:03.094 "aliases": [ 00:18:03.094 "39e538f8-eb80-4fc3-a061-2712f795006c" 00:18:03.094 ], 00:18:03.094 "product_name": "Malloc disk", 00:18:03.094 "block_size": 512, 00:18:03.094 "num_blocks": 65536, 00:18:03.094 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:03.094 "assigned_rate_limits": { 00:18:03.094 "rw_ios_per_sec": 0, 00:18:03.094 "rw_mbytes_per_sec": 0, 00:18:03.094 "r_mbytes_per_sec": 0, 00:18:03.094 "w_mbytes_per_sec": 0 00:18:03.094 }, 00:18:03.094 "claimed": false, 00:18:03.094 "zoned": false, 00:18:03.094 "supported_io_types": { 00:18:03.094 "read": true, 00:18:03.094 "write": true, 00:18:03.094 "unmap": true, 00:18:03.094 "flush": true, 00:18:03.094 "reset": true, 00:18:03.094 "nvme_admin": false, 00:18:03.094 "nvme_io": false, 00:18:03.094 "nvme_io_md": false, 00:18:03.094 "write_zeroes": true, 00:18:03.094 "zcopy": true, 00:18:03.094 "get_zone_info": false, 00:18:03.094 "zone_management": false, 00:18:03.094 "zone_append": false, 00:18:03.094 "compare": false, 00:18:03.094 "compare_and_write": false, 00:18:03.094 "abort": true, 00:18:03.094 "seek_hole": false, 00:18:03.094 "seek_data": false, 00:18:03.094 "copy": true, 00:18:03.094 "nvme_iov_md": false 00:18:03.094 }, 00:18:03.094 "memory_domains": [ 00:18:03.094 { 00:18:03.094 "dma_device_id": "system", 00:18:03.094 
"dma_device_type": 1 00:18:03.094 }, 00:18:03.094 { 00:18:03.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.094 "dma_device_type": 2 00:18:03.094 } 00:18:03.094 ], 00:18:03.094 "driver_specific": {} 00:18:03.094 } 00:18:03.094 ] 00:18:03.094 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:03.094 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:03.094 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:03.094 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:03.351 [2024-07-25 06:34:16.661782] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:03.351 [2024-07-25 06:34:16.661818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:03.351 [2024-07-25 06:34:16.661834] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:03.351 [2024-07-25 06:34:16.662949] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:03.351 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.352 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.609 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.609 "name": "Existed_Raid", 00:18:03.609 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:03.609 "strip_size_kb": 64, 00:18:03.609 "state": "configuring", 00:18:03.609 "raid_level": "concat", 00:18:03.609 "superblock": true, 00:18:03.609 "num_base_bdevs": 3, 00:18:03.609 "num_base_bdevs_discovered": 2, 00:18:03.609 "num_base_bdevs_operational": 3, 00:18:03.609 "base_bdevs_list": [ 00:18:03.609 { 00:18:03.609 "name": "BaseBdev1", 00:18:03.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.609 "is_configured": false, 00:18:03.609 "data_offset": 0, 
00:18:03.609 "data_size": 0 00:18:03.609 }, 00:18:03.609 { 00:18:03.609 "name": "BaseBdev2", 00:18:03.609 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:03.609 "is_configured": true, 00:18:03.609 "data_offset": 2048, 00:18:03.609 "data_size": 63488 00:18:03.609 }, 00:18:03.609 { 00:18:03.609 "name": "BaseBdev3", 00:18:03.609 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:03.609 "is_configured": true, 00:18:03.609 "data_offset": 2048, 00:18:03.609 "data_size": 63488 00:18:03.609 } 00:18:03.609 ] 00:18:03.609 }' 00:18:03.609 06:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.609 06:34:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.176 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:04.433 [2024-07-25 06:34:17.756635] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:04.433 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.434 06:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.691 06:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.691 "name": "Existed_Raid", 00:18:04.691 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:04.691 "strip_size_kb": 64, 00:18:04.691 "state": "configuring", 00:18:04.691 "raid_level": "concat", 00:18:04.691 "superblock": true, 00:18:04.691 "num_base_bdevs": 3, 00:18:04.691 "num_base_bdevs_discovered": 1, 00:18:04.691 "num_base_bdevs_operational": 3, 00:18:04.691 "base_bdevs_list": [ 00:18:04.691 { 00:18:04.691 "name": "BaseBdev1", 00:18:04.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.691 "is_configured": false, 00:18:04.691 "data_offset": 0, 00:18:04.691 "data_size": 0 00:18:04.691 }, 00:18:04.691 { 00:18:04.691 "name": null, 00:18:04.691 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:04.691 "is_configured": false, 00:18:04.691 "data_offset": 2048, 00:18:04.691 "data_size": 63488 00:18:04.691 }, 
00:18:04.691 { 00:18:04.691 "name": "BaseBdev3", 00:18:04.691 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:04.691 "is_configured": true, 00:18:04.691 "data_offset": 2048, 00:18:04.691 "data_size": 63488 00:18:04.691 } 00:18:04.691 ] 00:18:04.691 }' 00:18:04.691 06:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.691 06:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:05.257 06:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.257 06:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:05.257 06:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:05.257 06:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:05.514 [2024-07-25 06:34:19.011210] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:05.514 BaseBdev1 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:05.514 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:05.772 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:06.030 [ 00:18:06.030 { 00:18:06.030 "name": "BaseBdev1", 00:18:06.030 "aliases": [ 00:18:06.030 "ea6875cd-649b-482c-9744-472c6cb9507f" 00:18:06.030 ], 00:18:06.030 "product_name": "Malloc disk", 00:18:06.030 "block_size": 512, 00:18:06.030 "num_blocks": 65536, 00:18:06.030 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:06.030 "assigned_rate_limits": { 00:18:06.030 "rw_ios_per_sec": 0, 00:18:06.030 "rw_mbytes_per_sec": 0, 00:18:06.030 "r_mbytes_per_sec": 0, 00:18:06.030 "w_mbytes_per_sec": 0 00:18:06.030 }, 00:18:06.030 "claimed": true, 00:18:06.030 "claim_type": "exclusive_write", 00:18:06.030 "zoned": false, 00:18:06.030 "supported_io_types": { 00:18:06.030 "read": true, 00:18:06.030 "write": true, 00:18:06.030 "unmap": true, 00:18:06.030 "flush": true, 00:18:06.030 "reset": true, 00:18:06.030 "nvme_admin": false, 00:18:06.030 "nvme_io": false, 00:18:06.030 "nvme_io_md": false, 00:18:06.030 "write_zeroes": true, 00:18:06.030 "zcopy": true, 00:18:06.030 "get_zone_info": false, 00:18:06.030 "zone_management": false, 00:18:06.030 "zone_append": false, 00:18:06.030 "compare": false, 00:18:06.030 "compare_and_write": 
false, 00:18:06.030 "abort": true, 00:18:06.030 "seek_hole": false, 00:18:06.030 "seek_data": false, 00:18:06.030 "copy": true, 00:18:06.030 "nvme_iov_md": false 00:18:06.030 }, 00:18:06.030 "memory_domains": [ 00:18:06.030 { 00:18:06.030 "dma_device_id": "system", 00:18:06.030 "dma_device_type": 1 00:18:06.030 }, 00:18:06.030 { 00:18:06.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.030 "dma_device_type": 2 00:18:06.030 } 00:18:06.030 ], 00:18:06.030 "driver_specific": {} 00:18:06.030 } 00:18:06.030 ] 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.030 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.287 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.287 "name": "Existed_Raid", 00:18:06.288 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:06.288 "strip_size_kb": 64, 00:18:06.288 "state": "configuring", 00:18:06.288 "raid_level": "concat", 00:18:06.288 "superblock": true, 00:18:06.288 "num_base_bdevs": 3, 00:18:06.288 "num_base_bdevs_discovered": 2, 00:18:06.288 "num_base_bdevs_operational": 3, 00:18:06.288 "base_bdevs_list": [ 00:18:06.288 { 00:18:06.288 "name": "BaseBdev1", 00:18:06.288 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:06.288 "is_configured": true, 00:18:06.288 "data_offset": 2048, 00:18:06.288 "data_size": 63488 00:18:06.288 }, 00:18:06.288 { 00:18:06.288 "name": null, 00:18:06.288 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:06.288 "is_configured": false, 00:18:06.288 "data_offset": 2048, 00:18:06.288 "data_size": 63488 00:18:06.288 }, 00:18:06.288 { 00:18:06.288 "name": "BaseBdev3", 00:18:06.288 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:06.288 "is_configured": true, 00:18:06.288 "data_offset": 2048, 00:18:06.288 "data_size": 63488 00:18:06.288 } 00:18:06.288 ] 00:18:06.288 }' 00:18:06.288 06:34:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.288 06:34:19 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:06.852 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:06.852 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.110 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:07.110 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:07.368 [2024-07-25 06:34:20.723793] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.368 "name": "Existed_Raid", 00:18:07.368 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:07.368 "strip_size_kb": 64, 00:18:07.368 "state": "configuring", 00:18:07.368 "raid_level": "concat", 00:18:07.368 "superblock": true, 00:18:07.368 "num_base_bdevs": 3, 00:18:07.368 "num_base_bdevs_discovered": 1, 00:18:07.368 "num_base_bdevs_operational": 3, 00:18:07.368 "base_bdevs_list": [ 00:18:07.368 { 00:18:07.368 "name": "BaseBdev1", 00:18:07.368 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:07.368 "is_configured": true, 00:18:07.368 "data_offset": 2048, 00:18:07.368 "data_size": 63488 00:18:07.368 }, 00:18:07.368 { 00:18:07.368 "name": null, 00:18:07.368 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:07.368 "is_configured": false, 00:18:07.368 "data_offset": 2048, 00:18:07.368 "data_size": 63488 00:18:07.368 }, 00:18:07.368 { 00:18:07.368 "name": null, 00:18:07.368 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:07.368 "is_configured": false, 00:18:07.368 "data_offset": 2048, 00:18:07.368 "data_size": 63488 00:18:07.368 } 00:18:07.368 ] 00:18:07.368 }' 
00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.368 06:34:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.337 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.337 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:08.337 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:08.337 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:08.625 [2024-07-25 06:34:21.926987] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.625 06:34:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.625 06:34:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.625 "name": "Existed_Raid", 00:18:08.625 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:08.625 "strip_size_kb": 64, 00:18:08.625 "state": "configuring", 00:18:08.625 "raid_level": "concat", 00:18:08.625 "superblock": true, 00:18:08.625 "num_base_bdevs": 3, 00:18:08.625 "num_base_bdevs_discovered": 2, 00:18:08.625 "num_base_bdevs_operational": 3, 00:18:08.625 "base_bdevs_list": [ 00:18:08.625 { 00:18:08.625 "name": "BaseBdev1", 00:18:08.625 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:08.625 "is_configured": true, 00:18:08.625 "data_offset": 2048, 00:18:08.625 "data_size": 63488 00:18:08.625 }, 00:18:08.625 { 00:18:08.625 "name": null, 00:18:08.625 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:08.625 "is_configured": false, 00:18:08.625 "data_offset": 2048, 00:18:08.625 "data_size": 63488 00:18:08.625 }, 00:18:08.625 { 00:18:08.625 "name": "BaseBdev3", 
00:18:08.625 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:08.625 "is_configured": true, 00:18:08.625 "data_offset": 2048, 00:18:08.625 "data_size": 63488 00:18:08.625 } 00:18:08.625 ] 00:18:08.625 }' 00:18:08.625 06:34:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.625 06:34:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.191 06:34:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.191 06:34:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:09.448 06:34:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:09.448 06:34:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:09.705 [2024-07-25 06:34:23.162324] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:09.705 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:09.705 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.705 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.705 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.706 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.963 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.963 "name": "Existed_Raid", 00:18:09.963 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:09.963 "strip_size_kb": 64, 00:18:09.963 "state": "configuring", 00:18:09.963 "raid_level": "concat", 00:18:09.963 "superblock": true, 00:18:09.963 "num_base_bdevs": 3, 00:18:09.963 "num_base_bdevs_discovered": 1, 00:18:09.963 "num_base_bdevs_operational": 3, 00:18:09.963 "base_bdevs_list": [ 00:18:09.963 { 00:18:09.963 "name": null, 00:18:09.963 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:09.963 "is_configured": false, 00:18:09.963 "data_offset": 2048, 00:18:09.963 "data_size": 63488 00:18:09.963 }, 00:18:09.963 { 00:18:09.963 "name": null, 00:18:09.963 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 
00:18:09.963 "is_configured": false, 00:18:09.963 "data_offset": 2048, 00:18:09.963 "data_size": 63488 00:18:09.963 }, 00:18:09.963 { 00:18:09.963 "name": "BaseBdev3", 00:18:09.963 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:09.963 "is_configured": true, 00:18:09.963 "data_offset": 2048, 00:18:09.963 "data_size": 63488 00:18:09.963 } 00:18:09.963 ] 00:18:09.963 }' 00:18:09.963 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.963 06:34:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.528 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.528 06:34:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:10.786 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:10.786 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:11.043 [2024-07-25 06:34:24.415916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.043 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.301 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.301 "name": "Existed_Raid", 00:18:11.301 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:11.301 "strip_size_kb": 64, 00:18:11.301 "state": "configuring", 00:18:11.301 "raid_level": "concat", 00:18:11.301 "superblock": true, 00:18:11.301 "num_base_bdevs": 3, 00:18:11.301 "num_base_bdevs_discovered": 2, 00:18:11.301 "num_base_bdevs_operational": 3, 00:18:11.301 "base_bdevs_list": [ 00:18:11.301 { 00:18:11.301 "name": null, 00:18:11.301 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:11.301 
"is_configured": false, 00:18:11.301 "data_offset": 2048, 00:18:11.301 "data_size": 63488 00:18:11.301 }, 00:18:11.301 { 00:18:11.301 "name": "BaseBdev2", 00:18:11.301 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:11.301 "is_configured": true, 00:18:11.301 "data_offset": 2048, 00:18:11.301 "data_size": 63488 00:18:11.301 }, 00:18:11.301 { 00:18:11.301 "name": "BaseBdev3", 00:18:11.301 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:11.301 "is_configured": true, 00:18:11.301 "data_offset": 2048, 00:18:11.301 "data_size": 63488 00:18:11.301 } 00:18:11.301 ] 00:18:11.301 }' 00:18:11.301 06:34:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.301 06:34:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.865 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.865 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:12.123 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:12.123 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.123 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:12.123 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ea6875cd-649b-482c-9744-472c6cb9507f 00:18:12.380 [2024-07-25 06:34:25.890879] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:12.380 [2024-07-25 06:34:25.891012] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc67660 00:18:12.380 [2024-07-25 06:34:25.891024] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:12.380 [2024-07-25 06:34:25.891192] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc69290 00:18:12.380 [2024-07-25 06:34:25.891294] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc67660 00:18:12.380 [2024-07-25 06:34:25.891303] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc67660 00:18:12.381 [2024-07-25 06:34:25.891385] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:12.381 NewBaseBdev 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:12.381 06:34:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:12.638 06:34:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:12.895 [ 00:18:12.895 { 00:18:12.895 "name": "NewBaseBdev", 00:18:12.895 "aliases": [ 00:18:12.895 "ea6875cd-649b-482c-9744-472c6cb9507f" 00:18:12.895 ], 00:18:12.895 "product_name": "Malloc disk", 00:18:12.895 "block_size": 512, 00:18:12.895 "num_blocks": 65536, 00:18:12.895 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:12.895 "assigned_rate_limits": { 00:18:12.895 "rw_ios_per_sec": 0, 00:18:12.895 "rw_mbytes_per_sec": 0, 00:18:12.895 "r_mbytes_per_sec": 0, 00:18:12.895 "w_mbytes_per_sec": 0 00:18:12.895 }, 00:18:12.895 "claimed": true, 00:18:12.895 "claim_type": "exclusive_write", 00:18:12.895 "zoned": false, 00:18:12.895 "supported_io_types": { 00:18:12.895 "read": true, 00:18:12.895 "write": true, 00:18:12.895 "unmap": true, 00:18:12.895 "flush": true, 00:18:12.895 "reset": true, 00:18:12.895 "nvme_admin": false, 00:18:12.895 "nvme_io": false, 00:18:12.895 "nvme_io_md": false, 00:18:12.895 "write_zeroes": true, 00:18:12.895 "zcopy": true, 00:18:12.895 "get_zone_info": false, 00:18:12.895 "zone_management": false, 00:18:12.895 "zone_append": false, 00:18:12.895 "compare": false, 00:18:12.895 "compare_and_write": false, 00:18:12.895 "abort": true, 00:18:12.895 "seek_hole": false, 00:18:12.895 "seek_data": false, 00:18:12.895 "copy": true, 00:18:12.895 "nvme_iov_md": false 00:18:12.895 }, 00:18:12.895 "memory_domains": [ 00:18:12.895 { 00:18:12.895 "dma_device_id": "system", 00:18:12.895 "dma_device_type": 1 00:18:12.895 }, 00:18:12.895 { 00:18:12.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.895 "dma_device_type": 2 00:18:12.895 } 00:18:12.895 ], 00:18:12.895 "driver_specific": {} 00:18:12.895 } 00:18:12.895 ] 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.895 06:34:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.152 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.152 "name": "Existed_Raid", 00:18:13.152 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:13.152 "strip_size_kb": 64, 00:18:13.152 "state": "online", 00:18:13.152 "raid_level": "concat", 00:18:13.152 "superblock": true, 00:18:13.152 "num_base_bdevs": 3, 00:18:13.152 "num_base_bdevs_discovered": 3, 00:18:13.152 "num_base_bdevs_operational": 3, 00:18:13.152 "base_bdevs_list": [ 00:18:13.152 { 00:18:13.152 "name": "NewBaseBdev", 00:18:13.152 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:13.152 "is_configured": true, 00:18:13.152 "data_offset": 2048, 00:18:13.152 "data_size": 63488 00:18:13.152 }, 00:18:13.152 { 00:18:13.152 "name": "BaseBdev2", 00:18:13.152 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:13.152 "is_configured": true, 00:18:13.152 "data_offset": 2048, 00:18:13.152 "data_size": 63488 00:18:13.152 }, 00:18:13.152 { 00:18:13.152 "name": "BaseBdev3", 00:18:13.152 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:13.152 "is_configured": true, 00:18:13.152 "data_offset": 2048, 00:18:13.152 "data_size": 63488 00:18:13.152 } 00:18:13.152 ] 00:18:13.152 }' 00:18:13.152 06:34:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.152 06:34:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:13.718 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:13.976 [2024-07-25 06:34:27.371055] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:13.976 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:13.976 "name": "Existed_Raid", 00:18:13.976 "aliases": [ 00:18:13.976 "566fadd8-1c82-4b76-81f1-8086db5cee26" 00:18:13.976 ], 00:18:13.976 "product_name": "Raid Volume", 00:18:13.976 "block_size": 512, 00:18:13.976 "num_blocks": 190464, 00:18:13.976 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:13.976 "assigned_rate_limits": { 00:18:13.976 "rw_ios_per_sec": 0, 00:18:13.976 "rw_mbytes_per_sec": 0, 00:18:13.976 "r_mbytes_per_sec": 0, 00:18:13.976 "w_mbytes_per_sec": 0 00:18:13.976 }, 00:18:13.976 "claimed": false, 00:18:13.976 "zoned": false, 00:18:13.976 "supported_io_types": { 00:18:13.976 "read": true, 00:18:13.976 "write": true, 00:18:13.976 "unmap": true, 00:18:13.976 "flush": true, 00:18:13.976 "reset": true, 00:18:13.976 "nvme_admin": false, 00:18:13.976 "nvme_io": false, 00:18:13.976 
"nvme_io_md": false, 00:18:13.976 "write_zeroes": true, 00:18:13.976 "zcopy": false, 00:18:13.976 "get_zone_info": false, 00:18:13.976 "zone_management": false, 00:18:13.976 "zone_append": false, 00:18:13.976 "compare": false, 00:18:13.976 "compare_and_write": false, 00:18:13.977 "abort": false, 00:18:13.977 "seek_hole": false, 00:18:13.977 "seek_data": false, 00:18:13.977 "copy": false, 00:18:13.977 "nvme_iov_md": false 00:18:13.977 }, 00:18:13.977 "memory_domains": [ 00:18:13.977 { 00:18:13.977 "dma_device_id": "system", 00:18:13.977 "dma_device_type": 1 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.977 "dma_device_type": 2 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "dma_device_id": "system", 00:18:13.977 "dma_device_type": 1 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.977 "dma_device_type": 2 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "dma_device_id": "system", 00:18:13.977 "dma_device_type": 1 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.977 "dma_device_type": 2 00:18:13.977 } 00:18:13.977 ], 00:18:13.977 "driver_specific": { 00:18:13.977 "raid": { 00:18:13.977 "uuid": "566fadd8-1c82-4b76-81f1-8086db5cee26", 00:18:13.977 "strip_size_kb": 64, 00:18:13.977 "state": "online", 00:18:13.977 "raid_level": "concat", 00:18:13.977 "superblock": true, 00:18:13.977 "num_base_bdevs": 3, 00:18:13.977 "num_base_bdevs_discovered": 3, 00:18:13.977 "num_base_bdevs_operational": 3, 00:18:13.977 "base_bdevs_list": [ 00:18:13.977 { 00:18:13.977 "name": "NewBaseBdev", 00:18:13.977 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:13.977 "is_configured": true, 00:18:13.977 "data_offset": 2048, 00:18:13.977 "data_size": 63488 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "name": "BaseBdev2", 00:18:13.977 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:13.977 "is_configured": true, 00:18:13.977 "data_offset": 2048, 00:18:13.977 "data_size": 63488 00:18:13.977 }, 00:18:13.977 { 00:18:13.977 "name": "BaseBdev3", 00:18:13.977 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:13.977 "is_configured": true, 00:18:13.977 "data_offset": 2048, 00:18:13.977 "data_size": 63488 00:18:13.977 } 00:18:13.977 ] 00:18:13.977 } 00:18:13.977 } 00:18:13.977 }' 00:18:13.977 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:13.977 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:13.977 BaseBdev2 00:18:13.977 BaseBdev3' 00:18:13.977 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.977 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:13.977 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:14.235 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:14.235 "name": "NewBaseBdev", 00:18:14.235 "aliases": [ 00:18:14.235 "ea6875cd-649b-482c-9744-472c6cb9507f" 00:18:14.235 ], 00:18:14.235 "product_name": "Malloc disk", 00:18:14.235 "block_size": 512, 00:18:14.235 "num_blocks": 65536, 00:18:14.235 "uuid": "ea6875cd-649b-482c-9744-472c6cb9507f", 00:18:14.235 "assigned_rate_limits": { 00:18:14.235 
"rw_ios_per_sec": 0, 00:18:14.235 "rw_mbytes_per_sec": 0, 00:18:14.235 "r_mbytes_per_sec": 0, 00:18:14.235 "w_mbytes_per_sec": 0 00:18:14.235 }, 00:18:14.235 "claimed": true, 00:18:14.235 "claim_type": "exclusive_write", 00:18:14.235 "zoned": false, 00:18:14.235 "supported_io_types": { 00:18:14.235 "read": true, 00:18:14.235 "write": true, 00:18:14.235 "unmap": true, 00:18:14.235 "flush": true, 00:18:14.235 "reset": true, 00:18:14.235 "nvme_admin": false, 00:18:14.235 "nvme_io": false, 00:18:14.235 "nvme_io_md": false, 00:18:14.235 "write_zeroes": true, 00:18:14.235 "zcopy": true, 00:18:14.235 "get_zone_info": false, 00:18:14.235 "zone_management": false, 00:18:14.235 "zone_append": false, 00:18:14.235 "compare": false, 00:18:14.235 "compare_and_write": false, 00:18:14.235 "abort": true, 00:18:14.235 "seek_hole": false, 00:18:14.235 "seek_data": false, 00:18:14.235 "copy": true, 00:18:14.235 "nvme_iov_md": false 00:18:14.235 }, 00:18:14.235 "memory_domains": [ 00:18:14.235 { 00:18:14.235 "dma_device_id": "system", 00:18:14.235 "dma_device_type": 1 00:18:14.235 }, 00:18:14.235 { 00:18:14.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.235 "dma_device_type": 2 00:18:14.235 } 00:18:14.235 ], 00:18:14.235 "driver_specific": {} 00:18:14.235 }' 00:18:14.236 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.236 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.236 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.236 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:14.494 06:34:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:14.753 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:14.753 "name": "BaseBdev2", 00:18:14.753 "aliases": [ 00:18:14.753 "e4709eaf-6662-4b5e-9ec9-8ee05a94891c" 00:18:14.753 ], 00:18:14.753 "product_name": "Malloc disk", 00:18:14.753 "block_size": 512, 00:18:14.753 "num_blocks": 65536, 00:18:14.753 "uuid": "e4709eaf-6662-4b5e-9ec9-8ee05a94891c", 00:18:14.753 "assigned_rate_limits": { 00:18:14.753 "rw_ios_per_sec": 0, 00:18:14.753 "rw_mbytes_per_sec": 0, 00:18:14.753 "r_mbytes_per_sec": 0, 00:18:14.753 "w_mbytes_per_sec": 0 
00:18:14.753 }, 00:18:14.753 "claimed": true, 00:18:14.753 "claim_type": "exclusive_write", 00:18:14.753 "zoned": false, 00:18:14.753 "supported_io_types": { 00:18:14.753 "read": true, 00:18:14.753 "write": true, 00:18:14.753 "unmap": true, 00:18:14.753 "flush": true, 00:18:14.753 "reset": true, 00:18:14.753 "nvme_admin": false, 00:18:14.753 "nvme_io": false, 00:18:14.753 "nvme_io_md": false, 00:18:14.753 "write_zeroes": true, 00:18:14.753 "zcopy": true, 00:18:14.753 "get_zone_info": false, 00:18:14.753 "zone_management": false, 00:18:14.753 "zone_append": false, 00:18:14.753 "compare": false, 00:18:14.753 "compare_and_write": false, 00:18:14.753 "abort": true, 00:18:14.753 "seek_hole": false, 00:18:14.753 "seek_data": false, 00:18:14.753 "copy": true, 00:18:14.753 "nvme_iov_md": false 00:18:14.753 }, 00:18:14.753 "memory_domains": [ 00:18:14.753 { 00:18:14.753 "dma_device_id": "system", 00:18:14.753 "dma_device_type": 1 00:18:14.753 }, 00:18:14.753 { 00:18:14.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.753 "dma_device_type": 2 00:18:14.753 } 00:18:14.753 ], 00:18:14.753 "driver_specific": {} 00:18:14.753 }' 00:18:14.753 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.753 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.753 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.753 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.011 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.012 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.012 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.012 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:15.012 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.270 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.270 "name": "BaseBdev3", 00:18:15.270 "aliases": [ 00:18:15.270 "39e538f8-eb80-4fc3-a061-2712f795006c" 00:18:15.270 ], 00:18:15.270 "product_name": "Malloc disk", 00:18:15.270 "block_size": 512, 00:18:15.270 "num_blocks": 65536, 00:18:15.270 "uuid": "39e538f8-eb80-4fc3-a061-2712f795006c", 00:18:15.270 "assigned_rate_limits": { 00:18:15.270 "rw_ios_per_sec": 0, 00:18:15.270 "rw_mbytes_per_sec": 0, 00:18:15.270 "r_mbytes_per_sec": 0, 00:18:15.270 "w_mbytes_per_sec": 0 00:18:15.270 }, 00:18:15.270 "claimed": true, 00:18:15.270 "claim_type": "exclusive_write", 00:18:15.270 "zoned": false, 00:18:15.270 
"supported_io_types": { 00:18:15.270 "read": true, 00:18:15.270 "write": true, 00:18:15.270 "unmap": true, 00:18:15.270 "flush": true, 00:18:15.270 "reset": true, 00:18:15.270 "nvme_admin": false, 00:18:15.270 "nvme_io": false, 00:18:15.270 "nvme_io_md": false, 00:18:15.270 "write_zeroes": true, 00:18:15.270 "zcopy": true, 00:18:15.270 "get_zone_info": false, 00:18:15.270 "zone_management": false, 00:18:15.270 "zone_append": false, 00:18:15.270 "compare": false, 00:18:15.270 "compare_and_write": false, 00:18:15.270 "abort": true, 00:18:15.270 "seek_hole": false, 00:18:15.270 "seek_data": false, 00:18:15.270 "copy": true, 00:18:15.270 "nvme_iov_md": false 00:18:15.270 }, 00:18:15.270 "memory_domains": [ 00:18:15.270 { 00:18:15.270 "dma_device_id": "system", 00:18:15.270 "dma_device_type": 1 00:18:15.270 }, 00:18:15.270 { 00:18:15.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.270 "dma_device_type": 2 00:18:15.270 } 00:18:15.270 ], 00:18:15.270 "driver_specific": {} 00:18:15.270 }' 00:18:15.270 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.270 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.529 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.529 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.529 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.529 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.529 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.529 06:34:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.529 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.529 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.529 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.787 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.787 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:15.787 [2024-07-25 06:34:29.324016] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:15.787 [2024-07-25 06:34:29.324038] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:15.787 [2024-07-25 06:34:29.324085] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:15.787 [2024-07-25 06:34:29.324132] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:15.787 [2024-07-25 06:34:29.324150] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc67660 name Existed_Raid, state offline 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1140112 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1140112 ']' 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1140112 00:18:16.046 06:34:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1140112 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1140112' 00:18:16.046 killing process with pid 1140112 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1140112 00:18:16.046 [2024-07-25 06:34:29.403675] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:16.046 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1140112 00:18:16.046 [2024-07-25 06:34:29.427662] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:16.305 06:34:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:16.305 00:18:16.305 real 0m26.475s 00:18:16.305 user 0m48.513s 00:18:16.305 sys 0m4.871s 00:18:16.305 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:16.305 06:34:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:16.305 ************************************ 00:18:16.305 END TEST raid_state_function_test_sb 00:18:16.305 ************************************ 00:18:16.305 06:34:29 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:18:16.305 06:34:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:18:16.305 06:34:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:16.305 06:34:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:16.305 ************************************ 00:18:16.305 START TEST raid_superblock_test 00:18:16.305 ************************************ 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 3 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local 
strip_size_create_arg 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1145191 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1145191 /var/tmp/spdk-raid.sock 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1145191 ']' 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:16.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:16.305 06:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.305 [2024-07-25 06:34:29.751048] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:18:16.305 [2024-07-25 06:34:29.751105] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1145191 ] 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.305 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:16.305 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:16.306 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:16.306 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:16.306 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:16.564 [2024-07-25 06:34:29.887750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:16.564 [2024-07-25 06:34:29.933561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:16.564 [2024-07-25 06:34:29.998074] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:16.564 [2024-07-25 06:34:29.998105] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:17.130 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:17.389 malloc1 00:18:17.389 06:34:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:17.647 [2024-07-25 06:34:31.084841] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:17.647 [2024-07-25 06:34:31.084884] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:17.647 [2024-07-25 06:34:31.084904] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc56d70 00:18:17.647 [2024-07-25 06:34:31.084916] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:17.647 [2024-07-25 06:34:31.086417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:17.647 [2024-07-25 06:34:31.086445] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:17.647 pt1 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:17.647 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:17.906 malloc2 00:18:17.906 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:18.165 [2024-07-25 06:34:31.550324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:18.165 [2024-07-25 06:34:31.550363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.165 [2024-07-25 06:34:31.550379] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaa5790 00:18:18.165 [2024-07-25 06:34:31.550390] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.165 [2024-07-25 06:34:31.551709] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.165 [2024-07-25 06:34:31.551736] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:18.165 pt2 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:18.165 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:18.423 malloc3 00:18:18.423 06:34:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:18.681 [2024-07-25 06:34:32.015749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:18.681 [2024-07-25 06:34:32.015789] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.681 [2024-07-25 06:34:32.015805] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4a8c0 00:18:18.681 [2024-07-25 06:34:32.015816] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.681 [2024-07-25 06:34:32.017134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.681 [2024-07-25 06:34:32.017167] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:18.681 pt3 00:18:18.681 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:18.681 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:18.681 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:18:18.939 [2024-07-25 06:34:32.240352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:18.939 [2024-07-25 06:34:32.241488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:18.939 [2024-07-25 06:34:32.241537] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:18.939 [2024-07-25 06:34:32.241676] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4c0e0 00:18:18.939 [2024-07-25 06:34:32.241686] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:18.939 [2024-07-25 06:34:32.241865] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9cbd0 00:18:18.939 [2024-07-25 06:34:32.241993] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4c0e0 00:18:18.939 [2024-07-25 06:34:32.242002] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4c0e0 00:18:18.939 [2024-07-25 06:34:32.242087] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.939 "name": "raid_bdev1", 00:18:18.939 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:18.939 "strip_size_kb": 64, 00:18:18.939 "state": "online", 00:18:18.939 "raid_level": "concat", 00:18:18.939 "superblock": true, 00:18:18.939 "num_base_bdevs": 3, 00:18:18.939 "num_base_bdevs_discovered": 3, 00:18:18.939 "num_base_bdevs_operational": 3, 00:18:18.939 "base_bdevs_list": [ 00:18:18.939 { 00:18:18.939 "name": "pt1", 00:18:18.939 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:18.939 "is_configured": true, 00:18:18.939 "data_offset": 2048, 00:18:18.939 "data_size": 63488 00:18:18.939 }, 00:18:18.939 { 00:18:18.939 "name": "pt2", 00:18:18.939 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:18.939 "is_configured": true, 00:18:18.939 "data_offset": 2048, 00:18:18.939 "data_size": 63488 00:18:18.939 }, 00:18:18.939 { 00:18:18.939 "name": "pt3", 00:18:18.939 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:18.939 "is_configured": true, 00:18:18.939 "data_offset": 2048, 00:18:18.939 "data_size": 63488 00:18:18.939 } 00:18:18.939 ] 00:18:18.939 }' 00:18:18.939 06:34:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.940 06:34:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.505 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:18:19.505 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:19.505 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:19.505 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:19.505 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:19.505 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:19.763 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:19.763 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:19.763 [2024-07-25 06:34:33.271316] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:19.763 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:19.763 "name": "raid_bdev1", 00:18:19.763 "aliases": [ 00:18:19.763 "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44" 00:18:19.763 ], 00:18:19.763 "product_name": "Raid Volume", 00:18:19.763 "block_size": 512, 00:18:19.763 "num_blocks": 190464, 00:18:19.763 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:19.763 "assigned_rate_limits": { 00:18:19.763 "rw_ios_per_sec": 0, 00:18:19.763 "rw_mbytes_per_sec": 0, 00:18:19.763 
"r_mbytes_per_sec": 0, 00:18:19.763 "w_mbytes_per_sec": 0 00:18:19.763 }, 00:18:19.763 "claimed": false, 00:18:19.763 "zoned": false, 00:18:19.763 "supported_io_types": { 00:18:19.763 "read": true, 00:18:19.763 "write": true, 00:18:19.763 "unmap": true, 00:18:19.763 "flush": true, 00:18:19.763 "reset": true, 00:18:19.763 "nvme_admin": false, 00:18:19.763 "nvme_io": false, 00:18:19.763 "nvme_io_md": false, 00:18:19.763 "write_zeroes": true, 00:18:19.763 "zcopy": false, 00:18:19.763 "get_zone_info": false, 00:18:19.763 "zone_management": false, 00:18:19.763 "zone_append": false, 00:18:19.763 "compare": false, 00:18:19.763 "compare_and_write": false, 00:18:19.763 "abort": false, 00:18:19.763 "seek_hole": false, 00:18:19.763 "seek_data": false, 00:18:19.763 "copy": false, 00:18:19.763 "nvme_iov_md": false 00:18:19.763 }, 00:18:19.763 "memory_domains": [ 00:18:19.763 { 00:18:19.763 "dma_device_id": "system", 00:18:19.763 "dma_device_type": 1 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.763 "dma_device_type": 2 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "dma_device_id": "system", 00:18:19.763 "dma_device_type": 1 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.763 "dma_device_type": 2 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "dma_device_id": "system", 00:18:19.763 "dma_device_type": 1 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.763 "dma_device_type": 2 00:18:19.763 } 00:18:19.763 ], 00:18:19.763 "driver_specific": { 00:18:19.763 "raid": { 00:18:19.763 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:19.763 "strip_size_kb": 64, 00:18:19.763 "state": "online", 00:18:19.763 "raid_level": "concat", 00:18:19.763 "superblock": true, 00:18:19.763 "num_base_bdevs": 3, 00:18:19.763 "num_base_bdevs_discovered": 3, 00:18:19.763 "num_base_bdevs_operational": 3, 00:18:19.763 "base_bdevs_list": [ 00:18:19.763 { 00:18:19.763 "name": "pt1", 00:18:19.763 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:19.763 "is_configured": true, 00:18:19.763 "data_offset": 2048, 00:18:19.763 "data_size": 63488 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "name": "pt2", 00:18:19.763 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:19.763 "is_configured": true, 00:18:19.763 "data_offset": 2048, 00:18:19.763 "data_size": 63488 00:18:19.763 }, 00:18:19.763 { 00:18:19.763 "name": "pt3", 00:18:19.763 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:19.763 "is_configured": true, 00:18:19.763 "data_offset": 2048, 00:18:19.763 "data_size": 63488 00:18:19.763 } 00:18:19.763 ] 00:18:19.763 } 00:18:19.763 } 00:18:19.763 }' 00:18:19.763 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:20.020 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:20.020 pt2 00:18:20.020 pt3' 00:18:20.020 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:20.020 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:20.020 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:20.020 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:20.020 "name": "pt1", 00:18:20.020 "aliases": [ 
00:18:20.020 "00000000-0000-0000-0000-000000000001" 00:18:20.020 ], 00:18:20.020 "product_name": "passthru", 00:18:20.020 "block_size": 512, 00:18:20.020 "num_blocks": 65536, 00:18:20.020 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:20.020 "assigned_rate_limits": { 00:18:20.020 "rw_ios_per_sec": 0, 00:18:20.020 "rw_mbytes_per_sec": 0, 00:18:20.020 "r_mbytes_per_sec": 0, 00:18:20.020 "w_mbytes_per_sec": 0 00:18:20.020 }, 00:18:20.020 "claimed": true, 00:18:20.020 "claim_type": "exclusive_write", 00:18:20.020 "zoned": false, 00:18:20.020 "supported_io_types": { 00:18:20.020 "read": true, 00:18:20.020 "write": true, 00:18:20.020 "unmap": true, 00:18:20.020 "flush": true, 00:18:20.020 "reset": true, 00:18:20.020 "nvme_admin": false, 00:18:20.020 "nvme_io": false, 00:18:20.020 "nvme_io_md": false, 00:18:20.020 "write_zeroes": true, 00:18:20.020 "zcopy": true, 00:18:20.020 "get_zone_info": false, 00:18:20.020 "zone_management": false, 00:18:20.020 "zone_append": false, 00:18:20.020 "compare": false, 00:18:20.020 "compare_and_write": false, 00:18:20.020 "abort": true, 00:18:20.020 "seek_hole": false, 00:18:20.020 "seek_data": false, 00:18:20.020 "copy": true, 00:18:20.020 "nvme_iov_md": false 00:18:20.020 }, 00:18:20.020 "memory_domains": [ 00:18:20.020 { 00:18:20.020 "dma_device_id": "system", 00:18:20.020 "dma_device_type": 1 00:18:20.020 }, 00:18:20.020 { 00:18:20.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.020 "dma_device_type": 2 00:18:20.020 } 00:18:20.020 ], 00:18:20.020 "driver_specific": { 00:18:20.020 "passthru": { 00:18:20.020 "name": "pt1", 00:18:20.020 "base_bdev_name": "malloc1" 00:18:20.020 } 00:18:20.020 } 00:18:20.020 }' 00:18:20.020 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.277 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.535 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.535 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.535 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:20.535 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:20.535 06:34:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:20.793 "name": "pt2", 00:18:20.793 "aliases": [ 00:18:20.793 "00000000-0000-0000-0000-000000000002" 00:18:20.793 ], 00:18:20.793 "product_name": "passthru", 00:18:20.793 "block_size": 
512, 00:18:20.793 "num_blocks": 65536, 00:18:20.793 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:20.793 "assigned_rate_limits": { 00:18:20.793 "rw_ios_per_sec": 0, 00:18:20.793 "rw_mbytes_per_sec": 0, 00:18:20.793 "r_mbytes_per_sec": 0, 00:18:20.793 "w_mbytes_per_sec": 0 00:18:20.793 }, 00:18:20.793 "claimed": true, 00:18:20.793 "claim_type": "exclusive_write", 00:18:20.793 "zoned": false, 00:18:20.793 "supported_io_types": { 00:18:20.793 "read": true, 00:18:20.793 "write": true, 00:18:20.793 "unmap": true, 00:18:20.793 "flush": true, 00:18:20.793 "reset": true, 00:18:20.793 "nvme_admin": false, 00:18:20.793 "nvme_io": false, 00:18:20.793 "nvme_io_md": false, 00:18:20.793 "write_zeroes": true, 00:18:20.793 "zcopy": true, 00:18:20.793 "get_zone_info": false, 00:18:20.793 "zone_management": false, 00:18:20.793 "zone_append": false, 00:18:20.793 "compare": false, 00:18:20.793 "compare_and_write": false, 00:18:20.793 "abort": true, 00:18:20.793 "seek_hole": false, 00:18:20.793 "seek_data": false, 00:18:20.793 "copy": true, 00:18:20.793 "nvme_iov_md": false 00:18:20.793 }, 00:18:20.793 "memory_domains": [ 00:18:20.793 { 00:18:20.793 "dma_device_id": "system", 00:18:20.793 "dma_device_type": 1 00:18:20.793 }, 00:18:20.793 { 00:18:20.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.793 "dma_device_type": 2 00:18:20.793 } 00:18:20.793 ], 00:18:20.793 "driver_specific": { 00:18:20.793 "passthru": { 00:18:20.793 "name": "pt2", 00:18:20.793 "base_bdev_name": "malloc2" 00:18:20.793 } 00:18:20.793 } 00:18:20.793 }' 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.793 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:21.065 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:21.376 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.376 "name": "pt3", 00:18:21.376 "aliases": [ 00:18:21.376 "00000000-0000-0000-0000-000000000003" 00:18:21.377 ], 00:18:21.377 "product_name": "passthru", 00:18:21.377 "block_size": 512, 00:18:21.377 "num_blocks": 65536, 00:18:21.377 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:21.377 
"assigned_rate_limits": { 00:18:21.377 "rw_ios_per_sec": 0, 00:18:21.377 "rw_mbytes_per_sec": 0, 00:18:21.377 "r_mbytes_per_sec": 0, 00:18:21.377 "w_mbytes_per_sec": 0 00:18:21.377 }, 00:18:21.377 "claimed": true, 00:18:21.377 "claim_type": "exclusive_write", 00:18:21.377 "zoned": false, 00:18:21.377 "supported_io_types": { 00:18:21.377 "read": true, 00:18:21.377 "write": true, 00:18:21.377 "unmap": true, 00:18:21.377 "flush": true, 00:18:21.377 "reset": true, 00:18:21.377 "nvme_admin": false, 00:18:21.377 "nvme_io": false, 00:18:21.377 "nvme_io_md": false, 00:18:21.377 "write_zeroes": true, 00:18:21.377 "zcopy": true, 00:18:21.377 "get_zone_info": false, 00:18:21.377 "zone_management": false, 00:18:21.377 "zone_append": false, 00:18:21.377 "compare": false, 00:18:21.377 "compare_and_write": false, 00:18:21.377 "abort": true, 00:18:21.377 "seek_hole": false, 00:18:21.377 "seek_data": false, 00:18:21.377 "copy": true, 00:18:21.377 "nvme_iov_md": false 00:18:21.377 }, 00:18:21.377 "memory_domains": [ 00:18:21.377 { 00:18:21.377 "dma_device_id": "system", 00:18:21.377 "dma_device_type": 1 00:18:21.377 }, 00:18:21.377 { 00:18:21.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.377 "dma_device_type": 2 00:18:21.377 } 00:18:21.377 ], 00:18:21.377 "driver_specific": { 00:18:21.377 "passthru": { 00:18:21.377 "name": "pt3", 00:18:21.377 "base_bdev_name": "malloc3" 00:18:21.377 } 00:18:21.377 } 00:18:21.377 }' 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.377 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.635 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.635 06:34:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.635 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.635 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.635 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:18:21.635 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:21.893 [2024-07-25 06:34:35.256525] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:21.893 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44 00:18:21.893 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44 ']' 00:18:21.893 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:22.150 [2024-07-25 06:34:35.484864] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:22.150 [2024-07-25 06:34:35.484883] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:22.151 [2024-07-25 06:34:35.484930] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:22.151 [2024-07-25 06:34:35.484981] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:22.151 [2024-07-25 06:34:35.484992] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4c0e0 name raid_bdev1, state offline 00:18:22.151 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.151 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:18:22.408 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:18:22.408 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:18:22.408 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:22.408 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:22.408 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:22.408 06:34:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:22.666 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:22.666 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:22.924 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:22.924 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:23.182 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:23.440 [2024-07-25 06:34:36.848568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:23.440 [2024-07-25 06:34:36.849799] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:23.440 [2024-07-25 06:34:36.849838] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:23.440 [2024-07-25 06:34:36.849879] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:23.440 [2024-07-25 06:34:36.849917] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:23.440 [2024-07-25 06:34:36.849938] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:23.440 [2024-07-25 06:34:36.849953] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:23.440 [2024-07-25 06:34:36.849962] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa9bdc0 name raid_bdev1, state configuring 00:18:23.440 request: 00:18:23.440 { 00:18:23.440 "name": "raid_bdev1", 00:18:23.440 "raid_level": "concat", 00:18:23.440 "base_bdevs": [ 00:18:23.440 "malloc1", 00:18:23.440 "malloc2", 00:18:23.440 "malloc3" 00:18:23.440 ], 00:18:23.440 "strip_size_kb": 64, 00:18:23.440 "superblock": false, 00:18:23.440 "method": "bdev_raid_create", 00:18:23.440 "req_id": 1 00:18:23.440 } 00:18:23.440 Got JSON-RPC error response 00:18:23.440 response: 00:18:23.440 { 00:18:23.440 "code": -17, 00:18:23.440 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:23.440 } 00:18:23.440 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:18:23.440 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:18:23.440 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:18:23.440 06:34:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:18:23.440 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:18:23.440 06:34:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.698 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:18:23.698 06:34:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:18:23.698 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:23.955 [2024-07-25 06:34:37.309707] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:23.955 [2024-07-25 06:34:37.309745] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:23.955 [2024-07-25 06:34:37.309761] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc47f60 00:18:23.955 [2024-07-25 06:34:37.309772] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:23.955 [2024-07-25 06:34:37.311225] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:23.955 [2024-07-25 06:34:37.311252] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:23.955 [2024-07-25 06:34:37.311309] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:23.956 [2024-07-25 06:34:37.311332] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:23.956 pt1 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.956 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:24.214 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.214 "name": "raid_bdev1", 00:18:24.214 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:24.214 "strip_size_kb": 64, 00:18:24.214 "state": "configuring", 00:18:24.214 "raid_level": "concat", 00:18:24.214 "superblock": true, 00:18:24.214 "num_base_bdevs": 3, 00:18:24.214 "num_base_bdevs_discovered": 1, 00:18:24.214 "num_base_bdevs_operational": 3, 00:18:24.214 "base_bdevs_list": [ 00:18:24.214 { 00:18:24.214 "name": "pt1", 00:18:24.214 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:24.214 "is_configured": true, 00:18:24.214 "data_offset": 2048, 00:18:24.214 "data_size": 63488 00:18:24.214 }, 00:18:24.214 { 00:18:24.214 "name": null, 00:18:24.214 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:24.214 
"is_configured": false, 00:18:24.214 "data_offset": 2048, 00:18:24.214 "data_size": 63488 00:18:24.214 }, 00:18:24.214 { 00:18:24.214 "name": null, 00:18:24.214 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:24.214 "is_configured": false, 00:18:24.214 "data_offset": 2048, 00:18:24.214 "data_size": 63488 00:18:24.214 } 00:18:24.214 ] 00:18:24.214 }' 00:18:24.214 06:34:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.214 06:34:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:24.781 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:18:24.781 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:24.781 [2024-07-25 06:34:38.288295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:24.781 [2024-07-25 06:34:38.288344] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:24.781 [2024-07-25 06:34:38.288362] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa9aea0 00:18:24.781 [2024-07-25 06:34:38.288374] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:24.781 [2024-07-25 06:34:38.288671] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:24.781 [2024-07-25 06:34:38.288687] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:24.781 [2024-07-25 06:34:38.288742] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:24.781 [2024-07-25 06:34:38.288760] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:24.781 pt2 00:18:24.781 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:25.040 [2024-07-25 06:34:38.460755] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.040 06:34:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:25.297 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.297 "name": "raid_bdev1", 00:18:25.297 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:25.297 "strip_size_kb": 64, 00:18:25.297 "state": "configuring", 00:18:25.297 "raid_level": "concat", 00:18:25.297 "superblock": true, 00:18:25.297 "num_base_bdevs": 3, 00:18:25.297 "num_base_bdevs_discovered": 1, 00:18:25.297 "num_base_bdevs_operational": 3, 00:18:25.297 "base_bdevs_list": [ 00:18:25.297 { 00:18:25.297 "name": "pt1", 00:18:25.297 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:25.297 "is_configured": true, 00:18:25.297 "data_offset": 2048, 00:18:25.297 "data_size": 63488 00:18:25.297 }, 00:18:25.297 { 00:18:25.297 "name": null, 00:18:25.297 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:25.297 "is_configured": false, 00:18:25.297 "data_offset": 2048, 00:18:25.297 "data_size": 63488 00:18:25.297 }, 00:18:25.297 { 00:18:25.297 "name": null, 00:18:25.297 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:25.297 "is_configured": false, 00:18:25.297 "data_offset": 2048, 00:18:25.297 "data_size": 63488 00:18:25.297 } 00:18:25.297 ] 00:18:25.297 }' 00:18:25.297 06:34:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.297 06:34:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.862 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:18:25.862 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:25.862 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:26.119 [2024-07-25 06:34:39.483434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:26.119 [2024-07-25 06:34:39.483480] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:26.119 [2024-07-25 06:34:39.483497] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaa4470 00:18:26.119 [2024-07-25 06:34:39.483513] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:26.120 [2024-07-25 06:34:39.483814] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:26.120 [2024-07-25 06:34:39.483831] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:26.120 [2024-07-25 06:34:39.483889] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:26.120 [2024-07-25 06:34:39.483905] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:26.120 pt2 00:18:26.120 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:18:26.120 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:26.120 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:26.377 [2024-07-25 06:34:39.712037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:26.377 [2024-07-25 06:34:39.712068] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:18:26.377 [2024-07-25 06:34:39.712082] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa9b460 00:18:26.377 [2024-07-25 06:34:39.712093] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:26.377 [2024-07-25 06:34:39.712377] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:26.377 [2024-07-25 06:34:39.712394] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:26.377 [2024-07-25 06:34:39.712442] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:26.377 [2024-07-25 06:34:39.712459] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:26.378 [2024-07-25 06:34:39.712553] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4cd00 00:18:26.378 [2024-07-25 06:34:39.712563] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:26.378 [2024-07-25 06:34:39.712716] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9f7f0 00:18:26.378 [2024-07-25 06:34:39.712824] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4cd00 00:18:26.378 [2024-07-25 06:34:39.712832] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4cd00 00:18:26.378 [2024-07-25 06:34:39.712915] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:26.378 pt3 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.378 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:26.636 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.636 "name": "raid_bdev1", 00:18:26.636 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:26.636 "strip_size_kb": 64, 00:18:26.636 "state": "online", 00:18:26.636 "raid_level": "concat", 00:18:26.636 "superblock": true, 00:18:26.636 "num_base_bdevs": 3, 00:18:26.636 
"num_base_bdevs_discovered": 3, 00:18:26.636 "num_base_bdevs_operational": 3, 00:18:26.636 "base_bdevs_list": [ 00:18:26.636 { 00:18:26.636 "name": "pt1", 00:18:26.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:26.636 "is_configured": true, 00:18:26.636 "data_offset": 2048, 00:18:26.636 "data_size": 63488 00:18:26.636 }, 00:18:26.636 { 00:18:26.636 "name": "pt2", 00:18:26.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:26.636 "is_configured": true, 00:18:26.636 "data_offset": 2048, 00:18:26.636 "data_size": 63488 00:18:26.636 }, 00:18:26.636 { 00:18:26.636 "name": "pt3", 00:18:26.636 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:26.636 "is_configured": true, 00:18:26.636 "data_offset": 2048, 00:18:26.636 "data_size": 63488 00:18:26.636 } 00:18:26.636 ] 00:18:26.636 }' 00:18:26.636 06:34:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.636 06:34:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:27.201 [2024-07-25 06:34:40.694869] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:27.201 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:27.201 "name": "raid_bdev1", 00:18:27.201 "aliases": [ 00:18:27.201 "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44" 00:18:27.201 ], 00:18:27.201 "product_name": "Raid Volume", 00:18:27.201 "block_size": 512, 00:18:27.201 "num_blocks": 190464, 00:18:27.201 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:27.201 "assigned_rate_limits": { 00:18:27.201 "rw_ios_per_sec": 0, 00:18:27.201 "rw_mbytes_per_sec": 0, 00:18:27.201 "r_mbytes_per_sec": 0, 00:18:27.201 "w_mbytes_per_sec": 0 00:18:27.201 }, 00:18:27.201 "claimed": false, 00:18:27.201 "zoned": false, 00:18:27.201 "supported_io_types": { 00:18:27.201 "read": true, 00:18:27.201 "write": true, 00:18:27.201 "unmap": true, 00:18:27.201 "flush": true, 00:18:27.201 "reset": true, 00:18:27.201 "nvme_admin": false, 00:18:27.201 "nvme_io": false, 00:18:27.201 "nvme_io_md": false, 00:18:27.201 "write_zeroes": true, 00:18:27.201 "zcopy": false, 00:18:27.201 "get_zone_info": false, 00:18:27.201 "zone_management": false, 00:18:27.201 "zone_append": false, 00:18:27.201 "compare": false, 00:18:27.201 "compare_and_write": false, 00:18:27.201 "abort": false, 00:18:27.201 "seek_hole": false, 00:18:27.201 "seek_data": false, 00:18:27.201 "copy": false, 00:18:27.201 "nvme_iov_md": false 00:18:27.201 }, 00:18:27.201 "memory_domains": [ 00:18:27.201 { 00:18:27.201 "dma_device_id": "system", 00:18:27.201 "dma_device_type": 1 00:18:27.201 }, 
00:18:27.201 { 00:18:27.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.201 "dma_device_type": 2 00:18:27.201 }, 00:18:27.201 { 00:18:27.201 "dma_device_id": "system", 00:18:27.201 "dma_device_type": 1 00:18:27.201 }, 00:18:27.201 { 00:18:27.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.201 "dma_device_type": 2 00:18:27.201 }, 00:18:27.201 { 00:18:27.201 "dma_device_id": "system", 00:18:27.201 "dma_device_type": 1 00:18:27.201 }, 00:18:27.201 { 00:18:27.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.201 "dma_device_type": 2 00:18:27.201 } 00:18:27.201 ], 00:18:27.202 "driver_specific": { 00:18:27.202 "raid": { 00:18:27.202 "uuid": "ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44", 00:18:27.202 "strip_size_kb": 64, 00:18:27.202 "state": "online", 00:18:27.202 "raid_level": "concat", 00:18:27.202 "superblock": true, 00:18:27.202 "num_base_bdevs": 3, 00:18:27.202 "num_base_bdevs_discovered": 3, 00:18:27.202 "num_base_bdevs_operational": 3, 00:18:27.202 "base_bdevs_list": [ 00:18:27.202 { 00:18:27.202 "name": "pt1", 00:18:27.202 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:27.202 "is_configured": true, 00:18:27.202 "data_offset": 2048, 00:18:27.202 "data_size": 63488 00:18:27.202 }, 00:18:27.202 { 00:18:27.202 "name": "pt2", 00:18:27.202 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:27.202 "is_configured": true, 00:18:27.202 "data_offset": 2048, 00:18:27.202 "data_size": 63488 00:18:27.202 }, 00:18:27.202 { 00:18:27.202 "name": "pt3", 00:18:27.202 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:27.202 "is_configured": true, 00:18:27.202 "data_offset": 2048, 00:18:27.202 "data_size": 63488 00:18:27.202 } 00:18:27.202 ] 00:18:27.202 } 00:18:27.202 } 00:18:27.202 }' 00:18:27.202 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:27.459 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:27.460 pt2 00:18:27.460 pt3' 00:18:27.460 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:27.460 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:27.460 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:27.460 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:27.460 "name": "pt1", 00:18:27.460 "aliases": [ 00:18:27.460 "00000000-0000-0000-0000-000000000001" 00:18:27.460 ], 00:18:27.460 "product_name": "passthru", 00:18:27.460 "block_size": 512, 00:18:27.460 "num_blocks": 65536, 00:18:27.460 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:27.460 "assigned_rate_limits": { 00:18:27.460 "rw_ios_per_sec": 0, 00:18:27.460 "rw_mbytes_per_sec": 0, 00:18:27.460 "r_mbytes_per_sec": 0, 00:18:27.460 "w_mbytes_per_sec": 0 00:18:27.460 }, 00:18:27.460 "claimed": true, 00:18:27.460 "claim_type": "exclusive_write", 00:18:27.460 "zoned": false, 00:18:27.460 "supported_io_types": { 00:18:27.460 "read": true, 00:18:27.460 "write": true, 00:18:27.460 "unmap": true, 00:18:27.460 "flush": true, 00:18:27.460 "reset": true, 00:18:27.460 "nvme_admin": false, 00:18:27.460 "nvme_io": false, 00:18:27.460 "nvme_io_md": false, 00:18:27.460 "write_zeroes": true, 00:18:27.460 "zcopy": true, 00:18:27.460 "get_zone_info": false, 00:18:27.460 "zone_management": false, 00:18:27.460 
"zone_append": false, 00:18:27.460 "compare": false, 00:18:27.460 "compare_and_write": false, 00:18:27.460 "abort": true, 00:18:27.460 "seek_hole": false, 00:18:27.460 "seek_data": false, 00:18:27.460 "copy": true, 00:18:27.460 "nvme_iov_md": false 00:18:27.460 }, 00:18:27.460 "memory_domains": [ 00:18:27.460 { 00:18:27.460 "dma_device_id": "system", 00:18:27.460 "dma_device_type": 1 00:18:27.460 }, 00:18:27.460 { 00:18:27.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.460 "dma_device_type": 2 00:18:27.460 } 00:18:27.460 ], 00:18:27.460 "driver_specific": { 00:18:27.460 "passthru": { 00:18:27.460 "name": "pt1", 00:18:27.460 "base_bdev_name": "malloc1" 00:18:27.460 } 00:18:27.460 } 00:18:27.460 }' 00:18:27.460 06:34:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:27.718 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:27.975 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:27.975 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:27.975 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:27.975 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:27.975 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:28.234 "name": "pt2", 00:18:28.234 "aliases": [ 00:18:28.234 "00000000-0000-0000-0000-000000000002" 00:18:28.234 ], 00:18:28.234 "product_name": "passthru", 00:18:28.234 "block_size": 512, 00:18:28.234 "num_blocks": 65536, 00:18:28.234 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:28.234 "assigned_rate_limits": { 00:18:28.234 "rw_ios_per_sec": 0, 00:18:28.234 "rw_mbytes_per_sec": 0, 00:18:28.234 "r_mbytes_per_sec": 0, 00:18:28.234 "w_mbytes_per_sec": 0 00:18:28.234 }, 00:18:28.234 "claimed": true, 00:18:28.234 "claim_type": "exclusive_write", 00:18:28.234 "zoned": false, 00:18:28.234 "supported_io_types": { 00:18:28.234 "read": true, 00:18:28.234 "write": true, 00:18:28.234 "unmap": true, 00:18:28.234 "flush": true, 00:18:28.234 "reset": true, 00:18:28.234 "nvme_admin": false, 00:18:28.234 "nvme_io": false, 00:18:28.234 "nvme_io_md": false, 00:18:28.234 "write_zeroes": true, 00:18:28.234 "zcopy": true, 00:18:28.234 "get_zone_info": false, 00:18:28.234 "zone_management": false, 00:18:28.234 "zone_append": false, 00:18:28.234 "compare": false, 00:18:28.234 "compare_and_write": false, 00:18:28.234 "abort": true, 00:18:28.234 
"seek_hole": false, 00:18:28.234 "seek_data": false, 00:18:28.234 "copy": true, 00:18:28.234 "nvme_iov_md": false 00:18:28.234 }, 00:18:28.234 "memory_domains": [ 00:18:28.234 { 00:18:28.234 "dma_device_id": "system", 00:18:28.234 "dma_device_type": 1 00:18:28.234 }, 00:18:28.234 { 00:18:28.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.234 "dma_device_type": 2 00:18:28.234 } 00:18:28.234 ], 00:18:28.234 "driver_specific": { 00:18:28.234 "passthru": { 00:18:28.234 "name": "pt2", 00:18:28.234 "base_bdev_name": "malloc2" 00:18:28.234 } 00:18:28.234 } 00:18:28.234 }' 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:28.234 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:28.492 06:34:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:28.751 "name": "pt3", 00:18:28.751 "aliases": [ 00:18:28.751 "00000000-0000-0000-0000-000000000003" 00:18:28.751 ], 00:18:28.751 "product_name": "passthru", 00:18:28.751 "block_size": 512, 00:18:28.751 "num_blocks": 65536, 00:18:28.751 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:28.751 "assigned_rate_limits": { 00:18:28.751 "rw_ios_per_sec": 0, 00:18:28.751 "rw_mbytes_per_sec": 0, 00:18:28.751 "r_mbytes_per_sec": 0, 00:18:28.751 "w_mbytes_per_sec": 0 00:18:28.751 }, 00:18:28.751 "claimed": true, 00:18:28.751 "claim_type": "exclusive_write", 00:18:28.751 "zoned": false, 00:18:28.751 "supported_io_types": { 00:18:28.751 "read": true, 00:18:28.751 "write": true, 00:18:28.751 "unmap": true, 00:18:28.751 "flush": true, 00:18:28.751 "reset": true, 00:18:28.751 "nvme_admin": false, 00:18:28.751 "nvme_io": false, 00:18:28.751 "nvme_io_md": false, 00:18:28.751 "write_zeroes": true, 00:18:28.751 "zcopy": true, 00:18:28.751 "get_zone_info": false, 00:18:28.751 "zone_management": false, 00:18:28.751 "zone_append": false, 00:18:28.751 "compare": false, 00:18:28.751 "compare_and_write": false, 00:18:28.751 "abort": true, 00:18:28.751 "seek_hole": false, 00:18:28.751 "seek_data": false, 00:18:28.751 "copy": true, 00:18:28.751 "nvme_iov_md": false 00:18:28.751 }, 
00:18:28.751 "memory_domains": [ 00:18:28.751 { 00:18:28.751 "dma_device_id": "system", 00:18:28.751 "dma_device_type": 1 00:18:28.751 }, 00:18:28.751 { 00:18:28.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.751 "dma_device_type": 2 00:18:28.751 } 00:18:28.751 ], 00:18:28.751 "driver_specific": { 00:18:28.751 "passthru": { 00:18:28.751 "name": "pt3", 00:18:28.751 "base_bdev_name": "malloc3" 00:18:28.751 } 00:18:28.751 } 00:18:28.751 }' 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:28.751 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.010 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.010 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.010 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:29.010 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:18:29.269 [2024-07-25 06:34:42.595872] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44 '!=' ae2bdf8a-4b52-4b38-9d39-1cd8b7e5cf44 ']' 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1145191 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1145191 ']' 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1145191 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1145191 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1145191' 00:18:29.269 
killing process with pid 1145191 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1145191 00:18:29.269 [2024-07-25 06:34:42.672236] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:29.269 [2024-07-25 06:34:42.672285] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:29.269 [2024-07-25 06:34:42.672336] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:29.269 [2024-07-25 06:34:42.672347] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4cd00 name raid_bdev1, state offline 00:18:29.269 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1145191 00:18:29.269 [2024-07-25 06:34:42.695865] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:29.528 06:34:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:18:29.528 00:18:29.528 real 0m13.179s 00:18:29.528 user 0m23.627s 00:18:29.528 sys 0m2.497s 00:18:29.528 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:29.528 06:34:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.528 ************************************ 00:18:29.528 END TEST raid_superblock_test 00:18:29.528 ************************************ 00:18:29.528 06:34:42 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:18:29.528 06:34:42 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:29.528 06:34:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:29.528 06:34:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:29.528 ************************************ 00:18:29.528 START TEST raid_read_error_test 00:18:29.528 ************************************ 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 read 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.ui06vsqDcF 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1147819 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1147819 /var/tmp/spdk-raid.sock 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1147819 ']' 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:29.528 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:29.528 06:34:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.528 [2024-07-25 06:34:43.025540] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
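bdevperf, not the kernel, drives I/O for the error tests. Condensing the setup traced here into a sketch (binary path, socket, and options are the ones from this run; redirecting bdevperf's output into the mktemp log is an assumption about how the script wires up the later failure-rate check):

    # start bdevperf idle (-z waits for RPC) and keep its output for the
    # failure-rate check at the end of the test (assumed redirection)
    bdevperf_log=$(mktemp -p /raidtest)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
    raid_pid=$!

    # block until the RPC socket is up before creating bdevs over it
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock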
00:18:29.528 [2024-07-25 06:34:43.025595] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1147819 ] 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:29.787 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:29.787 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:29.787 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:29.787 [2024-07-25 06:34:43.160587] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:29.787 [2024-07-25 06:34:43.204560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:29.787 [2024-07-25 06:34:43.262190] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:29.787 [2024-07-25 06:34:43.262232] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:30.723 06:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:30.723 06:34:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:30.723 06:34:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:30.723 06:34:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:30.723 BaseBdev1_malloc 00:18:30.723 06:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:30.981 true 00:18:30.981 06:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:31.239 [2024-07-25 06:34:44.593674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:31.239 [2024-07-25 06:34:44.593714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:31.239 [2024-07-25 06:34:44.593732] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd0fa60 00:18:31.239 [2024-07-25 06:34:44.593744] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:31.239 [2024-07-25 06:34:44.595208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:31.239 [2024-07-25 06:34:44.595237] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:31.239 BaseBdev1 00:18:31.239 06:34:44 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:31.239 06:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:31.497 BaseBdev2_malloc 00:18:31.497 06:34:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:31.497 true 00:18:31.755 06:34:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:31.755 [2024-07-25 06:34:45.255606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:31.755 [2024-07-25 06:34:45.255642] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:31.755 [2024-07-25 06:34:45.255661] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd14dc0 00:18:31.755 [2024-07-25 06:34:45.255673] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:31.755 [2024-07-25 06:34:45.256973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:31.755 [2024-07-25 06:34:45.256999] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:31.755 BaseBdev2 00:18:31.755 06:34:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:31.755 06:34:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:32.014 BaseBdev3_malloc 00:18:32.014 06:34:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:32.366 true 00:18:32.366 06:34:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:32.626 [2024-07-25 06:34:45.945801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:32.626 [2024-07-25 06:34:45.945843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:32.626 [2024-07-25 06:34:45.945863] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd15420 00:18:32.626 [2024-07-25 06:34:45.945875] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:32.626 [2024-07-25 06:34:45.947340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:32.626 [2024-07-25 06:34:45.947368] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:32.626 BaseBdev3 00:18:32.626 06:34:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:32.626 [2024-07-25 06:34:46.114278] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:32.626 [2024-07-25 06:34:46.115341] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:32.626 [2024-07-25 06:34:46.115405] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:32.626 [2024-07-25 06:34:46.115588] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd181d0 00:18:32.626 [2024-07-25 06:34:46.115599] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:32.626 [2024-07-25 06:34:46.115766] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb6b420 00:18:32.626 [2024-07-25 06:34:46.115901] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd181d0 00:18:32.626 [2024-07-25 06:34:46.115910] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd181d0 00:18:32.626 [2024-07-25 06:34:46.115999] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.626 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.885 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.885 "name": "raid_bdev1", 00:18:32.885 "uuid": "03416a7c-6f16-4424-91b2-e79cc03ee451", 00:18:32.885 "strip_size_kb": 64, 00:18:32.885 "state": "online", 00:18:32.885 "raid_level": "concat", 00:18:32.885 "superblock": true, 00:18:32.885 "num_base_bdevs": 3, 00:18:32.885 "num_base_bdevs_discovered": 3, 00:18:32.885 "num_base_bdevs_operational": 3, 00:18:32.886 "base_bdevs_list": [ 00:18:32.886 { 00:18:32.886 "name": "BaseBdev1", 00:18:32.886 "uuid": "758f28e0-7f3c-5242-91d4-e318308ea559", 00:18:32.886 "is_configured": true, 00:18:32.886 "data_offset": 2048, 00:18:32.886 "data_size": 63488 00:18:32.886 }, 00:18:32.886 { 00:18:32.886 "name": "BaseBdev2", 00:18:32.886 "uuid": "048295fe-0108-53c8-a0ab-b1eeafaca6dc", 00:18:32.886 "is_configured": true, 00:18:32.886 "data_offset": 2048, 00:18:32.886 "data_size": 63488 00:18:32.886 }, 00:18:32.886 { 00:18:32.886 "name": "BaseBdev3", 00:18:32.886 "uuid": "bd8c2aa9-ad7c-5aa8-b08c-b06fe8d57494", 00:18:32.886 "is_configured": true, 00:18:32.886 "data_offset": 2048, 00:18:32.886 "data_size": 63488 
00:18:32.886 } 00:18:32.886 ] 00:18:32.886 }' 00:18:32.886 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.886 06:34:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.453 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:18:33.453 06:34:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:33.712 [2024-07-25 06:34:47.032918] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd14650 00:18:34.650 06:34:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.650 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:34.910 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.910 "name": "raid_bdev1", 00:18:34.910 "uuid": "03416a7c-6f16-4424-91b2-e79cc03ee451", 00:18:34.910 "strip_size_kb": 64, 00:18:34.910 "state": "online", 00:18:34.910 "raid_level": "concat", 00:18:34.910 "superblock": true, 00:18:34.910 "num_base_bdevs": 3, 00:18:34.910 "num_base_bdevs_discovered": 3, 00:18:34.910 "num_base_bdevs_operational": 3, 00:18:34.910 "base_bdevs_list": [ 00:18:34.910 { 00:18:34.910 "name": "BaseBdev1", 00:18:34.910 "uuid": "758f28e0-7f3c-5242-91d4-e318308ea559", 00:18:34.910 "is_configured": true, 00:18:34.910 "data_offset": 2048, 00:18:34.910 "data_size": 63488 00:18:34.910 }, 00:18:34.910 { 00:18:34.910 "name": "BaseBdev2", 00:18:34.910 "uuid": "048295fe-0108-53c8-a0ab-b1eeafaca6dc", 00:18:34.910 "is_configured": true, 00:18:34.910 "data_offset": 2048, 
00:18:34.910 "data_size": 63488 00:18:34.910 }, 00:18:34.910 { 00:18:34.910 "name": "BaseBdev3", 00:18:34.910 "uuid": "bd8c2aa9-ad7c-5aa8-b08c-b06fe8d57494", 00:18:34.910 "is_configured": true, 00:18:34.910 "data_offset": 2048, 00:18:34.910 "data_size": 63488 00:18:34.910 } 00:18:34.910 ] 00:18:34.910 }' 00:18:34.910 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.910 06:34:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.478 06:34:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:35.738 [2024-07-25 06:34:49.162990] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:35.738 [2024-07-25 06:34:49.163022] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:35.738 [2024-07-25 06:34:49.165933] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:35.738 [2024-07-25 06:34:49.165965] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:35.738 [2024-07-25 06:34:49.165993] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:35.738 [2024-07-25 06:34:49.166003] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd181d0 name raid_bdev1, state offline 00:18:35.738 0 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1147819 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1147819 ']' 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1147819 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1147819 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1147819' 00:18:35.738 killing process with pid 1147819 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1147819 00:18:35.738 [2024-07-25 06:34:49.238612] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:35.738 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1147819 00:18:35.738 [2024-07-25 06:34:49.257253] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.ui06vsqDcF 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@213 -- # case $1 in 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:18:35.998 00:18:35.998 real 0m6.499s 00:18:35.998 user 0m10.203s 00:18:35.998 sys 0m1.163s 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:35.998 06:34:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.998 ************************************ 00:18:35.998 END TEST raid_read_error_test 00:18:35.998 ************************************ 00:18:35.998 06:34:49 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:18:35.998 06:34:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:35.998 06:34:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:35.998 06:34:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:35.998 ************************************ 00:18:35.998 START TEST raid_write_error_test 00:18:35.998 ************************************ 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 write 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:35.998 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:36.258 06:34:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.k4AImX63Ev 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1148981 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1148981 /var/tmp/spdk-raid.sock 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1148981 ']' 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:36.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:36.258 06:34:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:36.258 [2024-07-25 06:34:49.622380] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
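Note: the job above starts bdevperf idle (-z) so the RAID stack can be assembled over RPC before any I/O runs. A minimal way to reproduce that launch by hand, assuming the SPDK checkout at the workspace path used in this run, with a simple polling loop standing in for the harness's waitforlisten helper and the output capture being a sketch rather than the script's exact redirection:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock
  BDEVPERF_LOG=$(mktemp -p /raidtest)

  # Same workload flags as the trace above: 60 s randrw, 50% read mix,
  # 128 KiB I/O, queue depth 1, started idle (-z) with bdev_raid debug logging.
  "$SPDK/build/examples/bdevperf" -r "$SOCK" -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 128k -q 1 -z -f -L bdev_raid > "$BDEVPERF_LOG" 2>&1 &

  # Simplified stand-in for waitforlisten: poll a cheap RPC until the socket answers.
  until "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods > /dev/null 2>&1; do
      sleep 0.5
  done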
00:18:36.258 [2024-07-25 06:34:49.622446] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1148981 ] 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:36.258 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.258 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:36.259 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:36.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:36.259 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:36.259 [2024-07-25 06:34:49.759918] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:36.259 [2024-07-25 06:34:49.802668] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:36.518 [2024-07-25 06:34:49.865947] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:36.518 [2024-07-25 06:34:49.865985] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:37.085 06:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:37.085 06:34:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:37.085 06:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:37.085 06:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:37.343 BaseBdev1_malloc 00:18:37.343 06:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:37.602 true 00:18:37.602 06:34:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:37.861 [2024-07-25 06:34:51.182476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:37.861 [2024-07-25 06:34:51.182521] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:37.861 [2024-07-25 06:34:51.182537] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe8a60 00:18:37.861 [2024-07-25 06:34:51.182548] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:37.861 [2024-07-25 06:34:51.183940] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:37.861 [2024-07-25 06:34:51.183970] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:37.861 BaseBdev1 00:18:37.861 06:34:51 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:37.861 06:34:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:37.861 BaseBdev2_malloc 00:18:38.120 06:34:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:38.120 true 00:18:38.120 06:34:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:38.379 [2024-07-25 06:34:51.864369] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:38.379 [2024-07-25 06:34:51.864405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:38.379 [2024-07-25 06:34:51.864424] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfeddc0 00:18:38.379 [2024-07-25 06:34:51.864435] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:38.379 [2024-07-25 06:34:51.865715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:38.379 [2024-07-25 06:34:51.865743] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:38.379 BaseBdev2 00:18:38.379 06:34:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:38.379 06:34:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:38.638 BaseBdev3_malloc 00:18:38.638 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:38.898 true 00:18:38.898 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:39.158 [2024-07-25 06:34:52.562353] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:39.158 [2024-07-25 06:34:52.562389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:39.158 [2024-07-25 06:34:52.562405] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfee420 00:18:39.158 [2024-07-25 06:34:52.562416] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:39.158 [2024-07-25 06:34:52.563685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:39.158 [2024-07-25 06:34:52.563712] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:39.158 BaseBdev3 00:18:39.158 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:39.417 [2024-07-25 06:34:52.782966] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:39.417 [2024-07-25 06:34:52.784011] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:39.417 [2024-07-25 06:34:52.784071] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:39.417 [2024-07-25 06:34:52.784259] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xff11d0 00:18:39.417 [2024-07-25 06:34:52.784269] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:39.417 [2024-07-25 06:34:52.784422] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe44420 00:18:39.417 [2024-07-25 06:34:52.784554] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xff11d0 00:18:39.417 [2024-07-25 06:34:52.784563] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xff11d0 00:18:39.417 [2024-07-25 06:34:52.784648] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.417 06:34:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.677 06:34:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.677 "name": "raid_bdev1", 00:18:39.677 "uuid": "cb6c231f-69fd-4c66-97ff-e29290405a5d", 00:18:39.677 "strip_size_kb": 64, 00:18:39.677 "state": "online", 00:18:39.677 "raid_level": "concat", 00:18:39.677 "superblock": true, 00:18:39.677 "num_base_bdevs": 3, 00:18:39.677 "num_base_bdevs_discovered": 3, 00:18:39.677 "num_base_bdevs_operational": 3, 00:18:39.677 "base_bdevs_list": [ 00:18:39.677 { 00:18:39.677 "name": "BaseBdev1", 00:18:39.677 "uuid": "1a6d7c47-b2ec-551f-883e-ca89bb584d05", 00:18:39.677 "is_configured": true, 00:18:39.677 "data_offset": 2048, 00:18:39.677 "data_size": 63488 00:18:39.677 }, 00:18:39.677 { 00:18:39.677 "name": "BaseBdev2", 00:18:39.677 "uuid": "5c0d740c-39bb-54d8-be36-a48b981b6dfb", 00:18:39.677 "is_configured": true, 00:18:39.677 "data_offset": 2048, 00:18:39.677 "data_size": 63488 00:18:39.677 }, 00:18:39.677 { 00:18:39.677 "name": "BaseBdev3", 00:18:39.677 "uuid": "694236b4-b9b4-5337-bf66-a8de6885715e", 00:18:39.677 "is_configured": true, 00:18:39.677 "data_offset": 2048, 00:18:39.677 
"data_size": 63488 00:18:39.677 } 00:18:39.677 ] 00:18:39.677 }' 00:18:39.677 06:34:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.677 06:34:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.245 06:34:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:18:40.245 06:34:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:40.245 [2024-07-25 06:34:53.685739] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfed650 00:18:41.180 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.438 06:34:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:41.697 06:34:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.697 "name": "raid_bdev1", 00:18:41.697 "uuid": "cb6c231f-69fd-4c66-97ff-e29290405a5d", 00:18:41.697 "strip_size_kb": 64, 00:18:41.697 "state": "online", 00:18:41.697 "raid_level": "concat", 00:18:41.697 "superblock": true, 00:18:41.697 "num_base_bdevs": 3, 00:18:41.697 "num_base_bdevs_discovered": 3, 00:18:41.697 "num_base_bdevs_operational": 3, 00:18:41.697 "base_bdevs_list": [ 00:18:41.697 { 00:18:41.697 "name": "BaseBdev1", 00:18:41.697 "uuid": "1a6d7c47-b2ec-551f-883e-ca89bb584d05", 00:18:41.697 "is_configured": true, 00:18:41.697 "data_offset": 2048, 00:18:41.697 "data_size": 63488 00:18:41.697 }, 00:18:41.697 { 00:18:41.697 "name": "BaseBdev2", 00:18:41.697 "uuid": "5c0d740c-39bb-54d8-be36-a48b981b6dfb", 00:18:41.697 "is_configured": 
true, 00:18:41.697 "data_offset": 2048, 00:18:41.697 "data_size": 63488 00:18:41.697 }, 00:18:41.697 { 00:18:41.697 "name": "BaseBdev3", 00:18:41.697 "uuid": "694236b4-b9b4-5337-bf66-a8de6885715e", 00:18:41.697 "is_configured": true, 00:18:41.697 "data_offset": 2048, 00:18:41.697 "data_size": 63488 00:18:41.697 } 00:18:41.697 ] 00:18:41.697 }' 00:18:41.697 06:34:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.697 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.263 06:34:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:42.522 [2024-07-25 06:34:55.860646] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:42.522 [2024-07-25 06:34:55.860681] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:42.522 [2024-07-25 06:34:55.863598] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:42.522 [2024-07-25 06:34:55.863632] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:42.522 [2024-07-25 06:34:55.863661] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:42.522 [2024-07-25 06:34:55.863670] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xff11d0 name raid_bdev1, state offline 00:18:42.522 0 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1148981 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1148981 ']' 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1148981 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1148981 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1148981' 00:18:42.522 killing process with pid 1148981 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1148981 00:18:42.522 [2024-07-25 06:34:55.936062] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:42.522 06:34:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1148981 00:18:42.522 [2024-07-25 06:34:55.955222] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.k4AImX63Ev 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy 
concat 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:18:42.781 00:18:42.781 real 0m6.607s 00:18:42.781 user 0m10.388s 00:18:42.781 sys 0m1.182s 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:42.781 06:34:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.781 ************************************ 00:18:42.781 END TEST raid_write_error_test 00:18:42.781 ************************************ 00:18:42.781 06:34:56 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:18:42.781 06:34:56 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:18:42.781 06:34:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:42.781 06:34:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:42.781 06:34:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:42.781 ************************************ 00:18:42.781 START TEST raid_state_function_test 00:18:42.781 ************************************ 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 false 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 
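Note: the verdict just computed (fail_per_s=0.46) closes out raid_write_error_test, which is the same raid_io_error_test flow as the read variant, only with the failure injected on the write path. Condensed into the underlying RPC sequence, using the paths, names, and flags from this run; SPDK, RPC, and BDEVPERF_LOG are shorthands, with BDEVPERF_LOG standing for the file (tmp.k4AImX63Ev here) that bdevperf's output was captured to:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Each base bdev is a malloc disk wrapped in an error bdev and a passthru bdev,
  # so failures can be injected beneath the RAID layer.
  for i in 1 2 3; do
      $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
      $RPC bdev_error_create BaseBdev${i}_malloc
      $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
  done

  # concat with a 64 KiB strip and an on-disk superblock (-s).
  $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

  # Make every write to the first base bdev fail, then kick off the queued bdevperf job.
  $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
  "$SPDK/examples/bdev/bdevperf/bdevperf.py" -s /var/tmp/spdk-raid.sock perform_tests

  # concat carries no redundancy, so the injected errors must surface: column 6 of the
  # raid_bdev1 line in the bdevperf log is the failures-per-second figure checked above.
  fail_per_s=$(grep -v Job "$BDEVPERF_LOG" | grep raid_bdev1 | awk '{print $6}')
  [[ "$fail_per_s" != "0.00" ]] && echo "PASS: ${fail_per_s} failures/s"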
00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1150137 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1150137' 00:18:42.781 Process raid pid: 1150137 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1150137 /var/tmp/spdk-raid.sock 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1150137 ']' 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:42.781 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.781 06:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:42.781 [2024-07-25 06:34:56.290818] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
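Note: raid_state_function_test drives no I/O at all, so instead of bdevperf it starts the bare bdev_svc test application and then exercises the raid module purely over RPC; the EAL/QAT probe noise that follows is the same device scan every SPDK app in this environment performs. A minimal launch matching the trace, under the same workspace-path assumption as the earlier sketch:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK/test/app/bdev_svc/bdev_svc" -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  # The harness then waits for the RPC socket with waitforlisten, exactly as for bdevperf.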
00:18:42.781 [2024-07-25 06:34:56.290871] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.0 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.1 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.2 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.3 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.4 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.5 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.6 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:01.7 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.0 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.1 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.2 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.3 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.4 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.5 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.6 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3d:02.7 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.0 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.1 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.2 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.3 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.4 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.5 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.6 cannot be used 00:18:43.040 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:01.7 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.0 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.1 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.2 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.3 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.4 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.5 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.6 cannot be used 00:18:43.040 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:18:43.040 EAL: Requested device 0000:3f:02.7 cannot be used 00:18:43.040 [2024-07-25 06:34:56.427054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:43.040 [2024-07-25 06:34:56.471452] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:43.040 [2024-07-25 06:34:56.531834] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:43.040 [2024-07-25 06:34:56.531868] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:43.975 06:34:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:43.975 06:34:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:18:43.975 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:44.234 [2024-07-25 06:34:57.648873] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:44.234 [2024-07-25 06:34:57.648909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:44.234 [2024-07-25 06:34:57.648919] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:44.234 [2024-07-25 06:34:57.648930] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:44.234 [2024-07-25 06:34:57.648938] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:44.234 [2024-07-25 06:34:57.648947] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
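Note: the bdev_raid_create call above succeeds even though none of the three base bdevs exist yet; the raid bdev is registered in the "configuring" state and waits for its members, which is what the verify_raid_bdev_state helper now begins to check. Its probe reduces to the following sketch, reusing the rpc.py socket from this run; pulling out .state with jq is a shorthand for the helper's field-by-field comparison:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # Base bdevs may be named before they exist; the array sits in "configuring"
  # with num_base_bdevs_discovered=0 until all of them appear.
  $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  state=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state')
  [[ "$state" == "configuring" ]] && echo "Existed_Raid is waiting for its base bdevs"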
00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.234 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.492 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.492 "name": "Existed_Raid", 00:18:44.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.492 "strip_size_kb": 0, 00:18:44.492 "state": "configuring", 00:18:44.492 "raid_level": "raid1", 00:18:44.492 "superblock": false, 00:18:44.492 "num_base_bdevs": 3, 00:18:44.492 "num_base_bdevs_discovered": 0, 00:18:44.492 "num_base_bdevs_operational": 3, 00:18:44.492 "base_bdevs_list": [ 00:18:44.492 { 00:18:44.492 "name": "BaseBdev1", 00:18:44.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.492 "is_configured": false, 00:18:44.492 "data_offset": 0, 00:18:44.492 "data_size": 0 00:18:44.492 }, 00:18:44.492 { 00:18:44.492 "name": "BaseBdev2", 00:18:44.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.492 "is_configured": false, 00:18:44.492 "data_offset": 0, 00:18:44.492 "data_size": 0 00:18:44.492 }, 00:18:44.492 { 00:18:44.492 "name": "BaseBdev3", 00:18:44.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.492 "is_configured": false, 00:18:44.492 "data_offset": 0, 00:18:44.492 "data_size": 0 00:18:44.492 } 00:18:44.492 ] 00:18:44.492 }' 00:18:44.492 06:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.492 06:34:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.427 06:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:45.427 [2024-07-25 06:34:58.960183] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:45.427 [2024-07-25 06:34:58.960209] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2750470 name Existed_Raid, state configuring 00:18:45.427 06:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:45.687 [2024-07-25 06:34:59.188794] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:45.687 [2024-07-25 06:34:59.188820] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:45.687 [2024-07-25 06:34:59.188829] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:45.687 [2024-07-25 06:34:59.188839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 
00:18:45.687 [2024-07-25 06:34:59.188847] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:45.687 [2024-07-25 06:34:59.188857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:45.687 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:45.947 [2024-07-25 06:34:59.426736] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:45.947 BaseBdev1 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:45.947 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:46.205 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:46.463 [ 00:18:46.463 { 00:18:46.463 "name": "BaseBdev1", 00:18:46.463 "aliases": [ 00:18:46.463 "21e7b150-568b-440c-8344-e4756fc5d822" 00:18:46.463 ], 00:18:46.463 "product_name": "Malloc disk", 00:18:46.463 "block_size": 512, 00:18:46.463 "num_blocks": 65536, 00:18:46.463 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:46.463 "assigned_rate_limits": { 00:18:46.463 "rw_ios_per_sec": 0, 00:18:46.463 "rw_mbytes_per_sec": 0, 00:18:46.463 "r_mbytes_per_sec": 0, 00:18:46.463 "w_mbytes_per_sec": 0 00:18:46.463 }, 00:18:46.463 "claimed": true, 00:18:46.463 "claim_type": "exclusive_write", 00:18:46.463 "zoned": false, 00:18:46.463 "supported_io_types": { 00:18:46.463 "read": true, 00:18:46.463 "write": true, 00:18:46.463 "unmap": true, 00:18:46.463 "flush": true, 00:18:46.463 "reset": true, 00:18:46.463 "nvme_admin": false, 00:18:46.463 "nvme_io": false, 00:18:46.463 "nvme_io_md": false, 00:18:46.463 "write_zeroes": true, 00:18:46.463 "zcopy": true, 00:18:46.463 "get_zone_info": false, 00:18:46.463 "zone_management": false, 00:18:46.463 "zone_append": false, 00:18:46.463 "compare": false, 00:18:46.463 "compare_and_write": false, 00:18:46.463 "abort": true, 00:18:46.463 "seek_hole": false, 00:18:46.463 "seek_data": false, 00:18:46.463 "copy": true, 00:18:46.463 "nvme_iov_md": false 00:18:46.463 }, 00:18:46.463 "memory_domains": [ 00:18:46.463 { 00:18:46.463 "dma_device_id": "system", 00:18:46.463 "dma_device_type": 1 00:18:46.463 }, 00:18:46.463 { 00:18:46.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.463 "dma_device_type": 2 00:18:46.463 } 00:18:46.463 ], 00:18:46.463 "driver_specific": {} 00:18:46.463 } 00:18:46.463 ] 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.463 06:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.722 06:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.722 "name": "Existed_Raid", 00:18:46.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.722 "strip_size_kb": 0, 00:18:46.722 "state": "configuring", 00:18:46.722 "raid_level": "raid1", 00:18:46.722 "superblock": false, 00:18:46.722 "num_base_bdevs": 3, 00:18:46.722 "num_base_bdevs_discovered": 1, 00:18:46.722 "num_base_bdevs_operational": 3, 00:18:46.722 "base_bdevs_list": [ 00:18:46.722 { 00:18:46.722 "name": "BaseBdev1", 00:18:46.722 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:46.722 "is_configured": true, 00:18:46.722 "data_offset": 0, 00:18:46.722 "data_size": 65536 00:18:46.722 }, 00:18:46.722 { 00:18:46.722 "name": "BaseBdev2", 00:18:46.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.722 "is_configured": false, 00:18:46.722 "data_offset": 0, 00:18:46.722 "data_size": 0 00:18:46.722 }, 00:18:46.722 { 00:18:46.722 "name": "BaseBdev3", 00:18:46.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.722 "is_configured": false, 00:18:46.722 "data_offset": 0, 00:18:46.722 "data_size": 0 00:18:46.722 } 00:18:46.722 ] 00:18:46.722 }' 00:18:46.722 06:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.722 06:35:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.289 06:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:47.547 [2024-07-25 06:35:00.902716] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:47.547 [2024-07-25 06:35:00.902752] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274fce0 name Existed_Raid, state configuring 00:18:47.547 06:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n 
Existed_Raid 00:18:47.805 [2024-07-25 06:35:01.131345] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:47.805 [2024-07-25 06:35:01.132757] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:47.805 [2024-07-25 06:35:01.132790] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:47.805 [2024-07-25 06:35:01.132802] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:47.805 [2024-07-25 06:35:01.132813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.805 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.063 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.063 "name": "Existed_Raid", 00:18:48.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.063 "strip_size_kb": 0, 00:18:48.063 "state": "configuring", 00:18:48.063 "raid_level": "raid1", 00:18:48.063 "superblock": false, 00:18:48.063 "num_base_bdevs": 3, 00:18:48.063 "num_base_bdevs_discovered": 1, 00:18:48.063 "num_base_bdevs_operational": 3, 00:18:48.063 "base_bdevs_list": [ 00:18:48.063 { 00:18:48.063 "name": "BaseBdev1", 00:18:48.063 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:48.063 "is_configured": true, 00:18:48.063 "data_offset": 0, 00:18:48.063 "data_size": 65536 00:18:48.063 }, 00:18:48.063 { 00:18:48.063 "name": "BaseBdev2", 00:18:48.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.063 "is_configured": false, 00:18:48.063 "data_offset": 0, 00:18:48.063 "data_size": 0 00:18:48.063 }, 00:18:48.063 { 00:18:48.063 "name": "BaseBdev3", 00:18:48.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.063 "is_configured": false, 00:18:48.063 "data_offset": 0, 00:18:48.063 "data_size": 0 00:18:48.063 } 00:18:48.063 ] 
00:18:48.063 }' 00:18:48.063 06:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.063 06:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.995 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:48.995 [2024-07-25 06:35:02.457927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:48.995 BaseBdev2 00:18:48.995 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:48.996 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:18:48.996 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:48.996 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:48.996 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:48.996 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:48.996 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:49.253 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:49.511 [ 00:18:49.511 { 00:18:49.511 "name": "BaseBdev2", 00:18:49.511 "aliases": [ 00:18:49.511 "da6dd4a5-2283-44f9-acf9-09cbb637760f" 00:18:49.511 ], 00:18:49.511 "product_name": "Malloc disk", 00:18:49.511 "block_size": 512, 00:18:49.511 "num_blocks": 65536, 00:18:49.511 "uuid": "da6dd4a5-2283-44f9-acf9-09cbb637760f", 00:18:49.511 "assigned_rate_limits": { 00:18:49.511 "rw_ios_per_sec": 0, 00:18:49.511 "rw_mbytes_per_sec": 0, 00:18:49.511 "r_mbytes_per_sec": 0, 00:18:49.511 "w_mbytes_per_sec": 0 00:18:49.511 }, 00:18:49.511 "claimed": true, 00:18:49.511 "claim_type": "exclusive_write", 00:18:49.511 "zoned": false, 00:18:49.511 "supported_io_types": { 00:18:49.511 "read": true, 00:18:49.511 "write": true, 00:18:49.511 "unmap": true, 00:18:49.511 "flush": true, 00:18:49.511 "reset": true, 00:18:49.511 "nvme_admin": false, 00:18:49.511 "nvme_io": false, 00:18:49.511 "nvme_io_md": false, 00:18:49.511 "write_zeroes": true, 00:18:49.511 "zcopy": true, 00:18:49.511 "get_zone_info": false, 00:18:49.511 "zone_management": false, 00:18:49.511 "zone_append": false, 00:18:49.511 "compare": false, 00:18:49.511 "compare_and_write": false, 00:18:49.511 "abort": true, 00:18:49.511 "seek_hole": false, 00:18:49.511 "seek_data": false, 00:18:49.511 "copy": true, 00:18:49.511 "nvme_iov_md": false 00:18:49.511 }, 00:18:49.511 "memory_domains": [ 00:18:49.511 { 00:18:49.511 "dma_device_id": "system", 00:18:49.511 "dma_device_type": 1 00:18:49.511 }, 00:18:49.511 { 00:18:49.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.512 "dma_device_type": 2 00:18:49.512 } 00:18:49.512 ], 00:18:49.512 "driver_specific": {} 00:18:49.512 } 00:18:49.512 ] 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:49.512 
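Note: the waitforbdev helper invoked here makes the freshly created malloc bdev visible before the test proceeds: it flushes pending examine callbacks and then asks for the bdev by name with a bounded timeout (the bdev_get_bdevs call that follows). Boiled down, with the same socket and the 2000 timeout value the harness passes:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $RPC bdev_malloc_create 32 512 -b BaseBdev2   # 32 MB backing store, 512-byte blocks
  $RPC bdev_wait_for_examine                    # let examine/claim callbacks settle
  $RPC bdev_get_bdevs -b BaseBdev2 -t 2000      # errors out if the bdev never registers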
06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.512 06:35:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:49.770 06:35:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.770 "name": "Existed_Raid", 00:18:49.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.770 "strip_size_kb": 0, 00:18:49.770 "state": "configuring", 00:18:49.770 "raid_level": "raid1", 00:18:49.770 "superblock": false, 00:18:49.770 "num_base_bdevs": 3, 00:18:49.770 "num_base_bdevs_discovered": 2, 00:18:49.770 "num_base_bdevs_operational": 3, 00:18:49.770 "base_bdevs_list": [ 00:18:49.770 { 00:18:49.770 "name": "BaseBdev1", 00:18:49.770 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:49.770 "is_configured": true, 00:18:49.770 "data_offset": 0, 00:18:49.770 "data_size": 65536 00:18:49.770 }, 00:18:49.770 { 00:18:49.770 "name": "BaseBdev2", 00:18:49.770 "uuid": "da6dd4a5-2283-44f9-acf9-09cbb637760f", 00:18:49.770 "is_configured": true, 00:18:49.770 "data_offset": 0, 00:18:49.770 "data_size": 65536 00:18:49.770 }, 00:18:49.770 { 00:18:49.770 "name": "BaseBdev3", 00:18:49.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.770 "is_configured": false, 00:18:49.770 "data_offset": 0, 00:18:49.770 "data_size": 0 00:18:49.770 } 00:18:49.770 ] 00:18:49.770 }' 00:18:49.770 06:35:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.770 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.366 06:35:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:50.366 [2024-07-25 06:35:03.920847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:50.366 [2024-07-25 06:35:03.920886] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2903380 00:18:50.366 [2024-07-25 06:35:03.920894] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: 
blockcnt 65536, blocklen 512 00:18:50.366 [2024-07-25 06:35:03.921067] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28fc360 00:18:50.366 [2024-07-25 06:35:03.921192] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2903380 00:18:50.366 [2024-07-25 06:35:03.921201] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2903380 00:18:50.366 [2024-07-25 06:35:03.921346] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:50.624 BaseBdev3 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:50.624 06:35:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:50.624 06:35:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:50.882 [ 00:18:50.882 { 00:18:50.882 "name": "BaseBdev3", 00:18:50.882 "aliases": [ 00:18:50.882 "69b66360-c69d-4b2a-ab59-fc6260bde138" 00:18:50.882 ], 00:18:50.882 "product_name": "Malloc disk", 00:18:50.882 "block_size": 512, 00:18:50.882 "num_blocks": 65536, 00:18:50.882 "uuid": "69b66360-c69d-4b2a-ab59-fc6260bde138", 00:18:50.882 "assigned_rate_limits": { 00:18:50.882 "rw_ios_per_sec": 0, 00:18:50.882 "rw_mbytes_per_sec": 0, 00:18:50.882 "r_mbytes_per_sec": 0, 00:18:50.882 "w_mbytes_per_sec": 0 00:18:50.882 }, 00:18:50.882 "claimed": true, 00:18:50.882 "claim_type": "exclusive_write", 00:18:50.882 "zoned": false, 00:18:50.882 "supported_io_types": { 00:18:50.882 "read": true, 00:18:50.882 "write": true, 00:18:50.882 "unmap": true, 00:18:50.882 "flush": true, 00:18:50.882 "reset": true, 00:18:50.882 "nvme_admin": false, 00:18:50.882 "nvme_io": false, 00:18:50.882 "nvme_io_md": false, 00:18:50.882 "write_zeroes": true, 00:18:50.882 "zcopy": true, 00:18:50.882 "get_zone_info": false, 00:18:50.882 "zone_management": false, 00:18:50.882 "zone_append": false, 00:18:50.882 "compare": false, 00:18:50.882 "compare_and_write": false, 00:18:50.882 "abort": true, 00:18:50.882 "seek_hole": false, 00:18:50.882 "seek_data": false, 00:18:50.882 "copy": true, 00:18:50.882 "nvme_iov_md": false 00:18:50.882 }, 00:18:50.882 "memory_domains": [ 00:18:50.882 { 00:18:50.882 "dma_device_id": "system", 00:18:50.882 "dma_device_type": 1 00:18:50.882 }, 00:18:50.882 { 00:18:50.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.882 "dma_device_type": 2 00:18:50.882 } 00:18:50.882 ], 00:18:50.882 "driver_specific": {} 00:18:50.882 } 00:18:50.882 ] 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:50.882 06:35:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.882 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.140 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.140 "name": "Existed_Raid", 00:18:51.140 "uuid": "bf7486df-58e5-49d8-b70f-3ab6ff1ad49d", 00:18:51.140 "strip_size_kb": 0, 00:18:51.140 "state": "online", 00:18:51.140 "raid_level": "raid1", 00:18:51.140 "superblock": false, 00:18:51.140 "num_base_bdevs": 3, 00:18:51.140 "num_base_bdevs_discovered": 3, 00:18:51.140 "num_base_bdevs_operational": 3, 00:18:51.140 "base_bdevs_list": [ 00:18:51.140 { 00:18:51.140 "name": "BaseBdev1", 00:18:51.140 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:51.140 "is_configured": true, 00:18:51.140 "data_offset": 0, 00:18:51.140 "data_size": 65536 00:18:51.140 }, 00:18:51.140 { 00:18:51.140 "name": "BaseBdev2", 00:18:51.140 "uuid": "da6dd4a5-2283-44f9-acf9-09cbb637760f", 00:18:51.140 "is_configured": true, 00:18:51.140 "data_offset": 0, 00:18:51.140 "data_size": 65536 00:18:51.140 }, 00:18:51.140 { 00:18:51.140 "name": "BaseBdev3", 00:18:51.140 "uuid": "69b66360-c69d-4b2a-ab59-fc6260bde138", 00:18:51.140 "is_configured": true, 00:18:51.140 "data_offset": 0, 00:18:51.140 "data_size": 65536 00:18:51.140 } 00:18:51.140 ] 00:18:51.140 }' 00:18:51.140 06:35:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.140 06:35:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.704 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:51.704 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:51.704 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:51.704 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:51.704 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
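The xtrace above shows how verify_raid_bdev_state and verify_raid_bdev_properties read the array back through the RPC socket. Condensed into a standalone form (a sketch only, assuming the target from this run is still listening on /var/tmp/spdk-raid.sock and the array is still named Existed_Raid), the state check is roughly:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Dump the raid bdev entry and pull out the fields the test asserts on.
  info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  echo "$info" | jq -r '.state'                       # "configuring" until all base bdevs attach, then "online"
  echo "$info" | jq -r '.raid_level'                  # raid1 in this run
  echo "$info" | jq -r '.num_base_bdevs_discovered'   # 3 of 3 at this point in the log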
00:18:51.704 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:51.705 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:51.705 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:51.962 [2024-07-25 06:35:05.364926] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:51.962 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:51.962 "name": "Existed_Raid", 00:18:51.962 "aliases": [ 00:18:51.962 "bf7486df-58e5-49d8-b70f-3ab6ff1ad49d" 00:18:51.962 ], 00:18:51.962 "product_name": "Raid Volume", 00:18:51.962 "block_size": 512, 00:18:51.962 "num_blocks": 65536, 00:18:51.962 "uuid": "bf7486df-58e5-49d8-b70f-3ab6ff1ad49d", 00:18:51.962 "assigned_rate_limits": { 00:18:51.962 "rw_ios_per_sec": 0, 00:18:51.962 "rw_mbytes_per_sec": 0, 00:18:51.962 "r_mbytes_per_sec": 0, 00:18:51.962 "w_mbytes_per_sec": 0 00:18:51.962 }, 00:18:51.962 "claimed": false, 00:18:51.962 "zoned": false, 00:18:51.962 "supported_io_types": { 00:18:51.962 "read": true, 00:18:51.962 "write": true, 00:18:51.962 "unmap": false, 00:18:51.962 "flush": false, 00:18:51.962 "reset": true, 00:18:51.962 "nvme_admin": false, 00:18:51.962 "nvme_io": false, 00:18:51.962 "nvme_io_md": false, 00:18:51.962 "write_zeroes": true, 00:18:51.962 "zcopy": false, 00:18:51.962 "get_zone_info": false, 00:18:51.962 "zone_management": false, 00:18:51.962 "zone_append": false, 00:18:51.962 "compare": false, 00:18:51.962 "compare_and_write": false, 00:18:51.962 "abort": false, 00:18:51.962 "seek_hole": false, 00:18:51.962 "seek_data": false, 00:18:51.962 "copy": false, 00:18:51.962 "nvme_iov_md": false 00:18:51.962 }, 00:18:51.962 "memory_domains": [ 00:18:51.962 { 00:18:51.962 "dma_device_id": "system", 00:18:51.962 "dma_device_type": 1 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.962 "dma_device_type": 2 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "dma_device_id": "system", 00:18:51.962 "dma_device_type": 1 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.962 "dma_device_type": 2 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "dma_device_id": "system", 00:18:51.962 "dma_device_type": 1 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.962 "dma_device_type": 2 00:18:51.962 } 00:18:51.962 ], 00:18:51.962 "driver_specific": { 00:18:51.962 "raid": { 00:18:51.962 "uuid": "bf7486df-58e5-49d8-b70f-3ab6ff1ad49d", 00:18:51.962 "strip_size_kb": 0, 00:18:51.962 "state": "online", 00:18:51.962 "raid_level": "raid1", 00:18:51.962 "superblock": false, 00:18:51.962 "num_base_bdevs": 3, 00:18:51.962 "num_base_bdevs_discovered": 3, 00:18:51.962 "num_base_bdevs_operational": 3, 00:18:51.962 "base_bdevs_list": [ 00:18:51.962 { 00:18:51.962 "name": "BaseBdev1", 00:18:51.962 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:51.962 "is_configured": true, 00:18:51.962 "data_offset": 0, 00:18:51.962 "data_size": 65536 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "name": "BaseBdev2", 00:18:51.962 "uuid": "da6dd4a5-2283-44f9-acf9-09cbb637760f", 00:18:51.962 "is_configured": true, 00:18:51.962 "data_offset": 0, 00:18:51.962 "data_size": 65536 00:18:51.962 }, 00:18:51.962 { 00:18:51.962 "name": "BaseBdev3", 00:18:51.963 "uuid": 
"69b66360-c69d-4b2a-ab59-fc6260bde138", 00:18:51.963 "is_configured": true, 00:18:51.963 "data_offset": 0, 00:18:51.963 "data_size": 65536 00:18:51.963 } 00:18:51.963 ] 00:18:51.963 } 00:18:51.963 } 00:18:51.963 }' 00:18:51.963 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:51.963 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:51.963 BaseBdev2 00:18:51.963 BaseBdev3' 00:18:51.963 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.963 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:51.963 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.221 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.221 "name": "BaseBdev1", 00:18:52.221 "aliases": [ 00:18:52.221 "21e7b150-568b-440c-8344-e4756fc5d822" 00:18:52.221 ], 00:18:52.221 "product_name": "Malloc disk", 00:18:52.221 "block_size": 512, 00:18:52.221 "num_blocks": 65536, 00:18:52.221 "uuid": "21e7b150-568b-440c-8344-e4756fc5d822", 00:18:52.221 "assigned_rate_limits": { 00:18:52.221 "rw_ios_per_sec": 0, 00:18:52.221 "rw_mbytes_per_sec": 0, 00:18:52.221 "r_mbytes_per_sec": 0, 00:18:52.221 "w_mbytes_per_sec": 0 00:18:52.221 }, 00:18:52.221 "claimed": true, 00:18:52.221 "claim_type": "exclusive_write", 00:18:52.221 "zoned": false, 00:18:52.221 "supported_io_types": { 00:18:52.221 "read": true, 00:18:52.221 "write": true, 00:18:52.221 "unmap": true, 00:18:52.221 "flush": true, 00:18:52.221 "reset": true, 00:18:52.221 "nvme_admin": false, 00:18:52.221 "nvme_io": false, 00:18:52.221 "nvme_io_md": false, 00:18:52.221 "write_zeroes": true, 00:18:52.221 "zcopy": true, 00:18:52.221 "get_zone_info": false, 00:18:52.221 "zone_management": false, 00:18:52.221 "zone_append": false, 00:18:52.221 "compare": false, 00:18:52.221 "compare_and_write": false, 00:18:52.221 "abort": true, 00:18:52.221 "seek_hole": false, 00:18:52.221 "seek_data": false, 00:18:52.221 "copy": true, 00:18:52.221 "nvme_iov_md": false 00:18:52.221 }, 00:18:52.221 "memory_domains": [ 00:18:52.221 { 00:18:52.221 "dma_device_id": "system", 00:18:52.221 "dma_device_type": 1 00:18:52.221 }, 00:18:52.221 { 00:18:52.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.221 "dma_device_type": 2 00:18:52.221 } 00:18:52.221 ], 00:18:52.221 "driver_specific": {} 00:18:52.221 }' 00:18:52.221 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.221 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.221 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.221 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.479 06:35:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:52.479 06:35:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.737 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.737 "name": "BaseBdev2", 00:18:52.737 "aliases": [ 00:18:52.737 "da6dd4a5-2283-44f9-acf9-09cbb637760f" 00:18:52.737 ], 00:18:52.737 "product_name": "Malloc disk", 00:18:52.737 "block_size": 512, 00:18:52.737 "num_blocks": 65536, 00:18:52.737 "uuid": "da6dd4a5-2283-44f9-acf9-09cbb637760f", 00:18:52.737 "assigned_rate_limits": { 00:18:52.737 "rw_ios_per_sec": 0, 00:18:52.737 "rw_mbytes_per_sec": 0, 00:18:52.737 "r_mbytes_per_sec": 0, 00:18:52.737 "w_mbytes_per_sec": 0 00:18:52.737 }, 00:18:52.737 "claimed": true, 00:18:52.737 "claim_type": "exclusive_write", 00:18:52.737 "zoned": false, 00:18:52.737 "supported_io_types": { 00:18:52.737 "read": true, 00:18:52.737 "write": true, 00:18:52.737 "unmap": true, 00:18:52.737 "flush": true, 00:18:52.737 "reset": true, 00:18:52.737 "nvme_admin": false, 00:18:52.737 "nvme_io": false, 00:18:52.737 "nvme_io_md": false, 00:18:52.737 "write_zeroes": true, 00:18:52.737 "zcopy": true, 00:18:52.737 "get_zone_info": false, 00:18:52.737 "zone_management": false, 00:18:52.737 "zone_append": false, 00:18:52.737 "compare": false, 00:18:52.737 "compare_and_write": false, 00:18:52.737 "abort": true, 00:18:52.737 "seek_hole": false, 00:18:52.737 "seek_data": false, 00:18:52.737 "copy": true, 00:18:52.737 "nvme_iov_md": false 00:18:52.737 }, 00:18:52.737 "memory_domains": [ 00:18:52.737 { 00:18:52.737 "dma_device_id": "system", 00:18:52.737 "dma_device_type": 1 00:18:52.737 }, 00:18:52.737 { 00:18:52.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.737 "dma_device_type": 2 00:18:52.737 } 00:18:52.737 ], 00:18:52.737 "driver_specific": {} 00:18:52.737 }' 00:18:52.737 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.737 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:52.995 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.252 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.252 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.252 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:53.252 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.252 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.252 "name": "BaseBdev3", 00:18:53.252 "aliases": [ 00:18:53.252 "69b66360-c69d-4b2a-ab59-fc6260bde138" 00:18:53.252 ], 00:18:53.252 "product_name": "Malloc disk", 00:18:53.252 "block_size": 512, 00:18:53.252 "num_blocks": 65536, 00:18:53.252 "uuid": "69b66360-c69d-4b2a-ab59-fc6260bde138", 00:18:53.252 "assigned_rate_limits": { 00:18:53.252 "rw_ios_per_sec": 0, 00:18:53.252 "rw_mbytes_per_sec": 0, 00:18:53.252 "r_mbytes_per_sec": 0, 00:18:53.252 "w_mbytes_per_sec": 0 00:18:53.252 }, 00:18:53.252 "claimed": true, 00:18:53.252 "claim_type": "exclusive_write", 00:18:53.252 "zoned": false, 00:18:53.252 "supported_io_types": { 00:18:53.252 "read": true, 00:18:53.252 "write": true, 00:18:53.252 "unmap": true, 00:18:53.252 "flush": true, 00:18:53.252 "reset": true, 00:18:53.252 "nvme_admin": false, 00:18:53.252 "nvme_io": false, 00:18:53.252 "nvme_io_md": false, 00:18:53.252 "write_zeroes": true, 00:18:53.252 "zcopy": true, 00:18:53.252 "get_zone_info": false, 00:18:53.252 "zone_management": false, 00:18:53.252 "zone_append": false, 00:18:53.252 "compare": false, 00:18:53.252 "compare_and_write": false, 00:18:53.252 "abort": true, 00:18:53.252 "seek_hole": false, 00:18:53.252 "seek_data": false, 00:18:53.252 "copy": true, 00:18:53.252 "nvme_iov_md": false 00:18:53.252 }, 00:18:53.252 "memory_domains": [ 00:18:53.252 { 00:18:53.252 "dma_device_id": "system", 00:18:53.252 "dma_device_type": 1 00:18:53.252 }, 00:18:53.252 { 00:18:53.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.252 "dma_device_type": 2 00:18:53.252 } 00:18:53.252 ], 00:18:53.252 "driver_specific": {} 00:18:53.252 }' 00:18:53.253 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.510 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.510 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.510 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.510 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.510 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.510 06:35:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.510 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.510 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.510 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.768 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.768 06:35:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.768 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:54.026 [2024-07-25 06:35:07.325929] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.027 "name": "Existed_Raid", 00:18:54.027 "uuid": "bf7486df-58e5-49d8-b70f-3ab6ff1ad49d", 00:18:54.027 "strip_size_kb": 0, 00:18:54.027 "state": "online", 00:18:54.027 "raid_level": "raid1", 00:18:54.027 "superblock": false, 00:18:54.027 "num_base_bdevs": 3, 00:18:54.027 "num_base_bdevs_discovered": 2, 00:18:54.027 "num_base_bdevs_operational": 2, 00:18:54.027 "base_bdevs_list": [ 00:18:54.027 { 00:18:54.027 "name": null, 00:18:54.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.027 "is_configured": false, 00:18:54.027 "data_offset": 0, 00:18:54.027 "data_size": 65536 00:18:54.027 }, 00:18:54.027 { 00:18:54.027 "name": "BaseBdev2", 00:18:54.027 "uuid": "da6dd4a5-2283-44f9-acf9-09cbb637760f", 00:18:54.027 "is_configured": true, 00:18:54.027 "data_offset": 0, 00:18:54.027 "data_size": 65536 00:18:54.027 }, 00:18:54.027 { 00:18:54.027 "name": "BaseBdev3", 00:18:54.027 "uuid": "69b66360-c69d-4b2a-ab59-fc6260bde138", 00:18:54.027 "is_configured": true, 00:18:54.027 "data_offset": 0, 00:18:54.027 "data_size": 65536 00:18:54.027 } 00:18:54.027 ] 00:18:54.027 }' 
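Because raid1 carries redundancy, the deletion of BaseBdev1 at step @274 above leaves Existed_Raid online with two of its three members, and the verify that follows asserts exactly that. The same removal-and-check sequence, reduced to the RPC calls seen in this log (a sketch, same rpc.py path and socket assumed), is:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  $rpc -s $sock bdev_malloc_delete BaseBdev1
  $rpc -s $sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .state, .num_base_bdevs_discovered, .num_base_bdevs_operational'
  # expected after the delete: online / 2 / 2, since a raid1 array tolerates one missing mirror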
00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.027 06:35:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:54.958 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:55.216 [2024-07-25 06:35:08.598216] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:55.216 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:55.216 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:55.217 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.217 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:55.475 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:55.475 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:55.475 06:35:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:55.733 [2024-07-25 06:35:09.053402] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:55.733 [2024-07-25 06:35:09.053474] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:55.733 [2024-07-25 06:35:09.063610] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:55.733 [2024-07-25 06:35:09.063637] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:55.733 [2024-07-25 06:35:09.063647] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2903380 name Existed_Raid, state offline 00:18:55.733 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:55.733 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:55.733 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.733 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:55.991 06:35:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:55.991 BaseBdev2 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:55.991 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:56.248 06:35:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:56.814 [ 00:18:56.814 { 00:18:56.814 "name": "BaseBdev2", 00:18:56.814 "aliases": [ 00:18:56.815 "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456" 00:18:56.815 ], 00:18:56.815 "product_name": "Malloc disk", 00:18:56.815 "block_size": 512, 00:18:56.815 "num_blocks": 65536, 00:18:56.815 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:18:56.815 "assigned_rate_limits": { 00:18:56.815 "rw_ios_per_sec": 0, 00:18:56.815 "rw_mbytes_per_sec": 0, 00:18:56.815 "r_mbytes_per_sec": 0, 00:18:56.815 "w_mbytes_per_sec": 0 00:18:56.815 }, 00:18:56.815 "claimed": false, 00:18:56.815 "zoned": false, 00:18:56.815 "supported_io_types": { 00:18:56.815 "read": true, 00:18:56.815 "write": true, 00:18:56.815 "unmap": true, 00:18:56.815 "flush": true, 00:18:56.815 "reset": true, 00:18:56.815 "nvme_admin": false, 00:18:56.815 "nvme_io": false, 00:18:56.815 "nvme_io_md": false, 00:18:56.815 "write_zeroes": true, 00:18:56.815 "zcopy": true, 00:18:56.815 "get_zone_info": false, 00:18:56.815 "zone_management": false, 00:18:56.815 "zone_append": false, 00:18:56.815 "compare": false, 00:18:56.815 "compare_and_write": false, 00:18:56.815 "abort": true, 00:18:56.815 "seek_hole": false, 00:18:56.815 "seek_data": false, 00:18:56.815 "copy": true, 00:18:56.815 "nvme_iov_md": false 00:18:56.815 }, 00:18:56.815 "memory_domains": [ 00:18:56.815 { 00:18:56.815 "dma_device_id": "system", 00:18:56.815 "dma_device_type": 1 00:18:56.815 }, 00:18:56.815 { 00:18:56.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.815 "dma_device_type": 2 00:18:56.815 } 00:18:56.815 ], 00:18:56.815 "driver_specific": {} 00:18:56.815 } 00:18:56.815 ] 00:18:56.815 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:56.815 06:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 
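The loop at @301-@303 above rebuilds the base devices one at a time through the waitforbdev helper. Stripped down to the raw RPCs visible in the trace (a sketch, same rpc.py path and socket assumptions as before), each iteration amounts to:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # 32 MB malloc bdev with 512-byte blocks (65536 blocks), matching the JSON dumps above
  $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev2
  $rpc -s $sock bdev_wait_for_examine
  $rpc -s $sock bdev_get_bdevs -b BaseBdev2 -t 2000   # waits up to 2000 ms for the bdev to appear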
00:18:56.815 06:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:56.815 06:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:57.073 BaseBdev3 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:57.073 06:35:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:57.639 06:35:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:57.897 [ 00:18:57.897 { 00:18:57.897 "name": "BaseBdev3", 00:18:57.897 "aliases": [ 00:18:57.897 "b636724b-a5a7-47f0-969f-3ab5e2827096" 00:18:57.897 ], 00:18:57.897 "product_name": "Malloc disk", 00:18:57.897 "block_size": 512, 00:18:57.897 "num_blocks": 65536, 00:18:57.897 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:18:57.897 "assigned_rate_limits": { 00:18:57.897 "rw_ios_per_sec": 0, 00:18:57.897 "rw_mbytes_per_sec": 0, 00:18:57.897 "r_mbytes_per_sec": 0, 00:18:57.897 "w_mbytes_per_sec": 0 00:18:57.897 }, 00:18:57.897 "claimed": false, 00:18:57.897 "zoned": false, 00:18:57.897 "supported_io_types": { 00:18:57.897 "read": true, 00:18:57.897 "write": true, 00:18:57.897 "unmap": true, 00:18:57.897 "flush": true, 00:18:57.897 "reset": true, 00:18:57.897 "nvme_admin": false, 00:18:57.897 "nvme_io": false, 00:18:57.897 "nvme_io_md": false, 00:18:57.897 "write_zeroes": true, 00:18:57.897 "zcopy": true, 00:18:57.897 "get_zone_info": false, 00:18:57.897 "zone_management": false, 00:18:57.897 "zone_append": false, 00:18:57.897 "compare": false, 00:18:57.897 "compare_and_write": false, 00:18:57.897 "abort": true, 00:18:57.897 "seek_hole": false, 00:18:57.897 "seek_data": false, 00:18:57.897 "copy": true, 00:18:57.897 "nvme_iov_md": false 00:18:57.897 }, 00:18:57.897 "memory_domains": [ 00:18:57.897 { 00:18:57.897 "dma_device_id": "system", 00:18:57.897 "dma_device_type": 1 00:18:57.897 }, 00:18:57.897 { 00:18:57.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.897 "dma_device_type": 2 00:18:57.897 } 00:18:57.897 ], 00:18:57.897 "driver_specific": {} 00:18:57.897 } 00:18:57.897 ] 00:18:57.897 06:35:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:57.897 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:57.897 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:57.897 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:58.463 [2024-07-25 06:35:11.722888] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:58.463 [2024-07-25 06:35:11.722926] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:58.463 [2024-07-25 06:35:11.722946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:58.463 [2024-07-25 06:35:11.724158] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:58.463 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:58.463 "name": "Existed_Raid", 00:18:58.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.463 "strip_size_kb": 0, 00:18:58.463 "state": "configuring", 00:18:58.463 "raid_level": "raid1", 00:18:58.463 "superblock": false, 00:18:58.463 "num_base_bdevs": 3, 00:18:58.463 "num_base_bdevs_discovered": 2, 00:18:58.463 "num_base_bdevs_operational": 3, 00:18:58.463 "base_bdevs_list": [ 00:18:58.463 { 00:18:58.463 "name": "BaseBdev1", 00:18:58.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.463 "is_configured": false, 00:18:58.463 "data_offset": 0, 00:18:58.463 "data_size": 0 00:18:58.463 }, 00:18:58.463 { 00:18:58.463 "name": "BaseBdev2", 00:18:58.463 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:18:58.463 "is_configured": true, 00:18:58.463 "data_offset": 0, 00:18:58.464 "data_size": 65536 00:18:58.464 }, 00:18:58.464 { 00:18:58.464 "name": "BaseBdev3", 00:18:58.464 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:18:58.464 "is_configured": true, 00:18:58.464 "data_offset": 0, 00:18:58.464 "data_size": 65536 00:18:58.464 } 00:18:58.464 ] 00:18:58.464 }' 00:18:58.464 06:35:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:58.464 06:35:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.030 06:35:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:59.595 [2024-07-25 06:35:13.018276] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.595 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.852 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.852 "name": "Existed_Raid", 00:18:59.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.852 "strip_size_kb": 0, 00:18:59.852 "state": "configuring", 00:18:59.852 "raid_level": "raid1", 00:18:59.852 "superblock": false, 00:18:59.852 "num_base_bdevs": 3, 00:18:59.852 "num_base_bdevs_discovered": 1, 00:18:59.852 "num_base_bdevs_operational": 3, 00:18:59.852 "base_bdevs_list": [ 00:18:59.852 { 00:18:59.852 "name": "BaseBdev1", 00:18:59.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.852 "is_configured": false, 00:18:59.852 "data_offset": 0, 00:18:59.852 "data_size": 0 00:18:59.852 }, 00:18:59.852 { 00:18:59.852 "name": null, 00:18:59.852 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:18:59.852 "is_configured": false, 00:18:59.852 "data_offset": 0, 00:18:59.852 "data_size": 65536 00:18:59.852 }, 00:18:59.852 { 00:18:59.852 "name": "BaseBdev3", 00:18:59.852 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:18:59.852 "is_configured": true, 00:18:59.852 "data_offset": 0, 00:18:59.852 "data_size": 65536 00:18:59.852 } 00:18:59.852 ] 00:18:59.852 }' 00:18:59.852 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.852 06:35:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.417 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.417 06:35:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:00.675 06:35:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:00.675 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:00.934 [2024-07-25 06:35:14.260727] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:00.934 BaseBdev1 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:00.934 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:01.192 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:01.192 [ 00:19:01.192 { 00:19:01.192 "name": "BaseBdev1", 00:19:01.192 "aliases": [ 00:19:01.192 "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8" 00:19:01.192 ], 00:19:01.192 "product_name": "Malloc disk", 00:19:01.192 "block_size": 512, 00:19:01.192 "num_blocks": 65536, 00:19:01.192 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:01.192 "assigned_rate_limits": { 00:19:01.192 "rw_ios_per_sec": 0, 00:19:01.192 "rw_mbytes_per_sec": 0, 00:19:01.192 "r_mbytes_per_sec": 0, 00:19:01.192 "w_mbytes_per_sec": 0 00:19:01.192 }, 00:19:01.192 "claimed": true, 00:19:01.192 "claim_type": "exclusive_write", 00:19:01.192 "zoned": false, 00:19:01.192 "supported_io_types": { 00:19:01.192 "read": true, 00:19:01.192 "write": true, 00:19:01.192 "unmap": true, 00:19:01.192 "flush": true, 00:19:01.192 "reset": true, 00:19:01.192 "nvme_admin": false, 00:19:01.192 "nvme_io": false, 00:19:01.192 "nvme_io_md": false, 00:19:01.192 "write_zeroes": true, 00:19:01.192 "zcopy": true, 00:19:01.192 "get_zone_info": false, 00:19:01.192 "zone_management": false, 00:19:01.192 "zone_append": false, 00:19:01.192 "compare": false, 00:19:01.192 "compare_and_write": false, 00:19:01.192 "abort": true, 00:19:01.192 "seek_hole": false, 00:19:01.192 "seek_data": false, 00:19:01.192 "copy": true, 00:19:01.192 "nvme_iov_md": false 00:19:01.192 }, 00:19:01.192 "memory_domains": [ 00:19:01.192 { 00:19:01.192 "dma_device_id": "system", 00:19:01.192 "dma_device_type": 1 00:19:01.192 }, 00:19:01.192 { 00:19:01.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.192 "dma_device_type": 2 00:19:01.192 } 00:19:01.192 ], 00:19:01.192 "driver_specific": {} 00:19:01.192 } 00:19:01.192 ] 00:19:01.192 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:01.192 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:01.192 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
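The remainder of the run cycles base bdevs out of and back into the configuring array and watches the matching slot in base_bdevs_list flip between configured and unconfigured. That pattern, reduced to the RPCs and jq filters used above and below (a sketch, same rpc.py path and socket assumed), is:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev2
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # false while the slot is empty
  $rpc -s $sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
  $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # true once re-attached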
00:19:01.192 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.193 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.451 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.451 "name": "Existed_Raid", 00:19:01.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.451 "strip_size_kb": 0, 00:19:01.451 "state": "configuring", 00:19:01.451 "raid_level": "raid1", 00:19:01.451 "superblock": false, 00:19:01.451 "num_base_bdevs": 3, 00:19:01.451 "num_base_bdevs_discovered": 2, 00:19:01.451 "num_base_bdevs_operational": 3, 00:19:01.451 "base_bdevs_list": [ 00:19:01.451 { 00:19:01.451 "name": "BaseBdev1", 00:19:01.451 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:01.451 "is_configured": true, 00:19:01.451 "data_offset": 0, 00:19:01.451 "data_size": 65536 00:19:01.451 }, 00:19:01.451 { 00:19:01.451 "name": null, 00:19:01.451 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:01.451 "is_configured": false, 00:19:01.451 "data_offset": 0, 00:19:01.451 "data_size": 65536 00:19:01.451 }, 00:19:01.451 { 00:19:01.451 "name": "BaseBdev3", 00:19:01.451 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:01.451 "is_configured": true, 00:19:01.451 "data_offset": 0, 00:19:01.451 "data_size": 65536 00:19:01.451 } 00:19:01.451 ] 00:19:01.451 }' 00:19:01.451 06:35:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.451 06:35:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.017 06:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.018 06:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:02.276 06:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:02.276 06:35:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:02.842 [2024-07-25 06:35:16.229942] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:02.842 
06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.842 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:03.102 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.102 "name": "Existed_Raid", 00:19:03.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.102 "strip_size_kb": 0, 00:19:03.102 "state": "configuring", 00:19:03.102 "raid_level": "raid1", 00:19:03.102 "superblock": false, 00:19:03.102 "num_base_bdevs": 3, 00:19:03.102 "num_base_bdevs_discovered": 1, 00:19:03.102 "num_base_bdevs_operational": 3, 00:19:03.102 "base_bdevs_list": [ 00:19:03.102 { 00:19:03.102 "name": "BaseBdev1", 00:19:03.102 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:03.102 "is_configured": true, 00:19:03.102 "data_offset": 0, 00:19:03.102 "data_size": 65536 00:19:03.102 }, 00:19:03.102 { 00:19:03.102 "name": null, 00:19:03.102 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:03.102 "is_configured": false, 00:19:03.102 "data_offset": 0, 00:19:03.102 "data_size": 65536 00:19:03.102 }, 00:19:03.102 { 00:19:03.102 "name": null, 00:19:03.102 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:03.102 "is_configured": false, 00:19:03.102 "data_offset": 0, 00:19:03.102 "data_size": 65536 00:19:03.102 } 00:19:03.102 ] 00:19:03.102 }' 00:19:03.102 06:35:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.102 06:35:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.724 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.724 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:03.983 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:03.983 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:04.241 [2024-07-25 06:35:17.774039] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:04.500 06:35:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.500 06:35:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.500 06:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.500 "name": "Existed_Raid", 00:19:04.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.500 "strip_size_kb": 0, 00:19:04.500 "state": "configuring", 00:19:04.500 "raid_level": "raid1", 00:19:04.500 "superblock": false, 00:19:04.500 "num_base_bdevs": 3, 00:19:04.500 "num_base_bdevs_discovered": 2, 00:19:04.500 "num_base_bdevs_operational": 3, 00:19:04.500 "base_bdevs_list": [ 00:19:04.500 { 00:19:04.500 "name": "BaseBdev1", 00:19:04.500 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:04.500 "is_configured": true, 00:19:04.500 "data_offset": 0, 00:19:04.500 "data_size": 65536 00:19:04.500 }, 00:19:04.500 { 00:19:04.500 "name": null, 00:19:04.500 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:04.500 "is_configured": false, 00:19:04.500 "data_offset": 0, 00:19:04.500 "data_size": 65536 00:19:04.500 }, 00:19:04.500 { 00:19:04.500 "name": "BaseBdev3", 00:19:04.500 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:04.500 "is_configured": true, 00:19:04.500 "data_offset": 0, 00:19:04.500 "data_size": 65536 00:19:04.500 } 00:19:04.500 ] 00:19:04.500 }' 00:19:04.500 06:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.500 06:35:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.067 06:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.067 06:35:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:05.633 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:05.633 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:05.891 [2024-07-25 
06:35:19.302076] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.891 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.892 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.892 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.892 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.150 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.150 "name": "Existed_Raid", 00:19:06.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.150 "strip_size_kb": 0, 00:19:06.150 "state": "configuring", 00:19:06.150 "raid_level": "raid1", 00:19:06.150 "superblock": false, 00:19:06.150 "num_base_bdevs": 3, 00:19:06.150 "num_base_bdevs_discovered": 1, 00:19:06.150 "num_base_bdevs_operational": 3, 00:19:06.150 "base_bdevs_list": [ 00:19:06.150 { 00:19:06.150 "name": null, 00:19:06.150 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:06.150 "is_configured": false, 00:19:06.150 "data_offset": 0, 00:19:06.150 "data_size": 65536 00:19:06.150 }, 00:19:06.150 { 00:19:06.150 "name": null, 00:19:06.150 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:06.150 "is_configured": false, 00:19:06.150 "data_offset": 0, 00:19:06.150 "data_size": 65536 00:19:06.150 }, 00:19:06.150 { 00:19:06.150 "name": "BaseBdev3", 00:19:06.150 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:06.150 "is_configured": true, 00:19:06.150 "data_offset": 0, 00:19:06.150 "data_size": 65536 00:19:06.150 } 00:19:06.150 ] 00:19:06.150 }' 00:19:06.150 06:35:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.150 06:35:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.716 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.716 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:06.974 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:06.974 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:07.541 [2024-07-25 06:35:20.836217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.541 06:35:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.799 06:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.799 "name": "Existed_Raid", 00:19:07.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.799 "strip_size_kb": 0, 00:19:07.799 "state": "configuring", 00:19:07.799 "raid_level": "raid1", 00:19:07.799 "superblock": false, 00:19:07.799 "num_base_bdevs": 3, 00:19:07.799 "num_base_bdevs_discovered": 2, 00:19:07.799 "num_base_bdevs_operational": 3, 00:19:07.799 "base_bdevs_list": [ 00:19:07.799 { 00:19:07.799 "name": null, 00:19:07.799 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:07.799 "is_configured": false, 00:19:07.799 "data_offset": 0, 00:19:07.799 "data_size": 65536 00:19:07.799 }, 00:19:07.799 { 00:19:07.799 "name": "BaseBdev2", 00:19:07.799 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:07.799 "is_configured": true, 00:19:07.799 "data_offset": 0, 00:19:07.799 "data_size": 65536 00:19:07.799 }, 00:19:07.799 { 00:19:07.799 "name": "BaseBdev3", 00:19:07.799 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:07.799 "is_configured": true, 00:19:07.799 "data_offset": 0, 00:19:07.799 "data_size": 65536 00:19:07.799 } 00:19:07.799 ] 00:19:07.799 }' 00:19:07.799 06:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.799 06:35:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.366 06:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.366 06:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:08.366 06:35:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:08.366 06:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.366 06:35:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:08.623 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8 00:19:08.881 [2024-07-25 06:35:22.327175] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:08.881 [2024-07-25 06:35:22.327207] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x274ec50 00:19:08.881 [2024-07-25 06:35:22.327215] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:08.881 [2024-07-25 06:35:22.327387] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2900870 00:19:08.881 [2024-07-25 06:35:22.327496] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x274ec50 00:19:08.881 [2024-07-25 06:35:22.327505] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x274ec50 00:19:08.881 [2024-07-25 06:35:22.327647] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.881 NewBaseBdev 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:08.881 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:09.140 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:09.398 [ 00:19:09.398 { 00:19:09.398 "name": "NewBaseBdev", 00:19:09.398 "aliases": [ 00:19:09.398 "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8" 00:19:09.398 ], 00:19:09.398 "product_name": "Malloc disk", 00:19:09.398 "block_size": 512, 00:19:09.398 "num_blocks": 65536, 00:19:09.398 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:09.398 "assigned_rate_limits": { 00:19:09.398 "rw_ios_per_sec": 0, 00:19:09.398 "rw_mbytes_per_sec": 0, 00:19:09.398 "r_mbytes_per_sec": 0, 00:19:09.398 "w_mbytes_per_sec": 0 00:19:09.398 }, 00:19:09.398 "claimed": true, 00:19:09.398 "claim_type": "exclusive_write", 00:19:09.398 "zoned": false, 00:19:09.398 "supported_io_types": { 00:19:09.398 "read": true, 00:19:09.398 "write": true, 00:19:09.398 "unmap": true, 00:19:09.398 "flush": true, 00:19:09.398 "reset": true, 00:19:09.398 "nvme_admin": false, 00:19:09.398 "nvme_io": false, 00:19:09.398 "nvme_io_md": false, 
00:19:09.398 "write_zeroes": true, 00:19:09.398 "zcopy": true, 00:19:09.398 "get_zone_info": false, 00:19:09.398 "zone_management": false, 00:19:09.398 "zone_append": false, 00:19:09.398 "compare": false, 00:19:09.398 "compare_and_write": false, 00:19:09.398 "abort": true, 00:19:09.398 "seek_hole": false, 00:19:09.398 "seek_data": false, 00:19:09.398 "copy": true, 00:19:09.398 "nvme_iov_md": false 00:19:09.398 }, 00:19:09.398 "memory_domains": [ 00:19:09.398 { 00:19:09.398 "dma_device_id": "system", 00:19:09.398 "dma_device_type": 1 00:19:09.398 }, 00:19:09.398 { 00:19:09.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.398 "dma_device_type": 2 00:19:09.398 } 00:19:09.398 ], 00:19:09.398 "driver_specific": {} 00:19:09.398 } 00:19:09.398 ] 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.398 06:35:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:09.657 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.657 "name": "Existed_Raid", 00:19:09.657 "uuid": "6f673b23-8f17-48ff-ae59-094ebdbaaac4", 00:19:09.657 "strip_size_kb": 0, 00:19:09.657 "state": "online", 00:19:09.657 "raid_level": "raid1", 00:19:09.657 "superblock": false, 00:19:09.657 "num_base_bdevs": 3, 00:19:09.657 "num_base_bdevs_discovered": 3, 00:19:09.657 "num_base_bdevs_operational": 3, 00:19:09.657 "base_bdevs_list": [ 00:19:09.657 { 00:19:09.657 "name": "NewBaseBdev", 00:19:09.657 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:09.657 "is_configured": true, 00:19:09.657 "data_offset": 0, 00:19:09.657 "data_size": 65536 00:19:09.657 }, 00:19:09.657 { 00:19:09.657 "name": "BaseBdev2", 00:19:09.657 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:09.657 "is_configured": true, 00:19:09.657 "data_offset": 0, 00:19:09.657 "data_size": 65536 00:19:09.657 }, 00:19:09.657 { 00:19:09.657 "name": "BaseBdev3", 00:19:09.657 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:09.657 "is_configured": true, 00:19:09.657 "data_offset": 0, 00:19:09.657 "data_size": 65536 00:19:09.657 } 00:19:09.657 ] 00:19:09.657 }' 
00:19:09.657 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.657 06:35:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.223 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:10.223 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:10.223 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:10.223 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:10.223 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:10.223 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:10.224 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:10.224 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:10.482 [2024-07-25 06:35:23.807472] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:10.482 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:10.482 "name": "Existed_Raid", 00:19:10.482 "aliases": [ 00:19:10.482 "6f673b23-8f17-48ff-ae59-094ebdbaaac4" 00:19:10.482 ], 00:19:10.482 "product_name": "Raid Volume", 00:19:10.482 "block_size": 512, 00:19:10.482 "num_blocks": 65536, 00:19:10.482 "uuid": "6f673b23-8f17-48ff-ae59-094ebdbaaac4", 00:19:10.482 "assigned_rate_limits": { 00:19:10.482 "rw_ios_per_sec": 0, 00:19:10.482 "rw_mbytes_per_sec": 0, 00:19:10.482 "r_mbytes_per_sec": 0, 00:19:10.482 "w_mbytes_per_sec": 0 00:19:10.482 }, 00:19:10.482 "claimed": false, 00:19:10.482 "zoned": false, 00:19:10.482 "supported_io_types": { 00:19:10.482 "read": true, 00:19:10.482 "write": true, 00:19:10.482 "unmap": false, 00:19:10.482 "flush": false, 00:19:10.482 "reset": true, 00:19:10.482 "nvme_admin": false, 00:19:10.482 "nvme_io": false, 00:19:10.482 "nvme_io_md": false, 00:19:10.482 "write_zeroes": true, 00:19:10.482 "zcopy": false, 00:19:10.482 "get_zone_info": false, 00:19:10.482 "zone_management": false, 00:19:10.482 "zone_append": false, 00:19:10.482 "compare": false, 00:19:10.482 "compare_and_write": false, 00:19:10.482 "abort": false, 00:19:10.482 "seek_hole": false, 00:19:10.482 "seek_data": false, 00:19:10.482 "copy": false, 00:19:10.482 "nvme_iov_md": false 00:19:10.482 }, 00:19:10.482 "memory_domains": [ 00:19:10.482 { 00:19:10.482 "dma_device_id": "system", 00:19:10.482 "dma_device_type": 1 00:19:10.482 }, 00:19:10.482 { 00:19:10.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.482 "dma_device_type": 2 00:19:10.482 }, 00:19:10.482 { 00:19:10.482 "dma_device_id": "system", 00:19:10.482 "dma_device_type": 1 00:19:10.482 }, 00:19:10.482 { 00:19:10.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.482 "dma_device_type": 2 00:19:10.482 }, 00:19:10.482 { 00:19:10.482 "dma_device_id": "system", 00:19:10.482 "dma_device_type": 1 00:19:10.482 }, 00:19:10.482 { 00:19:10.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.482 "dma_device_type": 2 00:19:10.482 } 00:19:10.482 ], 00:19:10.482 "driver_specific": { 00:19:10.482 "raid": { 00:19:10.482 "uuid": "6f673b23-8f17-48ff-ae59-094ebdbaaac4", 00:19:10.482 "strip_size_kb": 0, 00:19:10.482 
"state": "online", 00:19:10.482 "raid_level": "raid1", 00:19:10.482 "superblock": false, 00:19:10.482 "num_base_bdevs": 3, 00:19:10.482 "num_base_bdevs_discovered": 3, 00:19:10.482 "num_base_bdevs_operational": 3, 00:19:10.482 "base_bdevs_list": [ 00:19:10.482 { 00:19:10.482 "name": "NewBaseBdev", 00:19:10.482 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:10.482 "is_configured": true, 00:19:10.482 "data_offset": 0, 00:19:10.482 "data_size": 65536 00:19:10.482 }, 00:19:10.482 { 00:19:10.482 "name": "BaseBdev2", 00:19:10.482 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:10.482 "is_configured": true, 00:19:10.482 "data_offset": 0, 00:19:10.482 "data_size": 65536 00:19:10.482 }, 00:19:10.482 { 00:19:10.483 "name": "BaseBdev3", 00:19:10.483 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:10.483 "is_configured": true, 00:19:10.483 "data_offset": 0, 00:19:10.483 "data_size": 65536 00:19:10.483 } 00:19:10.483 ] 00:19:10.483 } 00:19:10.483 } 00:19:10.483 }' 00:19:10.483 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:10.483 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:10.483 BaseBdev2 00:19:10.483 BaseBdev3' 00:19:10.483 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.483 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:10.483 06:35:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.741 "name": "NewBaseBdev", 00:19:10.741 "aliases": [ 00:19:10.741 "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8" 00:19:10.741 ], 00:19:10.741 "product_name": "Malloc disk", 00:19:10.741 "block_size": 512, 00:19:10.741 "num_blocks": 65536, 00:19:10.741 "uuid": "66f0d1b0-1e3e-4148-b035-fcd2d35bd7a8", 00:19:10.741 "assigned_rate_limits": { 00:19:10.741 "rw_ios_per_sec": 0, 00:19:10.741 "rw_mbytes_per_sec": 0, 00:19:10.741 "r_mbytes_per_sec": 0, 00:19:10.741 "w_mbytes_per_sec": 0 00:19:10.741 }, 00:19:10.741 "claimed": true, 00:19:10.741 "claim_type": "exclusive_write", 00:19:10.741 "zoned": false, 00:19:10.741 "supported_io_types": { 00:19:10.741 "read": true, 00:19:10.741 "write": true, 00:19:10.741 "unmap": true, 00:19:10.741 "flush": true, 00:19:10.741 "reset": true, 00:19:10.741 "nvme_admin": false, 00:19:10.741 "nvme_io": false, 00:19:10.741 "nvme_io_md": false, 00:19:10.741 "write_zeroes": true, 00:19:10.741 "zcopy": true, 00:19:10.741 "get_zone_info": false, 00:19:10.741 "zone_management": false, 00:19:10.741 "zone_append": false, 00:19:10.741 "compare": false, 00:19:10.741 "compare_and_write": false, 00:19:10.741 "abort": true, 00:19:10.741 "seek_hole": false, 00:19:10.741 "seek_data": false, 00:19:10.741 "copy": true, 00:19:10.741 "nvme_iov_md": false 00:19:10.741 }, 00:19:10.741 "memory_domains": [ 00:19:10.741 { 00:19:10.741 "dma_device_id": "system", 00:19:10.741 "dma_device_type": 1 00:19:10.741 }, 00:19:10.741 { 00:19:10.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.741 "dma_device_type": 2 00:19:10.741 } 00:19:10.741 ], 00:19:10.741 "driver_specific": {} 00:19:10.741 }' 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.741 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:10.999 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.256 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.256 "name": "BaseBdev2", 00:19:11.256 "aliases": [ 00:19:11.256 "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456" 00:19:11.256 ], 00:19:11.256 "product_name": "Malloc disk", 00:19:11.256 "block_size": 512, 00:19:11.256 "num_blocks": 65536, 00:19:11.256 "uuid": "3e3b6eb5-3686-4f9d-9f2b-ea9a98809456", 00:19:11.256 "assigned_rate_limits": { 00:19:11.256 "rw_ios_per_sec": 0, 00:19:11.256 "rw_mbytes_per_sec": 0, 00:19:11.256 "r_mbytes_per_sec": 0, 00:19:11.256 "w_mbytes_per_sec": 0 00:19:11.256 }, 00:19:11.256 "claimed": true, 00:19:11.256 "claim_type": "exclusive_write", 00:19:11.256 "zoned": false, 00:19:11.256 "supported_io_types": { 00:19:11.256 "read": true, 00:19:11.256 "write": true, 00:19:11.256 "unmap": true, 00:19:11.256 "flush": true, 00:19:11.256 "reset": true, 00:19:11.256 "nvme_admin": false, 00:19:11.256 "nvme_io": false, 00:19:11.256 "nvme_io_md": false, 00:19:11.256 "write_zeroes": true, 00:19:11.256 "zcopy": true, 00:19:11.256 "get_zone_info": false, 00:19:11.256 "zone_management": false, 00:19:11.256 "zone_append": false, 00:19:11.256 "compare": false, 00:19:11.256 "compare_and_write": false, 00:19:11.256 "abort": true, 00:19:11.256 "seek_hole": false, 00:19:11.256 "seek_data": false, 00:19:11.256 "copy": true, 00:19:11.256 "nvme_iov_md": false 00:19:11.256 }, 00:19:11.256 "memory_domains": [ 00:19:11.256 { 00:19:11.256 "dma_device_id": "system", 00:19:11.256 "dma_device_type": 1 00:19:11.256 }, 00:19:11.256 { 00:19:11.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.256 "dma_device_type": 2 00:19:11.256 } 00:19:11.256 ], 00:19:11.256 "driver_specific": {} 00:19:11.256 }' 00:19:11.256 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.256 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.256 06:35:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.256 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.256 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.513 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:11.513 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.513 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.513 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.513 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.513 06:35:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.513 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.513 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.513 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:11.513 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.770 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.770 "name": "BaseBdev3", 00:19:11.770 "aliases": [ 00:19:11.770 "b636724b-a5a7-47f0-969f-3ab5e2827096" 00:19:11.770 ], 00:19:11.770 "product_name": "Malloc disk", 00:19:11.770 "block_size": 512, 00:19:11.770 "num_blocks": 65536, 00:19:11.770 "uuid": "b636724b-a5a7-47f0-969f-3ab5e2827096", 00:19:11.770 "assigned_rate_limits": { 00:19:11.770 "rw_ios_per_sec": 0, 00:19:11.770 "rw_mbytes_per_sec": 0, 00:19:11.770 "r_mbytes_per_sec": 0, 00:19:11.770 "w_mbytes_per_sec": 0 00:19:11.770 }, 00:19:11.771 "claimed": true, 00:19:11.771 "claim_type": "exclusive_write", 00:19:11.771 "zoned": false, 00:19:11.771 "supported_io_types": { 00:19:11.771 "read": true, 00:19:11.771 "write": true, 00:19:11.771 "unmap": true, 00:19:11.771 "flush": true, 00:19:11.771 "reset": true, 00:19:11.771 "nvme_admin": false, 00:19:11.771 "nvme_io": false, 00:19:11.771 "nvme_io_md": false, 00:19:11.771 "write_zeroes": true, 00:19:11.771 "zcopy": true, 00:19:11.771 "get_zone_info": false, 00:19:11.771 "zone_management": false, 00:19:11.771 "zone_append": false, 00:19:11.771 "compare": false, 00:19:11.771 "compare_and_write": false, 00:19:11.771 "abort": true, 00:19:11.771 "seek_hole": false, 00:19:11.771 "seek_data": false, 00:19:11.771 "copy": true, 00:19:11.771 "nvme_iov_md": false 00:19:11.771 }, 00:19:11.771 "memory_domains": [ 00:19:11.771 { 00:19:11.771 "dma_device_id": "system", 00:19:11.771 "dma_device_type": 1 00:19:11.771 }, 00:19:11.771 { 00:19:11.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.771 "dma_device_type": 2 00:19:11.771 } 00:19:11.771 ], 00:19:11.771 "driver_specific": {} 00:19:11.771 }' 00:19:11.771 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.771 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.771 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.771 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.028 06:35:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.028 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:12.287 [2024-07-25 06:35:25.780431] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:12.287 [2024-07-25 06:35:25.780454] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:12.287 [2024-07-25 06:35:25.780506] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:12.287 [2024-07-25 06:35:25.780735] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:12.287 [2024-07-25 06:35:25.780746] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274ec50 name Existed_Raid, state offline 00:19:12.287 06:35:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1150137 00:19:12.287 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1150137 ']' 00:19:12.287 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1150137 00:19:12.287 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:19:12.287 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:12.287 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1150137 00:19:12.545 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:12.545 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:12.545 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1150137' 00:19:12.545 killing process with pid 1150137 00:19:12.545 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1150137 00:19:12.545 [2024-07-25 06:35:25.856132] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:12.545 06:35:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1150137 00:19:12.545 [2024-07-25 06:35:25.880768] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:12.545 06:35:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:12.545 00:19:12.545 real 0m29.828s 00:19:12.545 user 0m54.846s 00:19:12.545 sys 0m5.177s 00:19:12.545 06:35:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:12.545 06:35:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.545 ************************************ 00:19:12.545 END TEST raid_state_function_test 00:19:12.545 ************************************ 00:19:12.804 06:35:26 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:19:12.804 06:35:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:19:12.804 06:35:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:12.804 06:35:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:12.804 ************************************ 00:19:12.804 START TEST raid_state_function_test_sb 00:19:12.804 ************************************ 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 true 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' 
true = true ']' 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1155768 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1155768' 00:19:12.804 Process raid pid: 1155768 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1155768 /var/tmp/spdk-raid.sock 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1155768 ']' 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:12.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:12.804 06:35:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:12.804 [2024-07-25 06:35:26.204271] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:19:12.805 [2024-07-25 06:35:26.204327] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.0 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.1 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.2 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.3 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.4 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.5 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.6 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:01.7 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.0 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.1 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested 
device 0000:3d:02.2 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.3 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.4 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.5 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.6 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3d:02.7 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.0 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.1 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.2 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.3 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.4 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.5 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.6 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:01.7 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.0 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.1 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.2 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.3 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.4 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.5 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.6 cannot be used 00:19:12.805 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:12.805 EAL: Requested device 0000:3f:02.7 cannot be used 00:19:12.805 [2024-07-25 06:35:26.342444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:13.063 [2024-07-25 06:35:26.386907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:13.063 [2024-07-25 06:35:26.449783] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:13.063 [2024-07-25 06:35:26.449818] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:13.628 06:35:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 
00:19:13.628 06:35:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:19:13.628 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:13.887 [2024-07-25 06:35:27.314178] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:13.887 [2024-07-25 06:35:27.314214] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:13.887 [2024-07-25 06:35:27.314225] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:13.887 [2024-07-25 06:35:27.314235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:13.887 [2024-07-25 06:35:27.314243] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:13.887 [2024-07-25 06:35:27.314254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.887 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.145 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.145 "name": "Existed_Raid", 00:19:14.145 "uuid": "11844323-fba8-43b4-af92-d84479e7dc4a", 00:19:14.145 "strip_size_kb": 0, 00:19:14.145 "state": "configuring", 00:19:14.145 "raid_level": "raid1", 00:19:14.145 "superblock": true, 00:19:14.145 "num_base_bdevs": 3, 00:19:14.145 "num_base_bdevs_discovered": 0, 00:19:14.145 "num_base_bdevs_operational": 3, 00:19:14.145 "base_bdevs_list": [ 00:19:14.145 { 00:19:14.145 "name": "BaseBdev1", 00:19:14.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.145 "is_configured": false, 00:19:14.145 "data_offset": 0, 00:19:14.145 "data_size": 0 00:19:14.145 }, 00:19:14.145 { 00:19:14.145 "name": "BaseBdev2", 00:19:14.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.145 "is_configured": false, 
00:19:14.145 "data_offset": 0, 00:19:14.145 "data_size": 0 00:19:14.145 }, 00:19:14.145 { 00:19:14.145 "name": "BaseBdev3", 00:19:14.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.145 "is_configured": false, 00:19:14.145 "data_offset": 0, 00:19:14.145 "data_size": 0 00:19:14.145 } 00:19:14.145 ] 00:19:14.145 }' 00:19:14.145 06:35:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.145 06:35:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:14.710 06:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:14.968 [2024-07-25 06:35:28.320698] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:14.968 [2024-07-25 06:35:28.320721] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20a9470 name Existed_Raid, state configuring 00:19:14.968 06:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:15.226 [2024-07-25 06:35:28.545305] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:15.226 [2024-07-25 06:35:28.545330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:15.226 [2024-07-25 06:35:28.545339] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:15.226 [2024-07-25 06:35:28.545349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:15.226 [2024-07-25 06:35:28.545357] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:15.226 [2024-07-25 06:35:28.545367] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:15.226 06:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:15.226 [2024-07-25 06:35:28.767189] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:15.226 BaseBdev1 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.484 06:35:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:19:15.743 [ 00:19:15.743 { 00:19:15.743 "name": "BaseBdev1", 00:19:15.743 "aliases": [ 00:19:15.743 "bfef999d-d022-4380-b058-2877a588ce0d" 00:19:15.743 ], 00:19:15.743 "product_name": "Malloc disk", 00:19:15.743 "block_size": 512, 00:19:15.743 "num_blocks": 65536, 00:19:15.743 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:15.743 "assigned_rate_limits": { 00:19:15.743 "rw_ios_per_sec": 0, 00:19:15.743 "rw_mbytes_per_sec": 0, 00:19:15.743 "r_mbytes_per_sec": 0, 00:19:15.743 "w_mbytes_per_sec": 0 00:19:15.743 }, 00:19:15.743 "claimed": true, 00:19:15.743 "claim_type": "exclusive_write", 00:19:15.743 "zoned": false, 00:19:15.743 "supported_io_types": { 00:19:15.743 "read": true, 00:19:15.743 "write": true, 00:19:15.743 "unmap": true, 00:19:15.743 "flush": true, 00:19:15.743 "reset": true, 00:19:15.743 "nvme_admin": false, 00:19:15.743 "nvme_io": false, 00:19:15.743 "nvme_io_md": false, 00:19:15.743 "write_zeroes": true, 00:19:15.743 "zcopy": true, 00:19:15.743 "get_zone_info": false, 00:19:15.743 "zone_management": false, 00:19:15.743 "zone_append": false, 00:19:15.743 "compare": false, 00:19:15.743 "compare_and_write": false, 00:19:15.743 "abort": true, 00:19:15.743 "seek_hole": false, 00:19:15.743 "seek_data": false, 00:19:15.743 "copy": true, 00:19:15.743 "nvme_iov_md": false 00:19:15.743 }, 00:19:15.743 "memory_domains": [ 00:19:15.743 { 00:19:15.743 "dma_device_id": "system", 00:19:15.743 "dma_device_type": 1 00:19:15.743 }, 00:19:15.743 { 00:19:15.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.743 "dma_device_type": 2 00:19:15.743 } 00:19:15.743 ], 00:19:15.743 "driver_specific": {} 00:19:15.743 } 00:19:15.743 ] 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.743 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:16.001 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.001 "name": "Existed_Raid", 00:19:16.001 "uuid": "be7350ae-6821-4c3a-9fb7-0e26e4cedc92", 00:19:16.001 "strip_size_kb": 0, 00:19:16.001 "state": 
"configuring", 00:19:16.001 "raid_level": "raid1", 00:19:16.001 "superblock": true, 00:19:16.001 "num_base_bdevs": 3, 00:19:16.001 "num_base_bdevs_discovered": 1, 00:19:16.001 "num_base_bdevs_operational": 3, 00:19:16.001 "base_bdevs_list": [ 00:19:16.001 { 00:19:16.001 "name": "BaseBdev1", 00:19:16.001 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:16.001 "is_configured": true, 00:19:16.001 "data_offset": 2048, 00:19:16.001 "data_size": 63488 00:19:16.001 }, 00:19:16.001 { 00:19:16.001 "name": "BaseBdev2", 00:19:16.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.001 "is_configured": false, 00:19:16.001 "data_offset": 0, 00:19:16.001 "data_size": 0 00:19:16.001 }, 00:19:16.001 { 00:19:16.001 "name": "BaseBdev3", 00:19:16.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.001 "is_configured": false, 00:19:16.001 "data_offset": 0, 00:19:16.001 "data_size": 0 00:19:16.001 } 00:19:16.001 ] 00:19:16.001 }' 00:19:16.001 06:35:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.001 06:35:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.566 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:16.824 [2024-07-25 06:35:30.247174] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:16.824 [2024-07-25 06:35:30.247211] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20a8ce0 name Existed_Raid, state configuring 00:19:16.824 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:17.082 [2024-07-25 06:35:30.459760] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:17.082 [2024-07-25 06:35:30.461135] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:17.082 [2024-07-25 06:35:30.461175] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:17.082 [2024-07-25 06:35:30.461184] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:17.082 [2024-07-25 06:35:30.461195] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:17.082 06:35:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.082 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.340 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.340 "name": "Existed_Raid", 00:19:17.340 "uuid": "9dfe4da9-213b-4101-ac1b-4f954e313d83", 00:19:17.340 "strip_size_kb": 0, 00:19:17.340 "state": "configuring", 00:19:17.340 "raid_level": "raid1", 00:19:17.340 "superblock": true, 00:19:17.340 "num_base_bdevs": 3, 00:19:17.340 "num_base_bdevs_discovered": 1, 00:19:17.340 "num_base_bdevs_operational": 3, 00:19:17.340 "base_bdevs_list": [ 00:19:17.340 { 00:19:17.340 "name": "BaseBdev1", 00:19:17.340 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:17.340 "is_configured": true, 00:19:17.340 "data_offset": 2048, 00:19:17.340 "data_size": 63488 00:19:17.340 }, 00:19:17.340 { 00:19:17.340 "name": "BaseBdev2", 00:19:17.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.340 "is_configured": false, 00:19:17.340 "data_offset": 0, 00:19:17.340 "data_size": 0 00:19:17.340 }, 00:19:17.340 { 00:19:17.340 "name": "BaseBdev3", 00:19:17.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.340 "is_configured": false, 00:19:17.340 "data_offset": 0, 00:19:17.340 "data_size": 0 00:19:17.340 } 00:19:17.340 ] 00:19:17.340 }' 00:19:17.340 06:35:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.340 06:35:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:17.927 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:18.184 [2024-07-25 06:35:31.509609] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:18.184 BaseBdev2 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:18.184 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.441 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:18.441 [ 00:19:18.441 { 00:19:18.441 "name": "BaseBdev2", 00:19:18.441 "aliases": [ 00:19:18.441 "aa374e04-3367-427a-864a-941509737345" 00:19:18.441 ], 00:19:18.441 "product_name": "Malloc disk", 00:19:18.441 "block_size": 512, 00:19:18.441 "num_blocks": 65536, 00:19:18.441 "uuid": "aa374e04-3367-427a-864a-941509737345", 00:19:18.441 "assigned_rate_limits": { 00:19:18.441 "rw_ios_per_sec": 0, 00:19:18.441 "rw_mbytes_per_sec": 0, 00:19:18.441 "r_mbytes_per_sec": 0, 00:19:18.441 "w_mbytes_per_sec": 0 00:19:18.441 }, 00:19:18.441 "claimed": true, 00:19:18.441 "claim_type": "exclusive_write", 00:19:18.441 "zoned": false, 00:19:18.441 "supported_io_types": { 00:19:18.441 "read": true, 00:19:18.441 "write": true, 00:19:18.441 "unmap": true, 00:19:18.441 "flush": true, 00:19:18.441 "reset": true, 00:19:18.441 "nvme_admin": false, 00:19:18.441 "nvme_io": false, 00:19:18.441 "nvme_io_md": false, 00:19:18.441 "write_zeroes": true, 00:19:18.441 "zcopy": true, 00:19:18.441 "get_zone_info": false, 00:19:18.441 "zone_management": false, 00:19:18.441 "zone_append": false, 00:19:18.441 "compare": false, 00:19:18.441 "compare_and_write": false, 00:19:18.441 "abort": true, 00:19:18.441 "seek_hole": false, 00:19:18.441 "seek_data": false, 00:19:18.441 "copy": true, 00:19:18.441 "nvme_iov_md": false 00:19:18.441 }, 00:19:18.441 "memory_domains": [ 00:19:18.441 { 00:19:18.441 "dma_device_id": "system", 00:19:18.441 "dma_device_type": 1 00:19:18.441 }, 00:19:18.441 { 00:19:18.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.441 "dma_device_type": 2 00:19:18.441 } 00:19:18.441 ], 00:19:18.442 "driver_specific": {} 00:19:18.442 } 00:19:18.442 ] 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.442 06:35:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.442 06:35:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.699 06:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.699 "name": "Existed_Raid", 00:19:18.699 "uuid": "9dfe4da9-213b-4101-ac1b-4f954e313d83", 00:19:18.699 "strip_size_kb": 0, 00:19:18.699 "state": "configuring", 00:19:18.699 "raid_level": "raid1", 00:19:18.699 "superblock": true, 00:19:18.699 "num_base_bdevs": 3, 00:19:18.699 "num_base_bdevs_discovered": 2, 00:19:18.699 "num_base_bdevs_operational": 3, 00:19:18.699 "base_bdevs_list": [ 00:19:18.699 { 00:19:18.699 "name": "BaseBdev1", 00:19:18.699 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:18.699 "is_configured": true, 00:19:18.699 "data_offset": 2048, 00:19:18.699 "data_size": 63488 00:19:18.699 }, 00:19:18.699 { 00:19:18.699 "name": "BaseBdev2", 00:19:18.699 "uuid": "aa374e04-3367-427a-864a-941509737345", 00:19:18.699 "is_configured": true, 00:19:18.699 "data_offset": 2048, 00:19:18.699 "data_size": 63488 00:19:18.699 }, 00:19:18.699 { 00:19:18.699 "name": "BaseBdev3", 00:19:18.699 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.699 "is_configured": false, 00:19:18.699 "data_offset": 0, 00:19:18.699 "data_size": 0 00:19:18.699 } 00:19:18.699 ] 00:19:18.699 }' 00:19:18.699 06:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.699 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.262 06:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:19.519 [2024-07-25 06:35:32.948570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:19.519 [2024-07-25 06:35:32.948709] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x225c380 00:19:19.519 [2024-07-25 06:35:32.948721] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:19.519 [2024-07-25 06:35:32.948880] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a8330 00:19:19.519 [2024-07-25 06:35:32.948996] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x225c380 00:19:19.519 [2024-07-25 06:35:32.949006] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x225c380 00:19:19.519 [2024-07-25 06:35:32.949088] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:19.519 BaseBdev3 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:19.519 06:35:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:19:19.776 06:35:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:20.032 [ 00:19:20.032 { 00:19:20.032 "name": "BaseBdev3", 00:19:20.032 "aliases": [ 00:19:20.032 "499a3ecb-a172-473c-94d5-fdfbdb537b62" 00:19:20.032 ], 00:19:20.032 "product_name": "Malloc disk", 00:19:20.032 "block_size": 512, 00:19:20.032 "num_blocks": 65536, 00:19:20.032 "uuid": "499a3ecb-a172-473c-94d5-fdfbdb537b62", 00:19:20.032 "assigned_rate_limits": { 00:19:20.032 "rw_ios_per_sec": 0, 00:19:20.032 "rw_mbytes_per_sec": 0, 00:19:20.032 "r_mbytes_per_sec": 0, 00:19:20.032 "w_mbytes_per_sec": 0 00:19:20.032 }, 00:19:20.032 "claimed": true, 00:19:20.032 "claim_type": "exclusive_write", 00:19:20.032 "zoned": false, 00:19:20.032 "supported_io_types": { 00:19:20.032 "read": true, 00:19:20.032 "write": true, 00:19:20.032 "unmap": true, 00:19:20.032 "flush": true, 00:19:20.032 "reset": true, 00:19:20.032 "nvme_admin": false, 00:19:20.032 "nvme_io": false, 00:19:20.032 "nvme_io_md": false, 00:19:20.032 "write_zeroes": true, 00:19:20.032 "zcopy": true, 00:19:20.032 "get_zone_info": false, 00:19:20.032 "zone_management": false, 00:19:20.032 "zone_append": false, 00:19:20.032 "compare": false, 00:19:20.032 "compare_and_write": false, 00:19:20.032 "abort": true, 00:19:20.032 "seek_hole": false, 00:19:20.032 "seek_data": false, 00:19:20.032 "copy": true, 00:19:20.032 "nvme_iov_md": false 00:19:20.032 }, 00:19:20.032 "memory_domains": [ 00:19:20.032 { 00:19:20.032 "dma_device_id": "system", 00:19:20.032 "dma_device_type": 1 00:19:20.032 }, 00:19:20.032 { 00:19:20.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.032 "dma_device_type": 2 00:19:20.032 } 00:19:20.032 ], 00:19:20.032 "driver_specific": {} 00:19:20.032 } 00:19:20.032 ] 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.032 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.032 "name": "Existed_Raid", 00:19:20.032 "uuid": "9dfe4da9-213b-4101-ac1b-4f954e313d83", 00:19:20.032 "strip_size_kb": 0, 00:19:20.032 "state": "online", 00:19:20.032 "raid_level": "raid1", 00:19:20.032 "superblock": true, 00:19:20.032 "num_base_bdevs": 3, 00:19:20.032 "num_base_bdevs_discovered": 3, 00:19:20.032 "num_base_bdevs_operational": 3, 00:19:20.032 "base_bdevs_list": [ 00:19:20.032 { 00:19:20.032 "name": "BaseBdev1", 00:19:20.032 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:20.032 "is_configured": true, 00:19:20.032 "data_offset": 2048, 00:19:20.032 "data_size": 63488 00:19:20.032 }, 00:19:20.032 { 00:19:20.032 "name": "BaseBdev2", 00:19:20.032 "uuid": "aa374e04-3367-427a-864a-941509737345", 00:19:20.032 "is_configured": true, 00:19:20.032 "data_offset": 2048, 00:19:20.032 "data_size": 63488 00:19:20.033 }, 00:19:20.033 { 00:19:20.033 "name": "BaseBdev3", 00:19:20.033 "uuid": "499a3ecb-a172-473c-94d5-fdfbdb537b62", 00:19:20.033 "is_configured": true, 00:19:20.033 "data_offset": 2048, 00:19:20.033 "data_size": 63488 00:19:20.033 } 00:19:20.033 ] 00:19:20.033 }' 00:19:20.033 06:35:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.033 06:35:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:20.595 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:20.852 [2024-07-25 06:35:34.312455] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:20.852 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:20.852 "name": "Existed_Raid", 00:19:20.852 "aliases": [ 00:19:20.852 "9dfe4da9-213b-4101-ac1b-4f954e313d83" 00:19:20.852 ], 00:19:20.852 "product_name": "Raid Volume", 00:19:20.852 "block_size": 512, 00:19:20.852 "num_blocks": 63488, 00:19:20.852 "uuid": "9dfe4da9-213b-4101-ac1b-4f954e313d83", 00:19:20.852 "assigned_rate_limits": { 00:19:20.852 "rw_ios_per_sec": 0, 00:19:20.852 "rw_mbytes_per_sec": 0, 00:19:20.852 "r_mbytes_per_sec": 0, 00:19:20.852 "w_mbytes_per_sec": 0 00:19:20.852 }, 00:19:20.852 "claimed": false, 00:19:20.852 "zoned": false, 00:19:20.852 "supported_io_types": { 00:19:20.852 "read": true, 00:19:20.852 "write": true, 
00:19:20.852 "unmap": false, 00:19:20.852 "flush": false, 00:19:20.852 "reset": true, 00:19:20.852 "nvme_admin": false, 00:19:20.852 "nvme_io": false, 00:19:20.852 "nvme_io_md": false, 00:19:20.852 "write_zeroes": true, 00:19:20.852 "zcopy": false, 00:19:20.852 "get_zone_info": false, 00:19:20.852 "zone_management": false, 00:19:20.852 "zone_append": false, 00:19:20.852 "compare": false, 00:19:20.852 "compare_and_write": false, 00:19:20.852 "abort": false, 00:19:20.852 "seek_hole": false, 00:19:20.852 "seek_data": false, 00:19:20.852 "copy": false, 00:19:20.852 "nvme_iov_md": false 00:19:20.852 }, 00:19:20.852 "memory_domains": [ 00:19:20.852 { 00:19:20.852 "dma_device_id": "system", 00:19:20.852 "dma_device_type": 1 00:19:20.852 }, 00:19:20.852 { 00:19:20.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.852 "dma_device_type": 2 00:19:20.852 }, 00:19:20.852 { 00:19:20.852 "dma_device_id": "system", 00:19:20.852 "dma_device_type": 1 00:19:20.852 }, 00:19:20.852 { 00:19:20.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.852 "dma_device_type": 2 00:19:20.852 }, 00:19:20.852 { 00:19:20.852 "dma_device_id": "system", 00:19:20.852 "dma_device_type": 1 00:19:20.852 }, 00:19:20.852 { 00:19:20.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.853 "dma_device_type": 2 00:19:20.853 } 00:19:20.853 ], 00:19:20.853 "driver_specific": { 00:19:20.853 "raid": { 00:19:20.853 "uuid": "9dfe4da9-213b-4101-ac1b-4f954e313d83", 00:19:20.853 "strip_size_kb": 0, 00:19:20.853 "state": "online", 00:19:20.853 "raid_level": "raid1", 00:19:20.853 "superblock": true, 00:19:20.853 "num_base_bdevs": 3, 00:19:20.853 "num_base_bdevs_discovered": 3, 00:19:20.853 "num_base_bdevs_operational": 3, 00:19:20.853 "base_bdevs_list": [ 00:19:20.853 { 00:19:20.853 "name": "BaseBdev1", 00:19:20.853 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:20.853 "is_configured": true, 00:19:20.853 "data_offset": 2048, 00:19:20.853 "data_size": 63488 00:19:20.853 }, 00:19:20.853 { 00:19:20.853 "name": "BaseBdev2", 00:19:20.853 "uuid": "aa374e04-3367-427a-864a-941509737345", 00:19:20.853 "is_configured": true, 00:19:20.853 "data_offset": 2048, 00:19:20.853 "data_size": 63488 00:19:20.853 }, 00:19:20.853 { 00:19:20.853 "name": "BaseBdev3", 00:19:20.853 "uuid": "499a3ecb-a172-473c-94d5-fdfbdb537b62", 00:19:20.853 "is_configured": true, 00:19:20.853 "data_offset": 2048, 00:19:20.853 "data_size": 63488 00:19:20.853 } 00:19:20.853 ] 00:19:20.853 } 00:19:20.853 } 00:19:20.853 }' 00:19:20.853 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:20.853 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:20.853 BaseBdev2 00:19:20.853 BaseBdev3' 00:19:20.853 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:20.853 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:20.853 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.110 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:21.110 "name": "BaseBdev1", 00:19:21.110 "aliases": [ 00:19:21.110 "bfef999d-d022-4380-b058-2877a588ce0d" 00:19:21.110 ], 00:19:21.110 "product_name": "Malloc disk", 00:19:21.110 
"block_size": 512, 00:19:21.110 "num_blocks": 65536, 00:19:21.110 "uuid": "bfef999d-d022-4380-b058-2877a588ce0d", 00:19:21.110 "assigned_rate_limits": { 00:19:21.110 "rw_ios_per_sec": 0, 00:19:21.110 "rw_mbytes_per_sec": 0, 00:19:21.110 "r_mbytes_per_sec": 0, 00:19:21.110 "w_mbytes_per_sec": 0 00:19:21.110 }, 00:19:21.110 "claimed": true, 00:19:21.110 "claim_type": "exclusive_write", 00:19:21.110 "zoned": false, 00:19:21.110 "supported_io_types": { 00:19:21.110 "read": true, 00:19:21.110 "write": true, 00:19:21.110 "unmap": true, 00:19:21.110 "flush": true, 00:19:21.110 "reset": true, 00:19:21.110 "nvme_admin": false, 00:19:21.110 "nvme_io": false, 00:19:21.110 "nvme_io_md": false, 00:19:21.110 "write_zeroes": true, 00:19:21.110 "zcopy": true, 00:19:21.110 "get_zone_info": false, 00:19:21.110 "zone_management": false, 00:19:21.110 "zone_append": false, 00:19:21.110 "compare": false, 00:19:21.110 "compare_and_write": false, 00:19:21.110 "abort": true, 00:19:21.110 "seek_hole": false, 00:19:21.110 "seek_data": false, 00:19:21.110 "copy": true, 00:19:21.110 "nvme_iov_md": false 00:19:21.110 }, 00:19:21.110 "memory_domains": [ 00:19:21.110 { 00:19:21.110 "dma_device_id": "system", 00:19:21.110 "dma_device_type": 1 00:19:21.110 }, 00:19:21.110 { 00:19:21.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.110 "dma_device_type": 2 00:19:21.110 } 00:19:21.110 ], 00:19:21.110 "driver_specific": {} 00:19:21.110 }' 00:19:21.110 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.110 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.367 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.624 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.624 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:21.624 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:21.624 06:35:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.881 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:21.881 "name": "BaseBdev2", 00:19:21.881 "aliases": [ 00:19:21.881 "aa374e04-3367-427a-864a-941509737345" 00:19:21.881 ], 00:19:21.881 "product_name": "Malloc disk", 00:19:21.881 "block_size": 512, 00:19:21.881 "num_blocks": 65536, 00:19:21.882 "uuid": "aa374e04-3367-427a-864a-941509737345", 00:19:21.882 
"assigned_rate_limits": { 00:19:21.882 "rw_ios_per_sec": 0, 00:19:21.882 "rw_mbytes_per_sec": 0, 00:19:21.882 "r_mbytes_per_sec": 0, 00:19:21.882 "w_mbytes_per_sec": 0 00:19:21.882 }, 00:19:21.882 "claimed": true, 00:19:21.882 "claim_type": "exclusive_write", 00:19:21.882 "zoned": false, 00:19:21.882 "supported_io_types": { 00:19:21.882 "read": true, 00:19:21.882 "write": true, 00:19:21.882 "unmap": true, 00:19:21.882 "flush": true, 00:19:21.882 "reset": true, 00:19:21.882 "nvme_admin": false, 00:19:21.882 "nvme_io": false, 00:19:21.882 "nvme_io_md": false, 00:19:21.882 "write_zeroes": true, 00:19:21.882 "zcopy": true, 00:19:21.882 "get_zone_info": false, 00:19:21.882 "zone_management": false, 00:19:21.882 "zone_append": false, 00:19:21.882 "compare": false, 00:19:21.882 "compare_and_write": false, 00:19:21.882 "abort": true, 00:19:21.882 "seek_hole": false, 00:19:21.882 "seek_data": false, 00:19:21.882 "copy": true, 00:19:21.882 "nvme_iov_md": false 00:19:21.882 }, 00:19:21.882 "memory_domains": [ 00:19:21.882 { 00:19:21.882 "dma_device_id": "system", 00:19:21.882 "dma_device_type": 1 00:19:21.882 }, 00:19:21.882 { 00:19:21.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.882 "dma_device_type": 2 00:19:21.882 } 00:19:21.882 ], 00:19:21.882 "driver_specific": {} 00:19:21.882 }' 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.882 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.139 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.139 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:22.139 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.139 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:22.139 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:22.396 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:22.396 "name": "BaseBdev3", 00:19:22.396 "aliases": [ 00:19:22.396 "499a3ecb-a172-473c-94d5-fdfbdb537b62" 00:19:22.396 ], 00:19:22.397 "product_name": "Malloc disk", 00:19:22.397 "block_size": 512, 00:19:22.397 "num_blocks": 65536, 00:19:22.397 "uuid": "499a3ecb-a172-473c-94d5-fdfbdb537b62", 00:19:22.397 "assigned_rate_limits": { 00:19:22.397 "rw_ios_per_sec": 0, 00:19:22.397 "rw_mbytes_per_sec": 0, 00:19:22.397 "r_mbytes_per_sec": 0, 
00:19:22.397 "w_mbytes_per_sec": 0 00:19:22.397 }, 00:19:22.397 "claimed": true, 00:19:22.397 "claim_type": "exclusive_write", 00:19:22.397 "zoned": false, 00:19:22.397 "supported_io_types": { 00:19:22.397 "read": true, 00:19:22.397 "write": true, 00:19:22.397 "unmap": true, 00:19:22.397 "flush": true, 00:19:22.397 "reset": true, 00:19:22.397 "nvme_admin": false, 00:19:22.397 "nvme_io": false, 00:19:22.397 "nvme_io_md": false, 00:19:22.397 "write_zeroes": true, 00:19:22.397 "zcopy": true, 00:19:22.397 "get_zone_info": false, 00:19:22.397 "zone_management": false, 00:19:22.397 "zone_append": false, 00:19:22.397 "compare": false, 00:19:22.397 "compare_and_write": false, 00:19:22.397 "abort": true, 00:19:22.397 "seek_hole": false, 00:19:22.397 "seek_data": false, 00:19:22.397 "copy": true, 00:19:22.397 "nvme_iov_md": false 00:19:22.397 }, 00:19:22.397 "memory_domains": [ 00:19:22.397 { 00:19:22.397 "dma_device_id": "system", 00:19:22.397 "dma_device_type": 1 00:19:22.397 }, 00:19:22.397 { 00:19:22.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.397 "dma_device_type": 2 00:19:22.397 } 00:19:22.397 ], 00:19:22.397 "driver_specific": {} 00:19:22.397 }' 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:22.397 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.654 06:35:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.654 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:22.654 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.654 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.654 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:22.654 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:22.911 [2024-07-25 06:35:36.309501] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.911 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.168 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.168 "name": "Existed_Raid", 00:19:23.168 "uuid": "9dfe4da9-213b-4101-ac1b-4f954e313d83", 00:19:23.168 "strip_size_kb": 0, 00:19:23.168 "state": "online", 00:19:23.168 "raid_level": "raid1", 00:19:23.168 "superblock": true, 00:19:23.168 "num_base_bdevs": 3, 00:19:23.168 "num_base_bdevs_discovered": 2, 00:19:23.168 "num_base_bdevs_operational": 2, 00:19:23.168 "base_bdevs_list": [ 00:19:23.168 { 00:19:23.168 "name": null, 00:19:23.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.168 "is_configured": false, 00:19:23.168 "data_offset": 2048, 00:19:23.168 "data_size": 63488 00:19:23.168 }, 00:19:23.168 { 00:19:23.168 "name": "BaseBdev2", 00:19:23.168 "uuid": "aa374e04-3367-427a-864a-941509737345", 00:19:23.168 "is_configured": true, 00:19:23.169 "data_offset": 2048, 00:19:23.169 "data_size": 63488 00:19:23.169 }, 00:19:23.169 { 00:19:23.169 "name": "BaseBdev3", 00:19:23.169 "uuid": "499a3ecb-a172-473c-94d5-fdfbdb537b62", 00:19:23.169 "is_configured": true, 00:19:23.169 "data_offset": 2048, 00:19:23.169 "data_size": 63488 00:19:23.169 } 00:19:23.169 ] 00:19:23.169 }' 00:19:23.169 06:35:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.169 06:35:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.733 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:23.733 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:23.733 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.733 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:23.991 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:23.991 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:23.991 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:24.247 [2024-07-25 06:35:37.557773] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:24.247 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:24.247 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:24.247 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:24.247 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.505 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:24.505 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:24.505 06:35:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:24.505 [2024-07-25 06:35:38.025042] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:24.505 [2024-07-25 06:35:38.025122] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:24.505 [2024-07-25 06:35:38.035362] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:24.505 [2024-07-25 06:35:38.035393] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:24.505 [2024-07-25 06:35:38.035403] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225c380 name Existed_Raid, state offline 00:19:24.505 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:24.505 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:24.505 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.505 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:24.762 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:24.762 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:24.762 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:19:24.762 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:24.762 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:24.762 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:25.021 BaseBdev2 00:19:25.021 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:25.021 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:25.021 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:25.021 
06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:25.021 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:25.021 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:25.021 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:25.278 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:25.533 [ 00:19:25.533 { 00:19:25.533 "name": "BaseBdev2", 00:19:25.533 "aliases": [ 00:19:25.533 "dcbc162a-397c-472a-85e2-87a728f19fd8" 00:19:25.533 ], 00:19:25.533 "product_name": "Malloc disk", 00:19:25.533 "block_size": 512, 00:19:25.533 "num_blocks": 65536, 00:19:25.533 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:25.533 "assigned_rate_limits": { 00:19:25.533 "rw_ios_per_sec": 0, 00:19:25.533 "rw_mbytes_per_sec": 0, 00:19:25.533 "r_mbytes_per_sec": 0, 00:19:25.533 "w_mbytes_per_sec": 0 00:19:25.533 }, 00:19:25.533 "claimed": false, 00:19:25.533 "zoned": false, 00:19:25.533 "supported_io_types": { 00:19:25.533 "read": true, 00:19:25.533 "write": true, 00:19:25.533 "unmap": true, 00:19:25.533 "flush": true, 00:19:25.533 "reset": true, 00:19:25.533 "nvme_admin": false, 00:19:25.533 "nvme_io": false, 00:19:25.533 "nvme_io_md": false, 00:19:25.533 "write_zeroes": true, 00:19:25.533 "zcopy": true, 00:19:25.533 "get_zone_info": false, 00:19:25.533 "zone_management": false, 00:19:25.533 "zone_append": false, 00:19:25.533 "compare": false, 00:19:25.533 "compare_and_write": false, 00:19:25.533 "abort": true, 00:19:25.533 "seek_hole": false, 00:19:25.533 "seek_data": false, 00:19:25.533 "copy": true, 00:19:25.533 "nvme_iov_md": false 00:19:25.533 }, 00:19:25.533 "memory_domains": [ 00:19:25.533 { 00:19:25.533 "dma_device_id": "system", 00:19:25.533 "dma_device_type": 1 00:19:25.533 }, 00:19:25.533 { 00:19:25.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.533 "dma_device_type": 2 00:19:25.533 } 00:19:25.533 ], 00:19:25.533 "driver_specific": {} 00:19:25.533 } 00:19:25.533 ] 00:19:25.533 06:35:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:25.533 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:25.533 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:25.533 06:35:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:25.788 BaseBdev3 00:19:25.788 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:25.788 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:25.788 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:25.788 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:25.788 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:25.788 06:35:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:25.788 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:26.044 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:26.301 [ 00:19:26.301 { 00:19:26.301 "name": "BaseBdev3", 00:19:26.301 "aliases": [ 00:19:26.301 "e7edbc8a-1270-42aa-b0a1-68d125b4ea94" 00:19:26.301 ], 00:19:26.301 "product_name": "Malloc disk", 00:19:26.301 "block_size": 512, 00:19:26.301 "num_blocks": 65536, 00:19:26.301 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:26.301 "assigned_rate_limits": { 00:19:26.301 "rw_ios_per_sec": 0, 00:19:26.301 "rw_mbytes_per_sec": 0, 00:19:26.301 "r_mbytes_per_sec": 0, 00:19:26.301 "w_mbytes_per_sec": 0 00:19:26.301 }, 00:19:26.301 "claimed": false, 00:19:26.301 "zoned": false, 00:19:26.301 "supported_io_types": { 00:19:26.301 "read": true, 00:19:26.301 "write": true, 00:19:26.301 "unmap": true, 00:19:26.301 "flush": true, 00:19:26.301 "reset": true, 00:19:26.301 "nvme_admin": false, 00:19:26.301 "nvme_io": false, 00:19:26.301 "nvme_io_md": false, 00:19:26.301 "write_zeroes": true, 00:19:26.301 "zcopy": true, 00:19:26.301 "get_zone_info": false, 00:19:26.301 "zone_management": false, 00:19:26.301 "zone_append": false, 00:19:26.301 "compare": false, 00:19:26.301 "compare_and_write": false, 00:19:26.301 "abort": true, 00:19:26.301 "seek_hole": false, 00:19:26.301 "seek_data": false, 00:19:26.301 "copy": true, 00:19:26.301 "nvme_iov_md": false 00:19:26.301 }, 00:19:26.301 "memory_domains": [ 00:19:26.301 { 00:19:26.302 "dma_device_id": "system", 00:19:26.302 "dma_device_type": 1 00:19:26.302 }, 00:19:26.302 { 00:19:26.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.302 "dma_device_type": 2 00:19:26.302 } 00:19:26.302 ], 00:19:26.302 "driver_specific": {} 00:19:26.302 } 00:19:26.302 ] 00:19:26.302 06:35:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:26.302 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:26.302 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:26.302 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:26.302 [2024-07-25 06:35:39.856641] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:26.302 [2024-07-25 06:35:39.856676] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:26.302 [2024-07-25 06:35:39.856693] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.559 [2024-07-25 06:35:39.857886] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.559 06:35:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.559 06:35:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:26.559 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.559 "name": "Existed_Raid", 00:19:26.559 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:26.559 "strip_size_kb": 0, 00:19:26.559 "state": "configuring", 00:19:26.559 "raid_level": "raid1", 00:19:26.559 "superblock": true, 00:19:26.559 "num_base_bdevs": 3, 00:19:26.559 "num_base_bdevs_discovered": 2, 00:19:26.559 "num_base_bdevs_operational": 3, 00:19:26.559 "base_bdevs_list": [ 00:19:26.559 { 00:19:26.559 "name": "BaseBdev1", 00:19:26.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:26.559 "is_configured": false, 00:19:26.559 "data_offset": 0, 00:19:26.559 "data_size": 0 00:19:26.559 }, 00:19:26.559 { 00:19:26.559 "name": "BaseBdev2", 00:19:26.559 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:26.559 "is_configured": true, 00:19:26.559 "data_offset": 2048, 00:19:26.559 "data_size": 63488 00:19:26.559 }, 00:19:26.559 { 00:19:26.559 "name": "BaseBdev3", 00:19:26.559 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:26.559 "is_configured": true, 00:19:26.559 "data_offset": 2048, 00:19:26.559 "data_size": 63488 00:19:26.559 } 00:19:26.559 ] 00:19:26.559 }' 00:19:26.559 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.559 06:35:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:27.123 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:27.380 [2024-07-25 06:35:40.887386] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.380 06:35:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.638 06:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.638 "name": "Existed_Raid", 00:19:27.638 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:27.638 "strip_size_kb": 0, 00:19:27.638 "state": "configuring", 00:19:27.638 "raid_level": "raid1", 00:19:27.638 "superblock": true, 00:19:27.638 "num_base_bdevs": 3, 00:19:27.638 "num_base_bdevs_discovered": 1, 00:19:27.638 "num_base_bdevs_operational": 3, 00:19:27.638 "base_bdevs_list": [ 00:19:27.638 { 00:19:27.638 "name": "BaseBdev1", 00:19:27.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.638 "is_configured": false, 00:19:27.638 "data_offset": 0, 00:19:27.638 "data_size": 0 00:19:27.638 }, 00:19:27.638 { 00:19:27.638 "name": null, 00:19:27.638 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:27.638 "is_configured": false, 00:19:27.638 "data_offset": 2048, 00:19:27.638 "data_size": 63488 00:19:27.638 }, 00:19:27.638 { 00:19:27.638 "name": "BaseBdev3", 00:19:27.638 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:27.638 "is_configured": true, 00:19:27.638 "data_offset": 2048, 00:19:27.638 "data_size": 63488 00:19:27.638 } 00:19:27.638 ] 00:19:27.638 }' 00:19:27.638 06:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.638 06:35:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:28.201 06:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.201 06:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:28.458 06:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:28.458 06:35:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:28.715 [2024-07-25 06:35:42.085762] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:28.715 BaseBdev1 00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 
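For reference, the RPC sequence this test is driving can be reproduced by hand against a running SPDK application; the following is a minimal sketch, not taken verbatim from the job, assuming an SPDK checkout and a target listening on the /var/tmp/spdk-raid.sock socket used throughout this trace (bdev names and sizes mirror the ones above):

  # create three 32 MiB malloc base bdevs with 512-byte blocks
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
  # assemble them into a raid1 volume with an on-disk superblock (-s)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  # dump the raid bdev and inspect its reported state (configuring/online/offline)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

The verify_raid_bdev_state helper seen in the trace compares fields of that same JSON (state, num_base_bdevs_discovered, num_base_bdevs_operational, base_bdevs_list) against the expected values after each create, remove, and re-add step.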
00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:28.715 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.972 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:29.230 [ 00:19:29.230 { 00:19:29.230 "name": "BaseBdev1", 00:19:29.230 "aliases": [ 00:19:29.230 "c69c3c7d-704e-4920-a59b-c7e8b3a6265b" 00:19:29.230 ], 00:19:29.230 "product_name": "Malloc disk", 00:19:29.230 "block_size": 512, 00:19:29.230 "num_blocks": 65536, 00:19:29.230 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:29.230 "assigned_rate_limits": { 00:19:29.230 "rw_ios_per_sec": 0, 00:19:29.230 "rw_mbytes_per_sec": 0, 00:19:29.230 "r_mbytes_per_sec": 0, 00:19:29.230 "w_mbytes_per_sec": 0 00:19:29.230 }, 00:19:29.230 "claimed": true, 00:19:29.230 "claim_type": "exclusive_write", 00:19:29.230 "zoned": false, 00:19:29.230 "supported_io_types": { 00:19:29.230 "read": true, 00:19:29.230 "write": true, 00:19:29.230 "unmap": true, 00:19:29.230 "flush": true, 00:19:29.230 "reset": true, 00:19:29.230 "nvme_admin": false, 00:19:29.230 "nvme_io": false, 00:19:29.230 "nvme_io_md": false, 00:19:29.230 "write_zeroes": true, 00:19:29.230 "zcopy": true, 00:19:29.230 "get_zone_info": false, 00:19:29.230 "zone_management": false, 00:19:29.230 "zone_append": false, 00:19:29.230 "compare": false, 00:19:29.230 "compare_and_write": false, 00:19:29.230 "abort": true, 00:19:29.230 "seek_hole": false, 00:19:29.230 "seek_data": false, 00:19:29.230 "copy": true, 00:19:29.230 "nvme_iov_md": false 00:19:29.230 }, 00:19:29.230 "memory_domains": [ 00:19:29.230 { 00:19:29.230 "dma_device_id": "system", 00:19:29.230 "dma_device_type": 1 00:19:29.230 }, 00:19:29.230 { 00:19:29.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.230 "dma_device_type": 2 00:19:29.230 } 00:19:29.230 ], 00:19:29.230 "driver_specific": {} 00:19:29.230 } 00:19:29.230 ] 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.230 06:35:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.230 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.488 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.488 "name": "Existed_Raid", 00:19:29.488 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:29.488 "strip_size_kb": 0, 00:19:29.488 "state": "configuring", 00:19:29.488 "raid_level": "raid1", 00:19:29.488 "superblock": true, 00:19:29.488 "num_base_bdevs": 3, 00:19:29.488 "num_base_bdevs_discovered": 2, 00:19:29.488 "num_base_bdevs_operational": 3, 00:19:29.488 "base_bdevs_list": [ 00:19:29.488 { 00:19:29.488 "name": "BaseBdev1", 00:19:29.488 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:29.488 "is_configured": true, 00:19:29.488 "data_offset": 2048, 00:19:29.488 "data_size": 63488 00:19:29.488 }, 00:19:29.488 { 00:19:29.488 "name": null, 00:19:29.488 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:29.488 "is_configured": false, 00:19:29.488 "data_offset": 2048, 00:19:29.488 "data_size": 63488 00:19:29.488 }, 00:19:29.488 { 00:19:29.488 "name": "BaseBdev3", 00:19:29.488 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:29.488 "is_configured": true, 00:19:29.488 "data_offset": 2048, 00:19:29.488 "data_size": 63488 00:19:29.488 } 00:19:29.488 ] 00:19:29.489 }' 00:19:29.489 06:35:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.489 06:35:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:30.052 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.052 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:30.052 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:30.052 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:30.308 [2024-07-25 06:35:43.810442] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.308 06:35:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:30.565 06:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:30.565 "name": "Existed_Raid", 00:19:30.565 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:30.565 "strip_size_kb": 0, 00:19:30.565 "state": "configuring", 00:19:30.565 "raid_level": "raid1", 00:19:30.565 "superblock": true, 00:19:30.565 "num_base_bdevs": 3, 00:19:30.565 "num_base_bdevs_discovered": 1, 00:19:30.565 "num_base_bdevs_operational": 3, 00:19:30.565 "base_bdevs_list": [ 00:19:30.565 { 00:19:30.565 "name": "BaseBdev1", 00:19:30.565 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:30.565 "is_configured": true, 00:19:30.565 "data_offset": 2048, 00:19:30.565 "data_size": 63488 00:19:30.565 }, 00:19:30.565 { 00:19:30.565 "name": null, 00:19:30.565 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:30.565 "is_configured": false, 00:19:30.565 "data_offset": 2048, 00:19:30.565 "data_size": 63488 00:19:30.565 }, 00:19:30.565 { 00:19:30.565 "name": null, 00:19:30.565 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:30.565 "is_configured": false, 00:19:30.565 "data_offset": 2048, 00:19:30.565 "data_size": 63488 00:19:30.565 } 00:19:30.565 ] 00:19:30.565 }' 00:19:30.565 06:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:30.565 06:35:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:31.129 06:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.129 06:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:31.387 06:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:31.387 06:35:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:31.652 [2024-07-25 06:35:45.061748] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.652 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.937 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.937 "name": "Existed_Raid", 00:19:31.937 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:31.937 "strip_size_kb": 0, 00:19:31.937 "state": "configuring", 00:19:31.937 "raid_level": "raid1", 00:19:31.937 "superblock": true, 00:19:31.937 "num_base_bdevs": 3, 00:19:31.937 "num_base_bdevs_discovered": 2, 00:19:31.937 "num_base_bdevs_operational": 3, 00:19:31.937 "base_bdevs_list": [ 00:19:31.937 { 00:19:31.937 "name": "BaseBdev1", 00:19:31.937 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:31.937 "is_configured": true, 00:19:31.937 "data_offset": 2048, 00:19:31.937 "data_size": 63488 00:19:31.937 }, 00:19:31.937 { 00:19:31.937 "name": null, 00:19:31.937 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:31.937 "is_configured": false, 00:19:31.937 "data_offset": 2048, 00:19:31.937 "data_size": 63488 00:19:31.937 }, 00:19:31.937 { 00:19:31.937 "name": "BaseBdev3", 00:19:31.937 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:31.937 "is_configured": true, 00:19:31.937 "data_offset": 2048, 00:19:31.937 "data_size": 63488 00:19:31.937 } 00:19:31.937 ] 00:19:31.937 }' 00:19:31.937 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.937 06:35:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:32.502 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.502 06:35:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:32.760 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:32.760 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:33.017 [2024-07-25 06:35:46.329156] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:33.018 06:35:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.018 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.275 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.275 "name": "Existed_Raid", 00:19:33.275 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:33.275 "strip_size_kb": 0, 00:19:33.275 "state": "configuring", 00:19:33.275 "raid_level": "raid1", 00:19:33.275 "superblock": true, 00:19:33.275 "num_base_bdevs": 3, 00:19:33.275 "num_base_bdevs_discovered": 1, 00:19:33.275 "num_base_bdevs_operational": 3, 00:19:33.275 "base_bdevs_list": [ 00:19:33.275 { 00:19:33.275 "name": null, 00:19:33.275 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:33.275 "is_configured": false, 00:19:33.275 "data_offset": 2048, 00:19:33.275 "data_size": 63488 00:19:33.275 }, 00:19:33.275 { 00:19:33.275 "name": null, 00:19:33.275 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:33.275 "is_configured": false, 00:19:33.275 "data_offset": 2048, 00:19:33.275 "data_size": 63488 00:19:33.275 }, 00:19:33.275 { 00:19:33.275 "name": "BaseBdev3", 00:19:33.275 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:33.275 "is_configured": true, 00:19:33.275 "data_offset": 2048, 00:19:33.275 "data_size": 63488 00:19:33.275 } 00:19:33.275 ] 00:19:33.275 }' 00:19:33.275 06:35:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.275 06:35:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:33.841 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:33.841 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.841 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:33.841 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:34.099 [2024-07-25 06:35:47.590773] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.099 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.357 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.357 "name": "Existed_Raid", 00:19:34.357 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:34.357 "strip_size_kb": 0, 00:19:34.357 "state": "configuring", 00:19:34.357 "raid_level": "raid1", 00:19:34.357 "superblock": true, 00:19:34.357 "num_base_bdevs": 3, 00:19:34.357 "num_base_bdevs_discovered": 2, 00:19:34.357 "num_base_bdevs_operational": 3, 00:19:34.357 "base_bdevs_list": [ 00:19:34.357 { 00:19:34.357 "name": null, 00:19:34.357 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:34.357 "is_configured": false, 00:19:34.357 "data_offset": 2048, 00:19:34.357 "data_size": 63488 00:19:34.357 }, 00:19:34.357 { 00:19:34.357 "name": "BaseBdev2", 00:19:34.357 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:34.357 "is_configured": true, 00:19:34.357 "data_offset": 2048, 00:19:34.357 "data_size": 63488 00:19:34.357 }, 00:19:34.357 { 00:19:34.357 "name": "BaseBdev3", 00:19:34.357 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:34.357 "is_configured": true, 00:19:34.357 "data_offset": 2048, 00:19:34.357 "data_size": 63488 00:19:34.357 } 00:19:34.357 ] 00:19:34.357 }' 00:19:34.357 06:35:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.357 06:35:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:34.923 06:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.923 06:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:35.181 06:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:35.181 06:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.181 06:35:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:35.438 06:35:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c69c3c7d-704e-4920-a59b-c7e8b3a6265b 00:19:35.696 [2024-07-25 06:35:49.093914] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:35.696 [2024-07-25 06:35:49.094048] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22533e0 00:19:35.696 [2024-07-25 06:35:49.094059] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:35.696 [2024-07-25 06:35:49.094222] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2095b70 00:19:35.696 [2024-07-25 06:35:49.094329] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22533e0 00:19:35.696 [2024-07-25 06:35:49.094338] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22533e0 00:19:35.696 [2024-07-25 06:35:49.094420] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:35.696 NewBaseBdev 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:35.696 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:35.953 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:36.211 [ 00:19:36.211 { 00:19:36.211 "name": "NewBaseBdev", 00:19:36.211 "aliases": [ 00:19:36.211 "c69c3c7d-704e-4920-a59b-c7e8b3a6265b" 00:19:36.211 ], 00:19:36.211 "product_name": "Malloc disk", 00:19:36.211 "block_size": 512, 00:19:36.211 "num_blocks": 65536, 00:19:36.211 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:36.211 "assigned_rate_limits": { 00:19:36.211 "rw_ios_per_sec": 0, 00:19:36.211 "rw_mbytes_per_sec": 0, 00:19:36.211 "r_mbytes_per_sec": 0, 00:19:36.211 "w_mbytes_per_sec": 0 00:19:36.211 }, 00:19:36.211 "claimed": true, 00:19:36.211 "claim_type": "exclusive_write", 00:19:36.211 "zoned": false, 00:19:36.211 "supported_io_types": { 00:19:36.211 "read": true, 00:19:36.211 "write": true, 00:19:36.211 "unmap": true, 00:19:36.211 "flush": true, 00:19:36.211 "reset": true, 00:19:36.211 "nvme_admin": false, 00:19:36.211 "nvme_io": false, 00:19:36.211 "nvme_io_md": false, 00:19:36.211 "write_zeroes": true, 00:19:36.211 "zcopy": true, 00:19:36.211 "get_zone_info": false, 00:19:36.211 "zone_management": false, 00:19:36.211 "zone_append": false, 00:19:36.211 "compare": false, 00:19:36.211 "compare_and_write": false, 00:19:36.211 "abort": true, 00:19:36.211 "seek_hole": false, 00:19:36.211 "seek_data": false, 00:19:36.211 "copy": true, 
00:19:36.211 "nvme_iov_md": false 00:19:36.211 }, 00:19:36.211 "memory_domains": [ 00:19:36.211 { 00:19:36.211 "dma_device_id": "system", 00:19:36.211 "dma_device_type": 1 00:19:36.211 }, 00:19:36.211 { 00:19:36.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.211 "dma_device_type": 2 00:19:36.211 } 00:19:36.211 ], 00:19:36.211 "driver_specific": {} 00:19:36.211 } 00:19:36.211 ] 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.211 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.468 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.468 "name": "Existed_Raid", 00:19:36.468 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:36.468 "strip_size_kb": 0, 00:19:36.468 "state": "online", 00:19:36.468 "raid_level": "raid1", 00:19:36.468 "superblock": true, 00:19:36.468 "num_base_bdevs": 3, 00:19:36.468 "num_base_bdevs_discovered": 3, 00:19:36.468 "num_base_bdevs_operational": 3, 00:19:36.468 "base_bdevs_list": [ 00:19:36.468 { 00:19:36.468 "name": "NewBaseBdev", 00:19:36.468 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:36.468 "is_configured": true, 00:19:36.468 "data_offset": 2048, 00:19:36.468 "data_size": 63488 00:19:36.468 }, 00:19:36.468 { 00:19:36.468 "name": "BaseBdev2", 00:19:36.468 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:36.468 "is_configured": true, 00:19:36.468 "data_offset": 2048, 00:19:36.468 "data_size": 63488 00:19:36.468 }, 00:19:36.468 { 00:19:36.468 "name": "BaseBdev3", 00:19:36.468 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:36.468 "is_configured": true, 00:19:36.468 "data_offset": 2048, 00:19:36.468 "data_size": 63488 00:19:36.468 } 00:19:36.468 ] 00:19:36.468 }' 00:19:36.468 06:35:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.468 06:35:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties 
Existed_Raid 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:37.032 [2024-07-25 06:35:50.562080] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:37.032 "name": "Existed_Raid", 00:19:37.032 "aliases": [ 00:19:37.032 "c2e6ff37-ca02-4356-9a61-648aac18477b" 00:19:37.032 ], 00:19:37.032 "product_name": "Raid Volume", 00:19:37.032 "block_size": 512, 00:19:37.032 "num_blocks": 63488, 00:19:37.032 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:37.032 "assigned_rate_limits": { 00:19:37.032 "rw_ios_per_sec": 0, 00:19:37.032 "rw_mbytes_per_sec": 0, 00:19:37.032 "r_mbytes_per_sec": 0, 00:19:37.032 "w_mbytes_per_sec": 0 00:19:37.032 }, 00:19:37.032 "claimed": false, 00:19:37.032 "zoned": false, 00:19:37.032 "supported_io_types": { 00:19:37.032 "read": true, 00:19:37.032 "write": true, 00:19:37.032 "unmap": false, 00:19:37.032 "flush": false, 00:19:37.032 "reset": true, 00:19:37.032 "nvme_admin": false, 00:19:37.032 "nvme_io": false, 00:19:37.032 "nvme_io_md": false, 00:19:37.032 "write_zeroes": true, 00:19:37.032 "zcopy": false, 00:19:37.032 "get_zone_info": false, 00:19:37.032 "zone_management": false, 00:19:37.032 "zone_append": false, 00:19:37.032 "compare": false, 00:19:37.032 "compare_and_write": false, 00:19:37.032 "abort": false, 00:19:37.032 "seek_hole": false, 00:19:37.032 "seek_data": false, 00:19:37.032 "copy": false, 00:19:37.032 "nvme_iov_md": false 00:19:37.032 }, 00:19:37.032 "memory_domains": [ 00:19:37.032 { 00:19:37.032 "dma_device_id": "system", 00:19:37.032 "dma_device_type": 1 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.032 "dma_device_type": 2 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "dma_device_id": "system", 00:19:37.032 "dma_device_type": 1 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.032 "dma_device_type": 2 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "dma_device_id": "system", 00:19:37.032 "dma_device_type": 1 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.032 "dma_device_type": 2 00:19:37.032 } 00:19:37.032 ], 00:19:37.032 "driver_specific": { 00:19:37.032 "raid": { 00:19:37.032 "uuid": "c2e6ff37-ca02-4356-9a61-648aac18477b", 00:19:37.032 "strip_size_kb": 0, 00:19:37.032 "state": "online", 00:19:37.032 "raid_level": "raid1", 00:19:37.032 "superblock": true, 00:19:37.032 "num_base_bdevs": 3, 00:19:37.032 "num_base_bdevs_discovered": 3, 00:19:37.032 "num_base_bdevs_operational": 3, 00:19:37.032 "base_bdevs_list": [ 00:19:37.032 { 00:19:37.032 "name": "NewBaseBdev", 
00:19:37.032 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:37.032 "is_configured": true, 00:19:37.032 "data_offset": 2048, 00:19:37.032 "data_size": 63488 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "name": "BaseBdev2", 00:19:37.032 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:37.032 "is_configured": true, 00:19:37.032 "data_offset": 2048, 00:19:37.032 "data_size": 63488 00:19:37.032 }, 00:19:37.032 { 00:19:37.032 "name": "BaseBdev3", 00:19:37.032 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:37.032 "is_configured": true, 00:19:37.032 "data_offset": 2048, 00:19:37.032 "data_size": 63488 00:19:37.032 } 00:19:37.032 ] 00:19:37.032 } 00:19:37.032 } 00:19:37.032 }' 00:19:37.032 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:37.290 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:37.290 BaseBdev2 00:19:37.290 BaseBdev3' 00:19:37.290 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.290 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:37.290 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.548 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.548 "name": "NewBaseBdev", 00:19:37.548 "aliases": [ 00:19:37.548 "c69c3c7d-704e-4920-a59b-c7e8b3a6265b" 00:19:37.548 ], 00:19:37.548 "product_name": "Malloc disk", 00:19:37.548 "block_size": 512, 00:19:37.548 "num_blocks": 65536, 00:19:37.548 "uuid": "c69c3c7d-704e-4920-a59b-c7e8b3a6265b", 00:19:37.548 "assigned_rate_limits": { 00:19:37.548 "rw_ios_per_sec": 0, 00:19:37.548 "rw_mbytes_per_sec": 0, 00:19:37.548 "r_mbytes_per_sec": 0, 00:19:37.548 "w_mbytes_per_sec": 0 00:19:37.548 }, 00:19:37.548 "claimed": true, 00:19:37.548 "claim_type": "exclusive_write", 00:19:37.548 "zoned": false, 00:19:37.548 "supported_io_types": { 00:19:37.548 "read": true, 00:19:37.548 "write": true, 00:19:37.548 "unmap": true, 00:19:37.548 "flush": true, 00:19:37.548 "reset": true, 00:19:37.548 "nvme_admin": false, 00:19:37.548 "nvme_io": false, 00:19:37.548 "nvme_io_md": false, 00:19:37.548 "write_zeroes": true, 00:19:37.548 "zcopy": true, 00:19:37.548 "get_zone_info": false, 00:19:37.548 "zone_management": false, 00:19:37.548 "zone_append": false, 00:19:37.548 "compare": false, 00:19:37.548 "compare_and_write": false, 00:19:37.548 "abort": true, 00:19:37.548 "seek_hole": false, 00:19:37.548 "seek_data": false, 00:19:37.548 "copy": true, 00:19:37.548 "nvme_iov_md": false 00:19:37.548 }, 00:19:37.548 "memory_domains": [ 00:19:37.548 { 00:19:37.548 "dma_device_id": "system", 00:19:37.548 "dma_device_type": 1 00:19:37.548 }, 00:19:37.548 { 00:19:37.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.548 "dma_device_type": 2 00:19:37.548 } 00:19:37.548 ], 00:19:37.548 "driver_specific": {} 00:19:37.548 }' 00:19:37.548 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.548 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.548 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:37.548 06:35:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.548 06:35:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.548 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:37.548 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.548 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:37.807 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.065 "name": "BaseBdev2", 00:19:38.065 "aliases": [ 00:19:38.065 "dcbc162a-397c-472a-85e2-87a728f19fd8" 00:19:38.065 ], 00:19:38.065 "product_name": "Malloc disk", 00:19:38.065 "block_size": 512, 00:19:38.065 "num_blocks": 65536, 00:19:38.065 "uuid": "dcbc162a-397c-472a-85e2-87a728f19fd8", 00:19:38.065 "assigned_rate_limits": { 00:19:38.065 "rw_ios_per_sec": 0, 00:19:38.065 "rw_mbytes_per_sec": 0, 00:19:38.065 "r_mbytes_per_sec": 0, 00:19:38.065 "w_mbytes_per_sec": 0 00:19:38.065 }, 00:19:38.065 "claimed": true, 00:19:38.065 "claim_type": "exclusive_write", 00:19:38.065 "zoned": false, 00:19:38.065 "supported_io_types": { 00:19:38.065 "read": true, 00:19:38.065 "write": true, 00:19:38.065 "unmap": true, 00:19:38.065 "flush": true, 00:19:38.065 "reset": true, 00:19:38.065 "nvme_admin": false, 00:19:38.065 "nvme_io": false, 00:19:38.065 "nvme_io_md": false, 00:19:38.065 "write_zeroes": true, 00:19:38.065 "zcopy": true, 00:19:38.065 "get_zone_info": false, 00:19:38.065 "zone_management": false, 00:19:38.065 "zone_append": false, 00:19:38.065 "compare": false, 00:19:38.065 "compare_and_write": false, 00:19:38.065 "abort": true, 00:19:38.065 "seek_hole": false, 00:19:38.065 "seek_data": false, 00:19:38.065 "copy": true, 00:19:38.065 "nvme_iov_md": false 00:19:38.065 }, 00:19:38.065 "memory_domains": [ 00:19:38.065 { 00:19:38.065 "dma_device_id": "system", 00:19:38.065 "dma_device_type": 1 00:19:38.065 }, 00:19:38.065 { 00:19:38.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.065 "dma_device_type": 2 00:19:38.065 } 00:19:38.065 ], 00:19:38.065 "driver_specific": {} 00:19:38.065 }' 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.065 06:35:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.065 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:38.324 06:35:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.583 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.583 "name": "BaseBdev3", 00:19:38.583 "aliases": [ 00:19:38.583 "e7edbc8a-1270-42aa-b0a1-68d125b4ea94" 00:19:38.583 ], 00:19:38.583 "product_name": "Malloc disk", 00:19:38.583 "block_size": 512, 00:19:38.583 "num_blocks": 65536, 00:19:38.583 "uuid": "e7edbc8a-1270-42aa-b0a1-68d125b4ea94", 00:19:38.583 "assigned_rate_limits": { 00:19:38.583 "rw_ios_per_sec": 0, 00:19:38.583 "rw_mbytes_per_sec": 0, 00:19:38.583 "r_mbytes_per_sec": 0, 00:19:38.583 "w_mbytes_per_sec": 0 00:19:38.583 }, 00:19:38.583 "claimed": true, 00:19:38.583 "claim_type": "exclusive_write", 00:19:38.583 "zoned": false, 00:19:38.583 "supported_io_types": { 00:19:38.583 "read": true, 00:19:38.583 "write": true, 00:19:38.583 "unmap": true, 00:19:38.583 "flush": true, 00:19:38.583 "reset": true, 00:19:38.583 "nvme_admin": false, 00:19:38.583 "nvme_io": false, 00:19:38.583 "nvme_io_md": false, 00:19:38.583 "write_zeroes": true, 00:19:38.583 "zcopy": true, 00:19:38.583 "get_zone_info": false, 00:19:38.583 "zone_management": false, 00:19:38.583 "zone_append": false, 00:19:38.583 "compare": false, 00:19:38.583 "compare_and_write": false, 00:19:38.583 "abort": true, 00:19:38.583 "seek_hole": false, 00:19:38.583 "seek_data": false, 00:19:38.583 "copy": true, 00:19:38.583 "nvme_iov_md": false 00:19:38.583 }, 00:19:38.583 "memory_domains": [ 00:19:38.583 { 00:19:38.583 "dma_device_id": "system", 00:19:38.583 "dma_device_type": 1 00:19:38.583 }, 00:19:38.583 { 00:19:38.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.583 "dma_device_type": 2 00:19:38.583 } 00:19:38.583 ], 00:19:38.583 "driver_specific": {} 00:19:38.583 }' 00:19:38.583 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.583 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.583 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.583 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.842 06:35:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.842 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:39.101 [2024-07-25 06:35:52.579155] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:39.101 [2024-07-25 06:35:52.579176] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:39.101 [2024-07-25 06:35:52.579225] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:39.101 [2024-07-25 06:35:52.579451] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:39.101 [2024-07-25 06:35:52.579462] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22533e0 name Existed_Raid, state offline 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1155768 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1155768 ']' 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1155768 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1155768 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1155768' 00:19:39.101 killing process with pid 1155768 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1155768 00:19:39.101 [2024-07-25 06:35:52.656323] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:39.101 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1155768 00:19:39.360 [2024-07-25 06:35:52.680085] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:39.360 06:35:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:39.360 00:19:39.360 real 0m26.715s 00:19:39.360 user 0m48.831s 00:19:39.360 sys 0m5.061s 00:19:39.360 06:35:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:39.360 06:35:52 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:19:39.360 ************************************ 00:19:39.360 END TEST raid_state_function_test_sb 00:19:39.360 ************************************ 00:19:39.360 06:35:52 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:19:39.360 06:35:52 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:19:39.360 06:35:52 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:39.360 06:35:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:39.619 ************************************ 00:19:39.619 START TEST raid_superblock_test 00:19:39.619 ************************************ 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 3 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1160875 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1160875 /var/tmp/spdk-raid.sock 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1160875 ']' 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:39.619 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
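For reference, the raid_superblock_test setup that the following output records reduces to a short RPC sequence against the test's dedicated /var/tmp/spdk-raid.sock socket. This is a minimal sketch assembled only from the rpc.py invocations that appear later in this log; rpc.py abbreviates the full /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py path, and the malloc*/pt* names, the 32 MB x 512-byte malloc geometry, and the zero-padded passthru UUIDs are the test's own values, not additions:

    # create a 32 MB malloc bdev with 512-byte blocks, then wrap it in a passthru bdev with a fixed UUID
    rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
    rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    # ...the same pair is repeated for malloc2/pt2 and malloc3/pt3...
    # assemble the three passthru bdevs into a raid1 volume; the trailing -s requests a superblock
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s
    # dump the resulting raid bdev and pick out its entry for the state checks
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The -s socket argument keeps every call on the test's private RPC endpoint rather than a default SPDK instance, and the superblock requested at creation is what the "superblock": true field in the bdev_raid_get_bdevs dumps below reflects.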
00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:39.619 06:35:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.619 [2024-07-25 06:35:52.999759] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:19:39.619 [2024-07-25 06:35:52.999818] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1160875 ] 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.0 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.1 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.2 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.3 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.4 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.5 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.6 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:01.7 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.0 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.1 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.2 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.3 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.4 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.5 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.6 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3d:02.7 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.0 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.1 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.2 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.3 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 
0000:3f:01.4 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.5 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.6 cannot be used 00:19:39.619 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.619 EAL: Requested device 0000:3f:01.7 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.0 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.1 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.2 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.3 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.4 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.5 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.6 cannot be used 00:19:39.620 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:19:39.620 EAL: Requested device 0000:3f:02.7 cannot be used 00:19:39.620 [2024-07-25 06:35:53.135689] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:39.879 [2024-07-25 06:35:53.180419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:39.879 [2024-07-25 06:35:53.243278] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:39.879 [2024-07-25 06:35:53.243320] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:40.445 06:35:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:40.702 malloc1 00:19:40.702 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:40.959 [2024-07-25 06:35:54.326200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:40.959 [2024-07-25 06:35:54.326240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.959 [2024-07-25 06:35:54.326258] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1933d70 00:19:40.959 [2024-07-25 06:35:54.326271] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.959 [2024-07-25 06:35:54.327727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.959 [2024-07-25 06:35:54.327753] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:40.959 pt1 00:19:40.959 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:40.960 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:41.217 malloc2 00:19:41.217 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:41.217 [2024-07-25 06:35:54.759815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:41.217 [2024-07-25 06:35:54.759853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.217 [2024-07-25 06:35:54.759869] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1782790 00:19:41.217 [2024-07-25 06:35:54.759881] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.217 [2024-07-25 06:35:54.761170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.217 [2024-07-25 06:35:54.761196] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:41.217 pt2 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:41.475 06:35:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:41.475 malloc3 00:19:41.475 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:41.732 [2024-07-25 06:35:55.205358] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:41.732 [2024-07-25 06:35:55.205398] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.732 [2024-07-25 06:35:55.205415] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19278c0 00:19:41.732 [2024-07-25 06:35:55.205426] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.732 [2024-07-25 06:35:55.206730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.732 [2024-07-25 06:35:55.206757] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:41.732 pt3 00:19:41.732 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:41.732 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:41.732 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:19:41.990 [2024-07-25 06:35:55.429953] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:41.990 [2024-07-25 06:35:55.431088] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:41.990 [2024-07-25 06:35:55.431152] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:41.990 [2024-07-25 06:35:55.431296] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19290e0 00:19:41.990 [2024-07-25 06:35:55.431306] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:41.990 [2024-07-25 06:35:55.431480] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1779bd0 00:19:41.990 [2024-07-25 06:35:55.431614] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19290e0 00:19:41.990 [2024-07-25 06:35:55.431624] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19290e0 00:19:41.990 [2024-07-25 06:35:55.431711] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.990 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:42.248 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.248 "name": "raid_bdev1", 00:19:42.248 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:42.248 "strip_size_kb": 0, 00:19:42.248 "state": "online", 00:19:42.248 "raid_level": "raid1", 00:19:42.248 "superblock": true, 00:19:42.248 "num_base_bdevs": 3, 00:19:42.248 "num_base_bdevs_discovered": 3, 00:19:42.248 "num_base_bdevs_operational": 3, 00:19:42.248 "base_bdevs_list": [ 00:19:42.248 { 00:19:42.248 "name": "pt1", 00:19:42.248 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:42.248 "is_configured": true, 00:19:42.248 "data_offset": 2048, 00:19:42.248 "data_size": 63488 00:19:42.248 }, 00:19:42.248 { 00:19:42.248 "name": "pt2", 00:19:42.248 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:42.248 "is_configured": true, 00:19:42.248 "data_offset": 2048, 00:19:42.248 "data_size": 63488 00:19:42.248 }, 00:19:42.248 { 00:19:42.248 "name": "pt3", 00:19:42.248 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:42.248 "is_configured": true, 00:19:42.248 "data_offset": 2048, 00:19:42.248 "data_size": 63488 00:19:42.248 } 00:19:42.248 ] 00:19:42.248 }' 00:19:42.248 06:35:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.248 06:35:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:42.812 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:43.070 [2024-07-25 06:35:56.456901] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:43.070 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:43.070 "name": "raid_bdev1", 00:19:43.070 "aliases": [ 00:19:43.070 
"63f29f0f-25c0-4646-b142-3639dbd2c9c0" 00:19:43.070 ], 00:19:43.070 "product_name": "Raid Volume", 00:19:43.070 "block_size": 512, 00:19:43.070 "num_blocks": 63488, 00:19:43.070 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:43.070 "assigned_rate_limits": { 00:19:43.070 "rw_ios_per_sec": 0, 00:19:43.070 "rw_mbytes_per_sec": 0, 00:19:43.070 "r_mbytes_per_sec": 0, 00:19:43.070 "w_mbytes_per_sec": 0 00:19:43.070 }, 00:19:43.070 "claimed": false, 00:19:43.070 "zoned": false, 00:19:43.070 "supported_io_types": { 00:19:43.070 "read": true, 00:19:43.070 "write": true, 00:19:43.070 "unmap": false, 00:19:43.070 "flush": false, 00:19:43.070 "reset": true, 00:19:43.070 "nvme_admin": false, 00:19:43.070 "nvme_io": false, 00:19:43.070 "nvme_io_md": false, 00:19:43.070 "write_zeroes": true, 00:19:43.070 "zcopy": false, 00:19:43.070 "get_zone_info": false, 00:19:43.070 "zone_management": false, 00:19:43.070 "zone_append": false, 00:19:43.070 "compare": false, 00:19:43.070 "compare_and_write": false, 00:19:43.070 "abort": false, 00:19:43.070 "seek_hole": false, 00:19:43.070 "seek_data": false, 00:19:43.070 "copy": false, 00:19:43.070 "nvme_iov_md": false 00:19:43.070 }, 00:19:43.070 "memory_domains": [ 00:19:43.070 { 00:19:43.070 "dma_device_id": "system", 00:19:43.070 "dma_device_type": 1 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.070 "dma_device_type": 2 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "dma_device_id": "system", 00:19:43.070 "dma_device_type": 1 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.070 "dma_device_type": 2 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "dma_device_id": "system", 00:19:43.070 "dma_device_type": 1 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.070 "dma_device_type": 2 00:19:43.070 } 00:19:43.070 ], 00:19:43.070 "driver_specific": { 00:19:43.070 "raid": { 00:19:43.070 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:43.070 "strip_size_kb": 0, 00:19:43.070 "state": "online", 00:19:43.070 "raid_level": "raid1", 00:19:43.070 "superblock": true, 00:19:43.070 "num_base_bdevs": 3, 00:19:43.070 "num_base_bdevs_discovered": 3, 00:19:43.070 "num_base_bdevs_operational": 3, 00:19:43.070 "base_bdevs_list": [ 00:19:43.070 { 00:19:43.070 "name": "pt1", 00:19:43.070 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.070 "is_configured": true, 00:19:43.070 "data_offset": 2048, 00:19:43.070 "data_size": 63488 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "name": "pt2", 00:19:43.070 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.070 "is_configured": true, 00:19:43.070 "data_offset": 2048, 00:19:43.070 "data_size": 63488 00:19:43.070 }, 00:19:43.070 { 00:19:43.070 "name": "pt3", 00:19:43.070 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:43.070 "is_configured": true, 00:19:43.070 "data_offset": 2048, 00:19:43.070 "data_size": 63488 00:19:43.070 } 00:19:43.070 ] 00:19:43.070 } 00:19:43.070 } 00:19:43.070 }' 00:19:43.070 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:43.070 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:43.070 pt2 00:19:43.070 pt3' 00:19:43.070 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.070 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:43.070 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.328 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.328 "name": "pt1", 00:19:43.328 "aliases": [ 00:19:43.328 "00000000-0000-0000-0000-000000000001" 00:19:43.328 ], 00:19:43.328 "product_name": "passthru", 00:19:43.328 "block_size": 512, 00:19:43.328 "num_blocks": 65536, 00:19:43.328 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.328 "assigned_rate_limits": { 00:19:43.328 "rw_ios_per_sec": 0, 00:19:43.328 "rw_mbytes_per_sec": 0, 00:19:43.328 "r_mbytes_per_sec": 0, 00:19:43.328 "w_mbytes_per_sec": 0 00:19:43.328 }, 00:19:43.328 "claimed": true, 00:19:43.328 "claim_type": "exclusive_write", 00:19:43.328 "zoned": false, 00:19:43.328 "supported_io_types": { 00:19:43.328 "read": true, 00:19:43.328 "write": true, 00:19:43.328 "unmap": true, 00:19:43.328 "flush": true, 00:19:43.328 "reset": true, 00:19:43.328 "nvme_admin": false, 00:19:43.328 "nvme_io": false, 00:19:43.328 "nvme_io_md": false, 00:19:43.328 "write_zeroes": true, 00:19:43.328 "zcopy": true, 00:19:43.328 "get_zone_info": false, 00:19:43.328 "zone_management": false, 00:19:43.328 "zone_append": false, 00:19:43.328 "compare": false, 00:19:43.328 "compare_and_write": false, 00:19:43.328 "abort": true, 00:19:43.328 "seek_hole": false, 00:19:43.328 "seek_data": false, 00:19:43.328 "copy": true, 00:19:43.328 "nvme_iov_md": false 00:19:43.328 }, 00:19:43.328 "memory_domains": [ 00:19:43.328 { 00:19:43.328 "dma_device_id": "system", 00:19:43.328 "dma_device_type": 1 00:19:43.328 }, 00:19:43.328 { 00:19:43.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.328 "dma_device_type": 2 00:19:43.328 } 00:19:43.328 ], 00:19:43.328 "driver_specific": { 00:19:43.328 "passthru": { 00:19:43.328 "name": "pt1", 00:19:43.328 "base_bdev_name": "malloc1" 00:19:43.328 } 00:19:43.328 } 00:19:43.328 }' 00:19:43.328 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.328 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.328 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.328 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.585 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.585 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.585 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.585 06:35:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.585 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.585 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.585 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.585 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.585 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.585 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:43.585 06:35:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.843 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.843 "name": "pt2", 00:19:43.843 "aliases": [ 00:19:43.843 "00000000-0000-0000-0000-000000000002" 00:19:43.843 ], 00:19:43.843 "product_name": "passthru", 00:19:43.843 "block_size": 512, 00:19:43.843 "num_blocks": 65536, 00:19:43.843 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.843 "assigned_rate_limits": { 00:19:43.843 "rw_ios_per_sec": 0, 00:19:43.843 "rw_mbytes_per_sec": 0, 00:19:43.843 "r_mbytes_per_sec": 0, 00:19:43.843 "w_mbytes_per_sec": 0 00:19:43.843 }, 00:19:43.843 "claimed": true, 00:19:43.843 "claim_type": "exclusive_write", 00:19:43.843 "zoned": false, 00:19:43.843 "supported_io_types": { 00:19:43.843 "read": true, 00:19:43.843 "write": true, 00:19:43.843 "unmap": true, 00:19:43.843 "flush": true, 00:19:43.843 "reset": true, 00:19:43.843 "nvme_admin": false, 00:19:43.843 "nvme_io": false, 00:19:43.843 "nvme_io_md": false, 00:19:43.843 "write_zeroes": true, 00:19:43.843 "zcopy": true, 00:19:43.843 "get_zone_info": false, 00:19:43.843 "zone_management": false, 00:19:43.843 "zone_append": false, 00:19:43.843 "compare": false, 00:19:43.843 "compare_and_write": false, 00:19:43.843 "abort": true, 00:19:43.843 "seek_hole": false, 00:19:43.843 "seek_data": false, 00:19:43.843 "copy": true, 00:19:43.843 "nvme_iov_md": false 00:19:43.843 }, 00:19:43.843 "memory_domains": [ 00:19:43.843 { 00:19:43.843 "dma_device_id": "system", 00:19:43.843 "dma_device_type": 1 00:19:43.843 }, 00:19:43.843 { 00:19:43.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.843 "dma_device_type": 2 00:19:43.843 } 00:19:43.843 ], 00:19:43.843 "driver_specific": { 00:19:43.843 "passthru": { 00:19:43.843 "name": "pt2", 00:19:43.843 "base_bdev_name": "malloc2" 00:19:43.843 } 00:19:43.843 } 00:19:43.843 }' 00:19:43.843 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.843 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.100 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.358 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.358 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.358 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:44.358 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.358 06:35:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.358 "name": "pt3", 00:19:44.358 "aliases": [ 00:19:44.358 "00000000-0000-0000-0000-000000000003" 00:19:44.358 ], 00:19:44.358 "product_name": "passthru", 00:19:44.358 "block_size": 512, 00:19:44.358 "num_blocks": 65536, 00:19:44.358 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:44.358 "assigned_rate_limits": { 00:19:44.358 "rw_ios_per_sec": 0, 00:19:44.358 "rw_mbytes_per_sec": 0, 00:19:44.358 "r_mbytes_per_sec": 0, 00:19:44.358 "w_mbytes_per_sec": 0 00:19:44.358 }, 00:19:44.358 "claimed": true, 00:19:44.358 "claim_type": "exclusive_write", 00:19:44.358 "zoned": false, 00:19:44.358 "supported_io_types": { 00:19:44.358 "read": true, 00:19:44.358 "write": true, 00:19:44.358 "unmap": true, 00:19:44.358 "flush": true, 00:19:44.358 "reset": true, 00:19:44.358 "nvme_admin": false, 00:19:44.358 "nvme_io": false, 00:19:44.358 "nvme_io_md": false, 00:19:44.358 "write_zeroes": true, 00:19:44.358 "zcopy": true, 00:19:44.358 "get_zone_info": false, 00:19:44.358 "zone_management": false, 00:19:44.358 "zone_append": false, 00:19:44.358 "compare": false, 00:19:44.358 "compare_and_write": false, 00:19:44.358 "abort": true, 00:19:44.358 "seek_hole": false, 00:19:44.358 "seek_data": false, 00:19:44.358 "copy": true, 00:19:44.358 "nvme_iov_md": false 00:19:44.358 }, 00:19:44.358 "memory_domains": [ 00:19:44.358 { 00:19:44.358 "dma_device_id": "system", 00:19:44.358 "dma_device_type": 1 00:19:44.358 }, 00:19:44.358 { 00:19:44.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.358 "dma_device_type": 2 00:19:44.358 } 00:19:44.358 ], 00:19:44.358 "driver_specific": { 00:19:44.358 "passthru": { 00:19:44.358 "name": "pt3", 00:19:44.358 "base_bdev_name": "malloc3" 00:19:44.358 } 00:19:44.358 } 00:19:44.358 }' 00:19:44.358 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.615 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.615 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.615 06:35:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.615 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.615 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.615 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.615 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.615 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.615 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.873 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.873 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.873 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:44.873 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:19:45.130 [2024-07-25 06:35:58.438117] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.130 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=63f29f0f-25c0-4646-b142-3639dbd2c9c0 00:19:45.130 06:35:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 63f29f0f-25c0-4646-b142-3639dbd2c9c0 ']' 00:19:45.130 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:45.130 [2024-07-25 06:35:58.666463] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:45.130 [2024-07-25 06:35:58.666478] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:45.130 [2024-07-25 06:35:58.666524] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:45.130 [2024-07-25 06:35:58.666589] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:45.130 [2024-07-25 06:35:58.666599] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19290e0 name raid_bdev1, state offline 00:19:45.130 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.387 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:19:45.387 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:19:45.387 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:19:45.387 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:45.387 06:35:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:45.661 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:45.661 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:45.942 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:45.942 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:46.199 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:46.199 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:46.457 06:35:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:19:46.715 [2024-07-25 06:36:00.046046] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:46.715 [2024-07-25 06:36:00.047292] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:46.715 [2024-07-25 06:36:00.047333] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:46.715 [2024-07-25 06:36:00.047375] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:46.715 [2024-07-25 06:36:00.047411] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:46.715 [2024-07-25 06:36:00.047438] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:46.715 [2024-07-25 06:36:00.047455] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:46.715 [2024-07-25 06:36:00.047463] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1778bc0 name raid_bdev1, state configuring 00:19:46.715 request: 00:19:46.715 { 00:19:46.715 "name": "raid_bdev1", 00:19:46.715 "raid_level": "raid1", 00:19:46.715 "base_bdevs": [ 00:19:46.715 "malloc1", 00:19:46.715 "malloc2", 00:19:46.715 "malloc3" 00:19:46.715 ], 00:19:46.715 "superblock": false, 00:19:46.715 "method": "bdev_raid_create", 00:19:46.715 "req_id": 1 00:19:46.715 } 00:19:46.715 Got JSON-RPC error response 00:19:46.715 response: 00:19:46.715 { 00:19:46.715 "code": -17, 00:19:46.715 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:46.715 } 00:19:46.715 06:36:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:19:46.715 06:36:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:19:46.715 06:36:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:19:46.715 06:36:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:19:46.715 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.715 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:46.973 [2024-07-25 06:36:00.507200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:46.973 [2024-07-25 06:36:00.507238] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:46.973 [2024-07-25 06:36:00.507255] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1924f60 00:19:46.973 [2024-07-25 06:36:00.507266] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:46.973 [2024-07-25 06:36:00.508703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:46.973 [2024-07-25 06:36:00.508730] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:46.973 [2024-07-25 06:36:00.508788] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:46.973 [2024-07-25 06:36:00.508810] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:46.973 pt1 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.973 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.231 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.231 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.231 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.231 "name": "raid_bdev1", 00:19:47.231 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:47.231 "strip_size_kb": 0, 00:19:47.231 "state": "configuring", 00:19:47.231 "raid_level": "raid1", 00:19:47.231 "superblock": true, 00:19:47.231 "num_base_bdevs": 3, 00:19:47.231 "num_base_bdevs_discovered": 1, 00:19:47.231 "num_base_bdevs_operational": 3, 00:19:47.231 
"base_bdevs_list": [ 00:19:47.231 { 00:19:47.231 "name": "pt1", 00:19:47.231 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:47.231 "is_configured": true, 00:19:47.231 "data_offset": 2048, 00:19:47.231 "data_size": 63488 00:19:47.231 }, 00:19:47.231 { 00:19:47.231 "name": null, 00:19:47.231 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:47.231 "is_configured": false, 00:19:47.231 "data_offset": 2048, 00:19:47.231 "data_size": 63488 00:19:47.231 }, 00:19:47.231 { 00:19:47.231 "name": null, 00:19:47.231 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:47.231 "is_configured": false, 00:19:47.231 "data_offset": 2048, 00:19:47.231 "data_size": 63488 00:19:47.231 } 00:19:47.231 ] 00:19:47.231 }' 00:19:47.231 06:36:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.231 06:36:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:47.798 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:19:47.798 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:48.055 [2024-07-25 06:36:01.529915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:48.055 [2024-07-25 06:36:01.529967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.055 [2024-07-25 06:36:01.529985] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1777ea0 00:19:48.055 [2024-07-25 06:36:01.529998] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.055 [2024-07-25 06:36:01.530319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.055 [2024-07-25 06:36:01.530337] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:48.055 [2024-07-25 06:36:01.530394] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:48.055 [2024-07-25 06:36:01.530412] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:48.055 pt2 00:19:48.055 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:48.311 [2024-07-25 06:36:01.754513] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.311 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.312 06:36:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:48.568 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.568 "name": "raid_bdev1", 00:19:48.568 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:48.568 "strip_size_kb": 0, 00:19:48.568 "state": "configuring", 00:19:48.568 "raid_level": "raid1", 00:19:48.568 "superblock": true, 00:19:48.568 "num_base_bdevs": 3, 00:19:48.568 "num_base_bdevs_discovered": 1, 00:19:48.568 "num_base_bdevs_operational": 3, 00:19:48.568 "base_bdevs_list": [ 00:19:48.568 { 00:19:48.568 "name": "pt1", 00:19:48.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:48.568 "is_configured": true, 00:19:48.568 "data_offset": 2048, 00:19:48.569 "data_size": 63488 00:19:48.569 }, 00:19:48.569 { 00:19:48.569 "name": null, 00:19:48.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:48.569 "is_configured": false, 00:19:48.569 "data_offset": 2048, 00:19:48.569 "data_size": 63488 00:19:48.569 }, 00:19:48.569 { 00:19:48.569 "name": null, 00:19:48.569 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:48.569 "is_configured": false, 00:19:48.569 "data_offset": 2048, 00:19:48.569 "data_size": 63488 00:19:48.569 } 00:19:48.569 ] 00:19:48.569 }' 00:19:48.569 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.569 06:36:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.134 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:19:49.134 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:49.134 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:49.392 [2024-07-25 06:36:02.769192] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:49.392 [2024-07-25 06:36:02.769242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.392 [2024-07-25 06:36:02.769262] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x192a160 00:19:49.392 [2024-07-25 06:36:02.769274] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.392 [2024-07-25 06:36:02.769575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.392 [2024-07-25 06:36:02.769591] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:49.392 [2024-07-25 06:36:02.769649] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:49.392 [2024-07-25 06:36:02.769667] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:49.392 pt2 00:19:49.392 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:19:49.392 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:49.392 06:36:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:49.651 [2024-07-25 06:36:02.993771] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:49.651 [2024-07-25 06:36:02.993805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.651 [2024-07-25 06:36:02.993820] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1928df0 00:19:49.651 [2024-07-25 06:36:02.993831] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.651 [2024-07-25 06:36:02.994093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.651 [2024-07-25 06:36:02.994108] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:49.651 [2024-07-25 06:36:02.994163] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:49.651 [2024-07-25 06:36:02.994180] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:49.651 [2024-07-25 06:36:02.994274] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1925d50 00:19:49.651 [2024-07-25 06:36:02.994284] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:49.651 [2024-07-25 06:36:02.994433] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x177c910 00:19:49.651 [2024-07-25 06:36:02.994548] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1925d50 00:19:49.651 [2024-07-25 06:36:02.994557] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1925d50 00:19:49.651 [2024-07-25 06:36:02.994641] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:49.651 pt3 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.651 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.910 06:36:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.910 "name": "raid_bdev1", 00:19:49.910 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:49.910 "strip_size_kb": 0, 00:19:49.910 "state": "online", 00:19:49.910 "raid_level": "raid1", 00:19:49.910 "superblock": true, 00:19:49.910 "num_base_bdevs": 3, 00:19:49.910 "num_base_bdevs_discovered": 3, 00:19:49.910 "num_base_bdevs_operational": 3, 00:19:49.910 "base_bdevs_list": [ 00:19:49.910 { 00:19:49.910 "name": "pt1", 00:19:49.910 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:49.910 "is_configured": true, 00:19:49.910 "data_offset": 2048, 00:19:49.910 "data_size": 63488 00:19:49.910 }, 00:19:49.910 { 00:19:49.910 "name": "pt2", 00:19:49.910 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:49.910 "is_configured": true, 00:19:49.910 "data_offset": 2048, 00:19:49.910 "data_size": 63488 00:19:49.910 }, 00:19:49.910 { 00:19:49.910 "name": "pt3", 00:19:49.910 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:49.910 "is_configured": true, 00:19:49.910 "data_offset": 2048, 00:19:49.910 "data_size": 63488 00:19:49.910 } 00:19:49.910 ] 00:19:49.910 }' 00:19:49.910 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.910 06:36:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:50.474 06:36:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:50.474 [2024-07-25 06:36:04.024759] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:50.732 "name": "raid_bdev1", 00:19:50.732 "aliases": [ 00:19:50.732 "63f29f0f-25c0-4646-b142-3639dbd2c9c0" 00:19:50.732 ], 00:19:50.732 "product_name": "Raid Volume", 00:19:50.732 "block_size": 512, 00:19:50.732 "num_blocks": 63488, 00:19:50.732 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:50.732 "assigned_rate_limits": { 00:19:50.732 "rw_ios_per_sec": 0, 00:19:50.732 "rw_mbytes_per_sec": 0, 00:19:50.732 "r_mbytes_per_sec": 0, 00:19:50.732 "w_mbytes_per_sec": 0 00:19:50.732 }, 00:19:50.732 "claimed": false, 00:19:50.732 "zoned": false, 00:19:50.732 "supported_io_types": { 00:19:50.732 "read": true, 00:19:50.732 "write": true, 00:19:50.732 "unmap": false, 00:19:50.732 "flush": false, 00:19:50.732 "reset": true, 00:19:50.732 "nvme_admin": false, 00:19:50.732 "nvme_io": false, 00:19:50.732 "nvme_io_md": false, 00:19:50.732 "write_zeroes": true, 00:19:50.732 "zcopy": false, 00:19:50.732 "get_zone_info": false, 00:19:50.732 "zone_management": false, 00:19:50.732 "zone_append": false, 00:19:50.732 "compare": 
false, 00:19:50.732 "compare_and_write": false, 00:19:50.732 "abort": false, 00:19:50.732 "seek_hole": false, 00:19:50.732 "seek_data": false, 00:19:50.732 "copy": false, 00:19:50.732 "nvme_iov_md": false 00:19:50.732 }, 00:19:50.732 "memory_domains": [ 00:19:50.732 { 00:19:50.732 "dma_device_id": "system", 00:19:50.732 "dma_device_type": 1 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.732 "dma_device_type": 2 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "dma_device_id": "system", 00:19:50.732 "dma_device_type": 1 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.732 "dma_device_type": 2 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "dma_device_id": "system", 00:19:50.732 "dma_device_type": 1 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.732 "dma_device_type": 2 00:19:50.732 } 00:19:50.732 ], 00:19:50.732 "driver_specific": { 00:19:50.732 "raid": { 00:19:50.732 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:50.732 "strip_size_kb": 0, 00:19:50.732 "state": "online", 00:19:50.732 "raid_level": "raid1", 00:19:50.732 "superblock": true, 00:19:50.732 "num_base_bdevs": 3, 00:19:50.732 "num_base_bdevs_discovered": 3, 00:19:50.732 "num_base_bdevs_operational": 3, 00:19:50.732 "base_bdevs_list": [ 00:19:50.732 { 00:19:50.732 "name": "pt1", 00:19:50.732 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.732 "is_configured": true, 00:19:50.732 "data_offset": 2048, 00:19:50.732 "data_size": 63488 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "name": "pt2", 00:19:50.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:50.732 "is_configured": true, 00:19:50.732 "data_offset": 2048, 00:19:50.732 "data_size": 63488 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "name": "pt3", 00:19:50.732 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:50.732 "is_configured": true, 00:19:50.732 "data_offset": 2048, 00:19:50.732 "data_size": 63488 00:19:50.732 } 00:19:50.732 ] 00:19:50.732 } 00:19:50.732 } 00:19:50.732 }' 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:50.732 pt2 00:19:50.732 pt3' 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:50.732 "name": "pt1", 00:19:50.732 "aliases": [ 00:19:50.732 "00000000-0000-0000-0000-000000000001" 00:19:50.732 ], 00:19:50.732 "product_name": "passthru", 00:19:50.732 "block_size": 512, 00:19:50.732 "num_blocks": 65536, 00:19:50.732 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.732 "assigned_rate_limits": { 00:19:50.732 "rw_ios_per_sec": 0, 00:19:50.732 "rw_mbytes_per_sec": 0, 00:19:50.732 "r_mbytes_per_sec": 0, 00:19:50.732 "w_mbytes_per_sec": 0 00:19:50.732 }, 00:19:50.732 "claimed": true, 00:19:50.732 "claim_type": "exclusive_write", 00:19:50.732 "zoned": false, 00:19:50.732 "supported_io_types": { 00:19:50.732 "read": true, 
00:19:50.732 "write": true, 00:19:50.732 "unmap": true, 00:19:50.732 "flush": true, 00:19:50.732 "reset": true, 00:19:50.732 "nvme_admin": false, 00:19:50.732 "nvme_io": false, 00:19:50.732 "nvme_io_md": false, 00:19:50.732 "write_zeroes": true, 00:19:50.732 "zcopy": true, 00:19:50.732 "get_zone_info": false, 00:19:50.732 "zone_management": false, 00:19:50.732 "zone_append": false, 00:19:50.732 "compare": false, 00:19:50.732 "compare_and_write": false, 00:19:50.732 "abort": true, 00:19:50.732 "seek_hole": false, 00:19:50.732 "seek_data": false, 00:19:50.732 "copy": true, 00:19:50.732 "nvme_iov_md": false 00:19:50.732 }, 00:19:50.732 "memory_domains": [ 00:19:50.732 { 00:19:50.732 "dma_device_id": "system", 00:19:50.732 "dma_device_type": 1 00:19:50.732 }, 00:19:50.732 { 00:19:50.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.732 "dma_device_type": 2 00:19:50.732 } 00:19:50.732 ], 00:19:50.732 "driver_specific": { 00:19:50.732 "passthru": { 00:19:50.732 "name": "pt1", 00:19:50.732 "base_bdev_name": "malloc1" 00:19:50.732 } 00:19:50.732 } 00:19:50.732 }' 00:19:50.732 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:50.991 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.249 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.249 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:51.249 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:51.249 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:51.249 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:51.507 "name": "pt2", 00:19:51.507 "aliases": [ 00:19:51.507 "00000000-0000-0000-0000-000000000002" 00:19:51.507 ], 00:19:51.507 "product_name": "passthru", 00:19:51.507 "block_size": 512, 00:19:51.507 "num_blocks": 65536, 00:19:51.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:51.507 "assigned_rate_limits": { 00:19:51.507 "rw_ios_per_sec": 0, 00:19:51.507 "rw_mbytes_per_sec": 0, 00:19:51.507 "r_mbytes_per_sec": 0, 00:19:51.507 "w_mbytes_per_sec": 0 00:19:51.507 }, 00:19:51.507 "claimed": true, 00:19:51.507 "claim_type": "exclusive_write", 00:19:51.507 "zoned": false, 00:19:51.507 "supported_io_types": { 00:19:51.507 "read": true, 00:19:51.507 "write": true, 00:19:51.507 "unmap": true, 00:19:51.507 "flush": true, 00:19:51.507 "reset": true, 00:19:51.507 
"nvme_admin": false, 00:19:51.507 "nvme_io": false, 00:19:51.507 "nvme_io_md": false, 00:19:51.507 "write_zeroes": true, 00:19:51.507 "zcopy": true, 00:19:51.507 "get_zone_info": false, 00:19:51.507 "zone_management": false, 00:19:51.507 "zone_append": false, 00:19:51.507 "compare": false, 00:19:51.507 "compare_and_write": false, 00:19:51.507 "abort": true, 00:19:51.507 "seek_hole": false, 00:19:51.507 "seek_data": false, 00:19:51.507 "copy": true, 00:19:51.507 "nvme_iov_md": false 00:19:51.507 }, 00:19:51.507 "memory_domains": [ 00:19:51.507 { 00:19:51.507 "dma_device_id": "system", 00:19:51.507 "dma_device_type": 1 00:19:51.507 }, 00:19:51.507 { 00:19:51.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.507 "dma_device_type": 2 00:19:51.507 } 00:19:51.507 ], 00:19:51.507 "driver_specific": { 00:19:51.507 "passthru": { 00:19:51.507 "name": "pt2", 00:19:51.507 "base_bdev_name": "malloc2" 00:19:51.507 } 00:19:51.507 } 00:19:51.507 }' 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.507 06:36:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:51.507 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.507 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:51.764 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:52.022 "name": "pt3", 00:19:52.022 "aliases": [ 00:19:52.022 "00000000-0000-0000-0000-000000000003" 00:19:52.022 ], 00:19:52.022 "product_name": "passthru", 00:19:52.022 "block_size": 512, 00:19:52.022 "num_blocks": 65536, 00:19:52.022 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:52.022 "assigned_rate_limits": { 00:19:52.022 "rw_ios_per_sec": 0, 00:19:52.022 "rw_mbytes_per_sec": 0, 00:19:52.022 "r_mbytes_per_sec": 0, 00:19:52.022 "w_mbytes_per_sec": 0 00:19:52.022 }, 00:19:52.022 "claimed": true, 00:19:52.022 "claim_type": "exclusive_write", 00:19:52.022 "zoned": false, 00:19:52.022 "supported_io_types": { 00:19:52.022 "read": true, 00:19:52.022 "write": true, 00:19:52.022 "unmap": true, 00:19:52.022 "flush": true, 00:19:52.022 "reset": true, 00:19:52.022 "nvme_admin": false, 00:19:52.022 "nvme_io": false, 00:19:52.022 "nvme_io_md": false, 00:19:52.022 "write_zeroes": true, 00:19:52.022 
"zcopy": true, 00:19:52.022 "get_zone_info": false, 00:19:52.022 "zone_management": false, 00:19:52.022 "zone_append": false, 00:19:52.022 "compare": false, 00:19:52.022 "compare_and_write": false, 00:19:52.022 "abort": true, 00:19:52.022 "seek_hole": false, 00:19:52.022 "seek_data": false, 00:19:52.022 "copy": true, 00:19:52.022 "nvme_iov_md": false 00:19:52.022 }, 00:19:52.022 "memory_domains": [ 00:19:52.022 { 00:19:52.022 "dma_device_id": "system", 00:19:52.022 "dma_device_type": 1 00:19:52.022 }, 00:19:52.022 { 00:19:52.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.022 "dma_device_type": 2 00:19:52.022 } 00:19:52.022 ], 00:19:52.022 "driver_specific": { 00:19:52.022 "passthru": { 00:19:52.022 "name": "pt3", 00:19:52.022 "base_bdev_name": "malloc3" 00:19:52.022 } 00:19:52.022 } 00:19:52.022 }' 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:52.022 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:52.281 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:19:52.539 [2024-07-25 06:36:05.917703] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.539 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 63f29f0f-25c0-4646-b142-3639dbd2c9c0 '!=' 63f29f0f-25c0-4646-b142-3639dbd2c9c0 ']' 00:19:52.539 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:19:52.539 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:52.539 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:52.539 06:36:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:52.799 [2024-07-25 06:36:06.146085] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.799 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.057 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.057 "name": "raid_bdev1", 00:19:53.057 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:53.057 "strip_size_kb": 0, 00:19:53.057 "state": "online", 00:19:53.057 "raid_level": "raid1", 00:19:53.057 "superblock": true, 00:19:53.057 "num_base_bdevs": 3, 00:19:53.057 "num_base_bdevs_discovered": 2, 00:19:53.057 "num_base_bdevs_operational": 2, 00:19:53.057 "base_bdevs_list": [ 00:19:53.057 { 00:19:53.057 "name": null, 00:19:53.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.057 "is_configured": false, 00:19:53.057 "data_offset": 2048, 00:19:53.057 "data_size": 63488 00:19:53.057 }, 00:19:53.057 { 00:19:53.057 "name": "pt2", 00:19:53.057 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:53.057 "is_configured": true, 00:19:53.057 "data_offset": 2048, 00:19:53.057 "data_size": 63488 00:19:53.057 }, 00:19:53.057 { 00:19:53.057 "name": "pt3", 00:19:53.057 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.057 "is_configured": true, 00:19:53.057 "data_offset": 2048, 00:19:53.057 "data_size": 63488 00:19:53.057 } 00:19:53.057 ] 00:19:53.057 }' 00:19:53.057 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.057 06:36:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.623 06:36:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:53.881 [2024-07-25 06:36:07.184808] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:53.881 [2024-07-25 06:36:07.184834] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:53.881 [2024-07-25 06:36:07.184886] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:53.881 [2024-07-25 06:36:07.184940] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:53.881 [2024-07-25 06:36:07.184951] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1925d50 name raid_bdev1, state offline 00:19:53.881 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.881 06:36:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:19:54.139 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:54.397 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:19:54.397 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:19:54.397 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:19:54.397 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:19:54.397 06:36:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:54.655 [2024-07-25 06:36:08.107183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:54.655 [2024-07-25 06:36:08.107225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:54.655 [2024-07-25 06:36:08.107242] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1781470 00:19:54.655 [2024-07-25 06:36:08.107253] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:54.655 [2024-07-25 06:36:08.108704] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:54.655 [2024-07-25 06:36:08.108732] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:54.655 [2024-07-25 06:36:08.108787] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:54.655 [2024-07-25 06:36:08.108809] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:54.655 pt2 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.655 06:36:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.655 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.914 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.914 "name": "raid_bdev1", 00:19:54.914 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:54.914 "strip_size_kb": 0, 00:19:54.914 "state": "configuring", 00:19:54.914 "raid_level": "raid1", 00:19:54.914 "superblock": true, 00:19:54.914 "num_base_bdevs": 3, 00:19:54.914 "num_base_bdevs_discovered": 1, 00:19:54.914 "num_base_bdevs_operational": 2, 00:19:54.914 "base_bdevs_list": [ 00:19:54.914 { 00:19:54.914 "name": null, 00:19:54.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:54.914 "is_configured": false, 00:19:54.914 "data_offset": 2048, 00:19:54.914 "data_size": 63488 00:19:54.914 }, 00:19:54.914 { 00:19:54.914 "name": "pt2", 00:19:54.914 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:54.914 "is_configured": true, 00:19:54.914 "data_offset": 2048, 00:19:54.914 "data_size": 63488 00:19:54.914 }, 00:19:54.914 { 00:19:54.914 "name": null, 00:19:54.914 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:54.914 "is_configured": false, 00:19:54.914 "data_offset": 2048, 00:19:54.914 "data_size": 63488 00:19:54.914 } 00:19:54.914 ] 00:19:54.914 }' 00:19:54.914 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.914 06:36:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:55.480 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:19:55.480 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:19:55.480 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=2 00:19:55.480 06:36:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:55.738 [2024-07-25 06:36:09.093998] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:55.738 [2024-07-25 06:36:09.094046] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.738 [2024-07-25 06:36:09.094063] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1932e10 00:19:55.738 [2024-07-25 06:36:09.094075] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.738 [2024-07-25 06:36:09.094382] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.738 [2024-07-25 06:36:09.094399] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:55.738 [2024-07-25 06:36:09.094455] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:55.738 [2024-07-25 06:36:09.094472] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:55.738 [2024-07-25 06:36:09.094565] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1928950 00:19:55.738 [2024-07-25 06:36:09.094575] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:55.738 [2024-07-25 06:36:09.094726] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1929010 00:19:55.738 [2024-07-25 06:36:09.094843] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1928950 00:19:55.738 [2024-07-25 06:36:09.094852] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1928950 00:19:55.738 [2024-07-25 06:36:09.094937] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:55.738 pt3 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.738 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:55.995 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.995 "name": "raid_bdev1", 00:19:55.995 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:55.995 "strip_size_kb": 0, 00:19:55.995 "state": "online", 00:19:55.995 "raid_level": "raid1", 00:19:55.995 "superblock": true, 00:19:55.995 "num_base_bdevs": 3, 00:19:55.995 "num_base_bdevs_discovered": 2, 00:19:55.995 "num_base_bdevs_operational": 2, 00:19:55.995 "base_bdevs_list": [ 00:19:55.995 { 00:19:55.995 "name": null, 00:19:55.995 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.995 "is_configured": false, 00:19:55.995 "data_offset": 2048, 00:19:55.995 "data_size": 63488 00:19:55.995 }, 00:19:55.995 { 00:19:55.995 "name": "pt2", 00:19:55.995 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:55.995 "is_configured": true, 00:19:55.995 "data_offset": 2048, 00:19:55.995 "data_size": 63488 00:19:55.995 }, 00:19:55.995 { 00:19:55.995 "name": "pt3", 00:19:55.995 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:55.995 "is_configured": true, 00:19:55.995 "data_offset": 2048, 00:19:55.995 "data_size": 63488 00:19:55.995 } 00:19:55.995 ] 00:19:55.995 }' 00:19:55.995 06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.995 06:36:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.560 
06:36:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:56.817 [2024-07-25 06:36:10.136731] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:56.817 [2024-07-25 06:36:10.136754] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:56.817 [2024-07-25 06:36:10.136803] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:56.817 [2024-07-25 06:36:10.136852] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:56.817 [2024-07-25 06:36:10.136862] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1928950 name raid_bdev1, state offline 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 3 -gt 2 ']' 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=2 00:19:56.817 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:57.074 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:57.332 [2024-07-25 06:36:10.782398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:57.332 [2024-07-25 06:36:10.782439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.332 [2024-07-25 06:36:10.782455] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1928d40 00:19:57.332 [2024-07-25 06:36:10.782466] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.332 [2024-07-25 06:36:10.783920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.332 [2024-07-25 06:36:10.783947] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:57.332 [2024-07-25 06:36:10.784003] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:57.332 [2024-07-25 06:36:10.784025] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:57.332 [2024-07-25 06:36:10.784113] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:57.332 [2024-07-25 06:36:10.784125] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:57.332 [2024-07-25 06:36:10.784148] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x177a0e0 name raid_bdev1, state configuring 00:19:57.332 [2024-07-25 06:36:10.784170] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:57.332 pt1 00:19:57.332 06:36:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 3 -gt 2 ']' 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.332 06:36:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:57.590 06:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:57.590 "name": "raid_bdev1", 00:19:57.590 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:57.590 "strip_size_kb": 0, 00:19:57.590 "state": "configuring", 00:19:57.590 "raid_level": "raid1", 00:19:57.590 "superblock": true, 00:19:57.590 "num_base_bdevs": 3, 00:19:57.590 "num_base_bdevs_discovered": 1, 00:19:57.590 "num_base_bdevs_operational": 2, 00:19:57.590 "base_bdevs_list": [ 00:19:57.590 { 00:19:57.590 "name": null, 00:19:57.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.590 "is_configured": false, 00:19:57.590 "data_offset": 2048, 00:19:57.590 "data_size": 63488 00:19:57.590 }, 00:19:57.590 { 00:19:57.590 "name": "pt2", 00:19:57.590 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:57.590 "is_configured": true, 00:19:57.590 "data_offset": 2048, 00:19:57.590 "data_size": 63488 00:19:57.590 }, 00:19:57.590 { 00:19:57.590 "name": null, 00:19:57.590 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:57.590 "is_configured": false, 00:19:57.590 "data_offset": 2048, 00:19:57.590 "data_size": 63488 00:19:57.590 } 00:19:57.590 ] 00:19:57.590 }' 00:19:57.590 06:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:57.590 06:36:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:58.153 06:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:19:58.153 06:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:58.410 06:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:19:58.410 06:36:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 
-u 00000000-0000-0000-0000-000000000003 00:19:58.668 [2024-07-25 06:36:12.045725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:58.668 [2024-07-25 06:36:12.045773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:58.668 [2024-07-25 06:36:12.045792] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1932c70 00:19:58.668 [2024-07-25 06:36:12.045803] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:58.668 [2024-07-25 06:36:12.046114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:58.668 [2024-07-25 06:36:12.046132] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:58.668 [2024-07-25 06:36:12.046193] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:58.668 [2024-07-25 06:36:12.046211] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:58.668 [2024-07-25 06:36:12.046299] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1926480 00:19:58.668 [2024-07-25 06:36:12.046309] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:58.668 [2024-07-25 06:36:12.046458] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x177b6c0 00:19:58.668 [2024-07-25 06:36:12.046568] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1926480 00:19:58.668 [2024-07-25 06:36:12.046577] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1926480 00:19:58.668 [2024-07-25 06:36:12.046663] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.668 pt3 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.668 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.925 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.925 "name": "raid_bdev1", 00:19:58.925 "uuid": "63f29f0f-25c0-4646-b142-3639dbd2c9c0", 00:19:58.925 "strip_size_kb": 0, 00:19:58.925 "state": "online", 00:19:58.925 "raid_level": "raid1", 00:19:58.925 "superblock": true, 00:19:58.925 
"num_base_bdevs": 3, 00:19:58.925 "num_base_bdevs_discovered": 2, 00:19:58.925 "num_base_bdevs_operational": 2, 00:19:58.925 "base_bdevs_list": [ 00:19:58.925 { 00:19:58.925 "name": null, 00:19:58.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.925 "is_configured": false, 00:19:58.925 "data_offset": 2048, 00:19:58.925 "data_size": 63488 00:19:58.925 }, 00:19:58.925 { 00:19:58.925 "name": "pt2", 00:19:58.925 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:58.925 "is_configured": true, 00:19:58.925 "data_offset": 2048, 00:19:58.925 "data_size": 63488 00:19:58.925 }, 00:19:58.925 { 00:19:58.925 "name": "pt3", 00:19:58.925 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:58.925 "is_configured": true, 00:19:58.925 "data_offset": 2048, 00:19:58.925 "data_size": 63488 00:19:58.925 } 00:19:58.925 ] 00:19:58.925 }' 00:19:58.925 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.925 06:36:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:59.490 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:19:59.490 06:36:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:19:59.755 [2024-07-25 06:36:13.269189] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' 63f29f0f-25c0-4646-b142-3639dbd2c9c0 '!=' 63f29f0f-25c0-4646-b142-3639dbd2c9c0 ']' 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1160875 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1160875 ']' 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1160875 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:59.755 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1160875 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1160875' 00:20:00.040 killing process with pid 1160875 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1160875 00:20:00.040 [2024-07-25 06:36:13.348179] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:00.040 [2024-07-25 06:36:13.348246] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:00.040 [2024-07-25 06:36:13.348310] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:00.040 [2024-07-25 06:36:13.348326] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1926480 name raid_bdev1, state offline 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1160875 00:20:00.040 [2024-07-25 06:36:13.371758] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:20:00.040 00:20:00.040 real 0m20.610s 00:20:00.040 user 0m37.701s 00:20:00.040 sys 0m3.784s 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:00.040 06:36:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.040 ************************************ 00:20:00.040 END TEST raid_superblock_test 00:20:00.040 ************************************ 00:20:00.040 06:36:13 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:20:00.040 06:36:13 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:00.040 06:36:13 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:00.040 06:36:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:00.296 ************************************ 00:20:00.296 START TEST raid_read_error_test 00:20:00.296 ************************************ 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 read 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local 
create_arg 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.6JkZR5OClb 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1165501 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1165501 /var/tmp/spdk-raid.sock 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1165501 ']' 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:00.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:00.296 06:36:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.296 [2024-07-25 06:36:13.716757] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
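(For orientation: the raid_read_error_test phase that starts here builds its RAID volume the same way as the superblock test above, then injects read failures while bdevperf drives I/O. A rough sketch of that RPC sequence, using only commands that appear in this trace; the socket path and bdev names are copied from the log, the loop is a condensation of the traced per-bdev steps, and flags may differ between SPDK versions:

    # Sketch of the read-error test flow traced below (assumes a running SPDK
    # target listening on /var/tmp/spdk-raid.sock).
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for i in 1 2 3; do
      $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc       # 32 MiB backing malloc bdev, 512-byte blocks
      $RPC bdev_error_create BaseBdev${i}_malloc                  # error-injection wrapper, named EE_BaseBdev${i}_malloc
      $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done

    # Assemble the three passthru bdevs into a raid1 bdev with a superblock (-s).
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

    # Inject read failures on the first base bdev before running I/O.
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
)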
00:20:00.296 [2024-07-25 06:36:13.716816] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1165501 ] 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.296 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:00.296 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:00.297 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:00.297 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:00.297 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:00.553 [2024-07-25 06:36:13.853062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:00.553 [2024-07-25 06:36:13.896808] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:00.553 [2024-07-25 06:36:13.955253] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:00.553 [2024-07-25 06:36:13.955291] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:01.116 06:36:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:01.116 06:36:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:20:01.116 06:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:01.116 06:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:01.374 BaseBdev1_malloc 00:20:01.374 06:36:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:01.632 true 00:20:01.632 06:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:01.889 [2024-07-25 06:36:15.227516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:01.889 [2024-07-25 06:36:15.227559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.889 [2024-07-25 06:36:15.227576] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1557a60 00:20:01.889 [2024-07-25 06:36:15.227588] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.889 [2024-07-25 06:36:15.228969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.889 [2024-07-25 06:36:15.228996] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:01.889 BaseBdev1 00:20:01.889 06:36:15 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:01.889 06:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:02.146 BaseBdev2_malloc 00:20:02.146 06:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:02.146 true 00:20:02.403 06:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:02.403 [2024-07-25 06:36:15.917533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:02.403 [2024-07-25 06:36:15.917579] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.403 [2024-07-25 06:36:15.917598] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155cdc0 00:20:02.403 [2024-07-25 06:36:15.917608] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.403 [2024-07-25 06:36:15.918916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.403 [2024-07-25 06:36:15.918943] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:02.403 BaseBdev2 00:20:02.403 06:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:02.403 06:36:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:02.661 BaseBdev3_malloc 00:20:02.661 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:02.918 true 00:20:02.918 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:03.176 [2024-07-25 06:36:16.603663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:03.176 [2024-07-25 06:36:16.603711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:03.176 [2024-07-25 06:36:16.603729] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x155d420 00:20:03.176 [2024-07-25 06:36:16.603741] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:03.176 [2024-07-25 06:36:16.605069] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:03.176 [2024-07-25 06:36:16.605098] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:03.176 BaseBdev3 00:20:03.176 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:20:03.433 [2024-07-25 06:36:16.784397] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:03.433 [2024-07-25 06:36:16.785480] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:03.433 [2024-07-25 06:36:16.785542] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:03.433 [2024-07-25 06:36:16.785732] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15601d0 00:20:03.433 [2024-07-25 06:36:16.785742] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:03.433 [2024-07-25 06:36:16.785911] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13b33e0 00:20:03.433 [2024-07-25 06:36:16.786049] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15601d0 00:20:03.433 [2024-07-25 06:36:16.786058] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15601d0 00:20:03.433 [2024-07-25 06:36:16.786156] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.433 06:36:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.691 06:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.691 "name": "raid_bdev1", 00:20:03.691 "uuid": "93498610-b24c-4f8a-ba97-b48d8e5985c3", 00:20:03.691 "strip_size_kb": 0, 00:20:03.691 "state": "online", 00:20:03.691 "raid_level": "raid1", 00:20:03.691 "superblock": true, 00:20:03.691 "num_base_bdevs": 3, 00:20:03.691 "num_base_bdevs_discovered": 3, 00:20:03.691 "num_base_bdevs_operational": 3, 00:20:03.691 "base_bdevs_list": [ 00:20:03.691 { 00:20:03.691 "name": "BaseBdev1", 00:20:03.691 "uuid": "046a2b9b-0f2f-56e7-aa84-5f08e6ad8900", 00:20:03.691 "is_configured": true, 00:20:03.691 "data_offset": 2048, 00:20:03.691 "data_size": 63488 00:20:03.691 }, 00:20:03.691 { 00:20:03.691 "name": "BaseBdev2", 00:20:03.691 "uuid": "a9ba3d18-a9c7-506f-b1a6-6345251203fa", 00:20:03.691 "is_configured": true, 00:20:03.691 "data_offset": 2048, 00:20:03.691 "data_size": 63488 00:20:03.691 }, 00:20:03.691 { 00:20:03.691 "name": "BaseBdev3", 00:20:03.691 "uuid": "e9ec407f-4120-5907-a98e-2d7e126c91a1", 00:20:03.691 "is_configured": true, 00:20:03.691 "data_offset": 2048, 00:20:03.691 "data_size": 63488 
00:20:03.691 } 00:20:03.691 ] 00:20:03.691 }' 00:20:03.691 06:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.691 06:36:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:04.256 06:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:20:04.256 06:36:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:04.256 [2024-07-25 06:36:17.670729] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1565950 00:20:05.187 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.445 06:36:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.702 06:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.702 "name": "raid_bdev1", 00:20:05.702 "uuid": "93498610-b24c-4f8a-ba97-b48d8e5985c3", 00:20:05.702 "strip_size_kb": 0, 00:20:05.702 "state": "online", 00:20:05.702 "raid_level": "raid1", 00:20:05.702 "superblock": true, 00:20:05.702 "num_base_bdevs": 3, 00:20:05.702 "num_base_bdevs_discovered": 3, 00:20:05.702 "num_base_bdevs_operational": 3, 00:20:05.702 "base_bdevs_list": [ 00:20:05.702 { 00:20:05.702 "name": "BaseBdev1", 00:20:05.702 "uuid": "046a2b9b-0f2f-56e7-aa84-5f08e6ad8900", 00:20:05.702 "is_configured": true, 00:20:05.702 "data_offset": 2048, 00:20:05.702 "data_size": 63488 00:20:05.702 }, 00:20:05.702 { 00:20:05.702 "name": "BaseBdev2", 00:20:05.702 "uuid": 
"a9ba3d18-a9c7-506f-b1a6-6345251203fa", 00:20:05.702 "is_configured": true, 00:20:05.702 "data_offset": 2048, 00:20:05.702 "data_size": 63488 00:20:05.702 }, 00:20:05.702 { 00:20:05.702 "name": "BaseBdev3", 00:20:05.702 "uuid": "e9ec407f-4120-5907-a98e-2d7e126c91a1", 00:20:05.702 "is_configured": true, 00:20:05.702 "data_offset": 2048, 00:20:05.702 "data_size": 63488 00:20:05.702 } 00:20:05.702 ] 00:20:05.702 }' 00:20:05.702 06:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.702 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.267 06:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:06.267 [2024-07-25 06:36:19.794756] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:06.267 [2024-07-25 06:36:19.794785] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:06.267 [2024-07-25 06:36:19.797672] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:06.267 [2024-07-25 06:36:19.797705] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:06.267 [2024-07-25 06:36:19.797792] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:06.267 [2024-07-25 06:36:19.797808] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15601d0 name raid_bdev1, state offline 00:20:06.267 0 00:20:06.267 06:36:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1165501 00:20:06.267 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1165501 ']' 00:20:06.267 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1165501 00:20:06.267 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:20:06.267 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:06.525 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1165501 00:20:06.525 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:06.525 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:06.525 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1165501' 00:20:06.525 killing process with pid 1165501 00:20:06.525 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1165501 00:20:06.525 [2024-07-25 06:36:19.873567] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:06.525 06:36:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1165501 00:20:06.525 [2024-07-25 06:36:19.892333] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:06.525 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.6JkZR5OClb 00:20:06.525 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:06.525 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:06.782 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:20:06.782 06:36:20 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:20:06.782 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:06.782 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:06.783 06:36:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:06.783 00:20:06.783 real 0m6.448s 00:20:06.783 user 0m10.106s 00:20:06.783 sys 0m1.144s 00:20:06.783 06:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:06.783 06:36:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.783 ************************************ 00:20:06.783 END TEST raid_read_error_test 00:20:06.783 ************************************ 00:20:06.783 06:36:20 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:20:06.783 06:36:20 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:06.783 06:36:20 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:06.783 06:36:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:06.783 ************************************ 00:20:06.783 START TEST raid_write_error_test 00:20:06.783 ************************************ 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 write 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 
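(The bdevperf invocation and the failure-rate check that both error tests rely on appear piecemeal in the trace; assembled in one place the flow is roughly as follows. Paths and flags are copied from the log, while the output redirection and backgrounding are inferred from how the harness reads the log file afterwards:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    bdevperf_log=$(mktemp -p /raidtest)

    # Start bdevperf against the raid bdev and wait for RPC configuration (-z).
    $SPDK/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &> "$bdevperf_log" &

    # Once the raid bdev is configured and the error is injected, run the
    # workload and pull the per-second failure count for raid_bdev1 out of the log.
    $SPDK/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
)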
00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.RODiBVk0YH 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1166702 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1166702 /var/tmp/spdk-raid.sock 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1166702 ']' 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:06.783 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:06.783 06:36:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.783 [2024-07-25 06:36:20.240937] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:20:06.783 [2024-07-25 06:36:20.240993] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1166702 ] 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:06.783 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:06.783 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:06.783 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:07.041 [2024-07-25 06:36:20.376547] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:07.041 [2024-07-25 06:36:20.420690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:07.041 [2024-07-25 06:36:20.485402] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:07.041 [2024-07-25 06:36:20.485440] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:07.606 06:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:07.606 06:36:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:20:07.606 06:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:07.606 06:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:07.863 BaseBdev1_malloc 00:20:07.863 06:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:08.122 true 00:20:08.122 06:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:08.379 [2024-07-25 06:36:21.804939] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:08.379 [2024-07-25 06:36:21.804980] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:08.379 [2024-07-25 06:36:21.805005] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x260ba60 00:20:08.379 [2024-07-25 06:36:21.805017] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:08.379 [2024-07-25 06:36:21.806486] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:08.379 [2024-07-25 06:36:21.806514] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:08.379 BaseBdev1 00:20:08.379 06:36:21 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:08.379 06:36:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:08.636 BaseBdev2_malloc 00:20:08.636 06:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:08.893 true 00:20:08.893 06:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:09.149 [2024-07-25 06:36:22.471199] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:09.149 [2024-07-25 06:36:22.471237] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.149 [2024-07-25 06:36:22.471258] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2610dc0 00:20:09.149 [2024-07-25 06:36:22.471269] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.149 [2024-07-25 06:36:22.472591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.149 [2024-07-25 06:36:22.472618] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:09.149 BaseBdev2 00:20:09.149 06:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:09.149 06:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:09.149 BaseBdev3_malloc 00:20:09.405 06:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:09.405 true 00:20:09.405 06:36:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:09.662 [2024-07-25 06:36:23.153360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:09.662 [2024-07-25 06:36:23.153400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.662 [2024-07-25 06:36:23.153419] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2611420 00:20:09.662 [2024-07-25 06:36:23.153431] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.662 [2024-07-25 06:36:23.154810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.662 [2024-07-25 06:36:23.154838] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:09.662 BaseBdev3 00:20:09.662 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:20:09.919 [2024-07-25 06:36:23.377973] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:09.919 [2024-07-25 06:36:23.379132] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:09.919 [2024-07-25 06:36:23.379202] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:09.919 [2024-07-25 06:36:23.379393] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26141d0 00:20:09.919 [2024-07-25 06:36:23.379404] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:09.919 [2024-07-25 06:36:23.379575] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24673e0 00:20:09.919 [2024-07-25 06:36:23.379721] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26141d0 00:20:09.919 [2024-07-25 06:36:23.379731] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26141d0 00:20:09.919 [2024-07-25 06:36:23.379826] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.919 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.176 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.176 "name": "raid_bdev1", 00:20:10.176 "uuid": "36b781ca-8fe6-478d-af11-c1e5b85e9ca7", 00:20:10.176 "strip_size_kb": 0, 00:20:10.176 "state": "online", 00:20:10.176 "raid_level": "raid1", 00:20:10.176 "superblock": true, 00:20:10.176 "num_base_bdevs": 3, 00:20:10.176 "num_base_bdevs_discovered": 3, 00:20:10.176 "num_base_bdevs_operational": 3, 00:20:10.176 "base_bdevs_list": [ 00:20:10.176 { 00:20:10.176 "name": "BaseBdev1", 00:20:10.176 "uuid": "1b6064eb-aa93-542c-a065-32ea4ee4b272", 00:20:10.176 "is_configured": true, 00:20:10.176 "data_offset": 2048, 00:20:10.176 "data_size": 63488 00:20:10.176 }, 00:20:10.176 { 00:20:10.176 "name": "BaseBdev2", 00:20:10.176 "uuid": "1989c6c3-4aa2-56a1-9573-b5f07365dc04", 00:20:10.176 "is_configured": true, 00:20:10.176 "data_offset": 2048, 00:20:10.176 "data_size": 63488 00:20:10.176 }, 00:20:10.176 { 00:20:10.176 "name": "BaseBdev3", 00:20:10.176 "uuid": "95e02fb5-a058-581a-85dd-4ce6e298a7ff", 00:20:10.176 "is_configured": true, 00:20:10.176 "data_offset": 2048, 00:20:10.176 "data_size": 
63488 00:20:10.176 } 00:20:10.176 ] 00:20:10.176 }' 00:20:10.176 06:36:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.176 06:36:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.739 06:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:20:10.739 06:36:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:10.739 [2024-07-25 06:36:24.288611] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2619950 00:20:11.670 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:11.928 [2024-07-25 06:36:25.403105] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:20:11.928 [2024-07-25 06:36:25.403160] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:11.928 [2024-07-25 06:36:25.403352] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2619950 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=2 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.928 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.185 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.185 "name": "raid_bdev1", 00:20:12.185 "uuid": "36b781ca-8fe6-478d-af11-c1e5b85e9ca7", 00:20:12.185 "strip_size_kb": 0, 00:20:12.185 "state": "online", 00:20:12.185 "raid_level": "raid1", 00:20:12.185 "superblock": true, 00:20:12.185 "num_base_bdevs": 3, 
00:20:12.185 "num_base_bdevs_discovered": 2, 00:20:12.185 "num_base_bdevs_operational": 2, 00:20:12.185 "base_bdevs_list": [ 00:20:12.185 { 00:20:12.185 "name": null, 00:20:12.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.185 "is_configured": false, 00:20:12.185 "data_offset": 2048, 00:20:12.185 "data_size": 63488 00:20:12.185 }, 00:20:12.185 { 00:20:12.185 "name": "BaseBdev2", 00:20:12.185 "uuid": "1989c6c3-4aa2-56a1-9573-b5f07365dc04", 00:20:12.185 "is_configured": true, 00:20:12.185 "data_offset": 2048, 00:20:12.185 "data_size": 63488 00:20:12.185 }, 00:20:12.185 { 00:20:12.185 "name": "BaseBdev3", 00:20:12.185 "uuid": "95e02fb5-a058-581a-85dd-4ce6e298a7ff", 00:20:12.185 "is_configured": true, 00:20:12.185 "data_offset": 2048, 00:20:12.185 "data_size": 63488 00:20:12.185 } 00:20:12.185 ] 00:20:12.185 }' 00:20:12.185 06:36:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.185 06:36:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:12.750 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:13.008 [2024-07-25 06:36:26.444937] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:13.008 [2024-07-25 06:36:26.444969] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:13.008 [2024-07-25 06:36:26.447860] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.008 [2024-07-25 06:36:26.447886] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:13.008 [2024-07-25 06:36:26.447951] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:13.008 [2024-07-25 06:36:26.447961] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26141d0 name raid_bdev1, state offline 00:20:13.008 0 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1166702 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1166702 ']' 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1166702 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1166702 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1166702' 00:20:13.008 killing process with pid 1166702 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1166702 00:20:13.008 [2024-07-25 06:36:26.521181] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:13.008 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1166702 00:20:13.008 [2024-07-25 06:36:26.539259] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:13.266 06:36:26 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.RODiBVk0YH 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:13.266 00:20:13.266 real 0m6.564s 00:20:13.266 user 0m10.304s 00:20:13.266 sys 0m1.218s 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:13.266 06:36:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.266 ************************************ 00:20:13.266 END TEST raid_write_error_test 00:20:13.266 ************************************ 00:20:13.266 06:36:26 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:20:13.266 06:36:26 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:20:13.266 06:36:26 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:20:13.266 06:36:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:13.266 06:36:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:13.266 06:36:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:13.525 ************************************ 00:20:13.525 START TEST raid_state_function_test 00:20:13.525 ************************************ 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 false 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
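For reference, a condensed sketch of the RPC sequence the raid_write_error_test case above drives against the bdevperf app. This is an illustrative reconstruction from the trace, not part of the captured output: rpc.py abbreviates the full /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py path, and the malloc/error/passthru steps repeat for BaseBdev2 and BaseBdev3.

  # build one error-injectable base bdev (repeated per base bdev)
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # assemble the raid1 volume with a superblock (-s), then inject a write failure into slot 0
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
  # confirm the failed base bdev was removed (num_base_bdevs_discovered drops to 2), then tear down
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1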
00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1167863 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1167863' 00:20:13.525 Process raid pid: 1167863 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1167863 /var/tmp/spdk-raid.sock 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1167863 ']' 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:13.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:13.525 06:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.525 [2024-07-25 06:36:26.889081] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
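The raid_state_function_test case starting here follows the same pattern against a bare bdev_svc app: Existed_Raid is created before any of its base bdevs exist, so it stays in the "configuring" state, and each verify_raid_bdev_state call below reads that state back over RPC. An illustrative sketch of that check (again with rpc.py abbreviating the full scripts/rpc.py path):

  # raid0 across four base bdevs that do not exist yet
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # the returned JSON's "state" field reads "configuring" until all four base bdevs are added, then "online"
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'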
00:20:13.525 [2024-07-25 06:36:26.889128] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:13.525 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:13.525 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:13.525 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:13.525 [2024-07-25 06:36:27.013748] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:13.525 [2024-07-25 06:36:27.057031] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:13.783 [2024-07-25 06:36:27.115591] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:13.783 [2024-07-25 06:36:27.115620] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:14.389 06:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:14.389 06:36:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:20:14.389 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:14.389 [2024-07-25 06:36:27.880711] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:14.389 [2024-07-25 06:36:27.880750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:14.389 [2024-07-25 06:36:27.880760] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:14.389 [2024-07-25 06:36:27.880771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:14.389 [2024-07-25 06:36:27.880779] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:14.389 [2024-07-25 06:36:27.880789] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:14.389 [2024-07-25 06:36:27.880797] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:14.389 [2024-07-25 06:36:27.880807] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:14.389 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:14.389 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.389 06:36:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.390 06:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.647 06:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.647 "name": "Existed_Raid", 00:20:14.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.647 "strip_size_kb": 64, 00:20:14.647 "state": "configuring", 00:20:14.647 "raid_level": "raid0", 00:20:14.647 "superblock": false, 00:20:14.647 "num_base_bdevs": 4, 00:20:14.647 "num_base_bdevs_discovered": 0, 00:20:14.647 "num_base_bdevs_operational": 4, 00:20:14.647 "base_bdevs_list": [ 00:20:14.647 { 00:20:14.647 "name": "BaseBdev1", 00:20:14.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.647 "is_configured": false, 00:20:14.647 "data_offset": 0, 00:20:14.647 "data_size": 0 00:20:14.647 }, 00:20:14.647 { 00:20:14.647 "name": "BaseBdev2", 00:20:14.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.647 "is_configured": false, 00:20:14.647 "data_offset": 0, 00:20:14.647 "data_size": 0 00:20:14.647 }, 00:20:14.647 { 00:20:14.647 "name": "BaseBdev3", 00:20:14.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.647 "is_configured": false, 00:20:14.647 "data_offset": 0, 00:20:14.647 "data_size": 0 00:20:14.647 }, 00:20:14.647 { 00:20:14.647 "name": "BaseBdev4", 00:20:14.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.647 "is_configured": false, 00:20:14.647 "data_offset": 0, 00:20:14.647 "data_size": 0 00:20:14.647 } 00:20:14.647 ] 00:20:14.647 }' 00:20:14.647 06:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.647 06:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:15.212 06:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:15.470 [2024-07-25 06:36:28.907295] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:15.470 [2024-07-25 06:36:28.907322] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x117a470 name Existed_Raid, state configuring 00:20:15.470 06:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:15.728 
[2024-07-25 06:36:29.131904] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:15.728 [2024-07-25 06:36:29.131938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:15.728 [2024-07-25 06:36:29.131948] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:15.728 [2024-07-25 06:36:29.131959] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:15.728 [2024-07-25 06:36:29.131967] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:15.728 [2024-07-25 06:36:29.131977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:15.728 [2024-07-25 06:36:29.131985] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:15.728 [2024-07-25 06:36:29.132007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:15.728 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:15.986 [2024-07-25 06:36:29.362206] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:15.986 BaseBdev1 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:15.986 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:16.244 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:16.502 [ 00:20:16.502 { 00:20:16.502 "name": "BaseBdev1", 00:20:16.502 "aliases": [ 00:20:16.502 "96f43f26-e8fa-410c-9edb-b03cfc206dd7" 00:20:16.502 ], 00:20:16.502 "product_name": "Malloc disk", 00:20:16.502 "block_size": 512, 00:20:16.502 "num_blocks": 65536, 00:20:16.502 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:16.502 "assigned_rate_limits": { 00:20:16.502 "rw_ios_per_sec": 0, 00:20:16.502 "rw_mbytes_per_sec": 0, 00:20:16.502 "r_mbytes_per_sec": 0, 00:20:16.502 "w_mbytes_per_sec": 0 00:20:16.502 }, 00:20:16.502 "claimed": true, 00:20:16.502 "claim_type": "exclusive_write", 00:20:16.502 "zoned": false, 00:20:16.502 "supported_io_types": { 00:20:16.502 "read": true, 00:20:16.502 "write": true, 00:20:16.502 "unmap": true, 00:20:16.502 "flush": true, 00:20:16.502 "reset": true, 00:20:16.502 "nvme_admin": false, 00:20:16.502 "nvme_io": false, 00:20:16.502 "nvme_io_md": false, 00:20:16.502 "write_zeroes": true, 00:20:16.502 "zcopy": true, 00:20:16.502 "get_zone_info": false, 00:20:16.502 "zone_management": false, 00:20:16.502 
"zone_append": false, 00:20:16.502 "compare": false, 00:20:16.502 "compare_and_write": false, 00:20:16.502 "abort": true, 00:20:16.502 "seek_hole": false, 00:20:16.502 "seek_data": false, 00:20:16.502 "copy": true, 00:20:16.502 "nvme_iov_md": false 00:20:16.502 }, 00:20:16.502 "memory_domains": [ 00:20:16.502 { 00:20:16.502 "dma_device_id": "system", 00:20:16.502 "dma_device_type": 1 00:20:16.502 }, 00:20:16.502 { 00:20:16.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.502 "dma_device_type": 2 00:20:16.502 } 00:20:16.502 ], 00:20:16.502 "driver_specific": {} 00:20:16.502 } 00:20:16.502 ] 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.502 06:36:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.760 06:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.760 "name": "Existed_Raid", 00:20:16.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.760 "strip_size_kb": 64, 00:20:16.760 "state": "configuring", 00:20:16.760 "raid_level": "raid0", 00:20:16.760 "superblock": false, 00:20:16.760 "num_base_bdevs": 4, 00:20:16.760 "num_base_bdevs_discovered": 1, 00:20:16.760 "num_base_bdevs_operational": 4, 00:20:16.760 "base_bdevs_list": [ 00:20:16.760 { 00:20:16.760 "name": "BaseBdev1", 00:20:16.760 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:16.760 "is_configured": true, 00:20:16.760 "data_offset": 0, 00:20:16.760 "data_size": 65536 00:20:16.760 }, 00:20:16.760 { 00:20:16.760 "name": "BaseBdev2", 00:20:16.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.760 "is_configured": false, 00:20:16.760 "data_offset": 0, 00:20:16.760 "data_size": 0 00:20:16.760 }, 00:20:16.760 { 00:20:16.760 "name": "BaseBdev3", 00:20:16.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.760 "is_configured": false, 00:20:16.760 "data_offset": 0, 00:20:16.760 "data_size": 0 00:20:16.760 }, 00:20:16.760 { 00:20:16.760 "name": "BaseBdev4", 00:20:16.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.760 "is_configured": false, 00:20:16.760 "data_offset": 0, 
00:20:16.760 "data_size": 0 00:20:16.760 } 00:20:16.760 ] 00:20:16.760 }' 00:20:16.760 06:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.760 06:36:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.325 06:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:17.325 [2024-07-25 06:36:30.842088] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:17.325 [2024-07-25 06:36:30.842130] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1179ce0 name Existed_Raid, state configuring 00:20:17.325 06:36:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:17.583 [2024-07-25 06:36:31.070732] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:17.583 [2024-07-25 06:36:31.072110] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:17.583 [2024-07-25 06:36:31.072151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:17.583 [2024-07-25 06:36:31.072161] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:17.583 [2024-07-25 06:36:31.072172] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:17.583 [2024-07-25 06:36:31.072180] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:17.583 [2024-07-25 06:36:31.072190] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:17.583 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.840 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.840 "name": "Existed_Raid", 00:20:17.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.840 "strip_size_kb": 64, 00:20:17.840 "state": "configuring", 00:20:17.840 "raid_level": "raid0", 00:20:17.840 "superblock": false, 00:20:17.840 "num_base_bdevs": 4, 00:20:17.840 "num_base_bdevs_discovered": 1, 00:20:17.840 "num_base_bdevs_operational": 4, 00:20:17.840 "base_bdevs_list": [ 00:20:17.840 { 00:20:17.840 "name": "BaseBdev1", 00:20:17.840 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:17.840 "is_configured": true, 00:20:17.840 "data_offset": 0, 00:20:17.840 "data_size": 65536 00:20:17.840 }, 00:20:17.840 { 00:20:17.840 "name": "BaseBdev2", 00:20:17.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.840 "is_configured": false, 00:20:17.840 "data_offset": 0, 00:20:17.840 "data_size": 0 00:20:17.840 }, 00:20:17.840 { 00:20:17.840 "name": "BaseBdev3", 00:20:17.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.840 "is_configured": false, 00:20:17.840 "data_offset": 0, 00:20:17.840 "data_size": 0 00:20:17.840 }, 00:20:17.840 { 00:20:17.840 "name": "BaseBdev4", 00:20:17.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.840 "is_configured": false, 00:20:17.841 "data_offset": 0, 00:20:17.841 "data_size": 0 00:20:17.841 } 00:20:17.841 ] 00:20:17.841 }' 00:20:17.841 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.841 06:36:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:18.405 06:36:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:18.663 [2024-07-25 06:36:32.104548] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:18.663 BaseBdev2 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:18.663 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:18.921 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:19.178 [ 00:20:19.178 { 00:20:19.178 "name": "BaseBdev2", 00:20:19.178 "aliases": [ 00:20:19.178 "b63b48fa-4351-4cfe-8e92-a8e31da49aeb" 00:20:19.178 ], 00:20:19.178 "product_name": "Malloc disk", 00:20:19.179 "block_size": 512, 00:20:19.179 "num_blocks": 65536, 00:20:19.179 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:19.179 
"assigned_rate_limits": { 00:20:19.179 "rw_ios_per_sec": 0, 00:20:19.179 "rw_mbytes_per_sec": 0, 00:20:19.179 "r_mbytes_per_sec": 0, 00:20:19.179 "w_mbytes_per_sec": 0 00:20:19.179 }, 00:20:19.179 "claimed": true, 00:20:19.179 "claim_type": "exclusive_write", 00:20:19.179 "zoned": false, 00:20:19.179 "supported_io_types": { 00:20:19.179 "read": true, 00:20:19.179 "write": true, 00:20:19.179 "unmap": true, 00:20:19.179 "flush": true, 00:20:19.179 "reset": true, 00:20:19.179 "nvme_admin": false, 00:20:19.179 "nvme_io": false, 00:20:19.179 "nvme_io_md": false, 00:20:19.179 "write_zeroes": true, 00:20:19.179 "zcopy": true, 00:20:19.179 "get_zone_info": false, 00:20:19.179 "zone_management": false, 00:20:19.179 "zone_append": false, 00:20:19.179 "compare": false, 00:20:19.179 "compare_and_write": false, 00:20:19.179 "abort": true, 00:20:19.179 "seek_hole": false, 00:20:19.179 "seek_data": false, 00:20:19.179 "copy": true, 00:20:19.179 "nvme_iov_md": false 00:20:19.179 }, 00:20:19.179 "memory_domains": [ 00:20:19.179 { 00:20:19.179 "dma_device_id": "system", 00:20:19.179 "dma_device_type": 1 00:20:19.179 }, 00:20:19.179 { 00:20:19.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.179 "dma_device_type": 2 00:20:19.179 } 00:20:19.179 ], 00:20:19.179 "driver_specific": {} 00:20:19.179 } 00:20:19.179 ] 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.179 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:19.436 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.436 "name": "Existed_Raid", 00:20:19.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.436 "strip_size_kb": 64, 00:20:19.436 "state": "configuring", 00:20:19.436 "raid_level": "raid0", 00:20:19.436 "superblock": false, 00:20:19.436 "num_base_bdevs": 4, 00:20:19.436 "num_base_bdevs_discovered": 2, 
00:20:19.436 "num_base_bdevs_operational": 4, 00:20:19.436 "base_bdevs_list": [ 00:20:19.436 { 00:20:19.436 "name": "BaseBdev1", 00:20:19.436 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:19.436 "is_configured": true, 00:20:19.436 "data_offset": 0, 00:20:19.436 "data_size": 65536 00:20:19.436 }, 00:20:19.436 { 00:20:19.436 "name": "BaseBdev2", 00:20:19.436 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:19.436 "is_configured": true, 00:20:19.436 "data_offset": 0, 00:20:19.436 "data_size": 65536 00:20:19.436 }, 00:20:19.436 { 00:20:19.436 "name": "BaseBdev3", 00:20:19.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.436 "is_configured": false, 00:20:19.436 "data_offset": 0, 00:20:19.436 "data_size": 0 00:20:19.436 }, 00:20:19.436 { 00:20:19.436 "name": "BaseBdev4", 00:20:19.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.436 "is_configured": false, 00:20:19.436 "data_offset": 0, 00:20:19.436 "data_size": 0 00:20:19.436 } 00:20:19.436 ] 00:20:19.436 }' 00:20:19.436 06:36:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.436 06:36:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.003 06:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:20.263 [2024-07-25 06:36:33.583670] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:20.263 BaseBdev3 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:20.263 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:20.520 06:36:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:20.520 [ 00:20:20.520 { 00:20:20.520 "name": "BaseBdev3", 00:20:20.520 "aliases": [ 00:20:20.520 "84552c58-df03-409f-a5a8-eeb67af0d880" 00:20:20.520 ], 00:20:20.520 "product_name": "Malloc disk", 00:20:20.520 "block_size": 512, 00:20:20.520 "num_blocks": 65536, 00:20:20.520 "uuid": "84552c58-df03-409f-a5a8-eeb67af0d880", 00:20:20.520 "assigned_rate_limits": { 00:20:20.520 "rw_ios_per_sec": 0, 00:20:20.520 "rw_mbytes_per_sec": 0, 00:20:20.520 "r_mbytes_per_sec": 0, 00:20:20.520 "w_mbytes_per_sec": 0 00:20:20.520 }, 00:20:20.520 "claimed": true, 00:20:20.520 "claim_type": "exclusive_write", 00:20:20.520 "zoned": false, 00:20:20.520 "supported_io_types": { 00:20:20.520 "read": true, 00:20:20.520 "write": true, 00:20:20.520 "unmap": true, 00:20:20.520 "flush": true, 00:20:20.520 "reset": true, 00:20:20.520 "nvme_admin": false, 00:20:20.520 "nvme_io": false, 00:20:20.520 
"nvme_io_md": false, 00:20:20.520 "write_zeroes": true, 00:20:20.520 "zcopy": true, 00:20:20.520 "get_zone_info": false, 00:20:20.520 "zone_management": false, 00:20:20.520 "zone_append": false, 00:20:20.520 "compare": false, 00:20:20.520 "compare_and_write": false, 00:20:20.520 "abort": true, 00:20:20.520 "seek_hole": false, 00:20:20.520 "seek_data": false, 00:20:20.520 "copy": true, 00:20:20.520 "nvme_iov_md": false 00:20:20.520 }, 00:20:20.520 "memory_domains": [ 00:20:20.520 { 00:20:20.520 "dma_device_id": "system", 00:20:20.520 "dma_device_type": 1 00:20:20.520 }, 00:20:20.520 { 00:20:20.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.520 "dma_device_type": 2 00:20:20.520 } 00:20:20.520 ], 00:20:20.521 "driver_specific": {} 00:20:20.521 } 00:20:20.521 ] 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.521 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.777 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.777 "name": "Existed_Raid", 00:20:20.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.777 "strip_size_kb": 64, 00:20:20.777 "state": "configuring", 00:20:20.777 "raid_level": "raid0", 00:20:20.777 "superblock": false, 00:20:20.777 "num_base_bdevs": 4, 00:20:20.777 "num_base_bdevs_discovered": 3, 00:20:20.777 "num_base_bdevs_operational": 4, 00:20:20.777 "base_bdevs_list": [ 00:20:20.777 { 00:20:20.777 "name": "BaseBdev1", 00:20:20.777 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:20.777 "is_configured": true, 00:20:20.777 "data_offset": 0, 00:20:20.777 "data_size": 65536 00:20:20.777 }, 00:20:20.777 { 00:20:20.777 "name": "BaseBdev2", 00:20:20.777 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:20.777 "is_configured": true, 00:20:20.777 "data_offset": 0, 00:20:20.777 "data_size": 65536 00:20:20.777 }, 00:20:20.777 { 
00:20:20.777 "name": "BaseBdev3", 00:20:20.777 "uuid": "84552c58-df03-409f-a5a8-eeb67af0d880", 00:20:20.777 "is_configured": true, 00:20:20.777 "data_offset": 0, 00:20:20.777 "data_size": 65536 00:20:20.777 }, 00:20:20.777 { 00:20:20.777 "name": "BaseBdev4", 00:20:20.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.777 "is_configured": false, 00:20:20.777 "data_offset": 0, 00:20:20.777 "data_size": 0 00:20:20.777 } 00:20:20.777 ] 00:20:20.777 }' 00:20:20.777 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.777 06:36:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.341 06:36:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:21.598 [2024-07-25 06:36:35.086851] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:21.598 [2024-07-25 06:36:35.086887] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x132d250 00:20:21.598 [2024-07-25 06:36:35.086895] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:21.598 [2024-07-25 06:36:35.087072] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1321cc0 00:20:21.598 [2024-07-25 06:36:35.087199] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x132d250 00:20:21.598 [2024-07-25 06:36:35.087209] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x132d250 00:20:21.598 [2024-07-25 06:36:35.087361] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:21.598 BaseBdev4 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:21.598 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:21.855 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:22.113 [ 00:20:22.113 { 00:20:22.113 "name": "BaseBdev4", 00:20:22.113 "aliases": [ 00:20:22.113 "758a90e8-d6a0-43d7-a197-b15301d00228" 00:20:22.113 ], 00:20:22.113 "product_name": "Malloc disk", 00:20:22.113 "block_size": 512, 00:20:22.113 "num_blocks": 65536, 00:20:22.113 "uuid": "758a90e8-d6a0-43d7-a197-b15301d00228", 00:20:22.113 "assigned_rate_limits": { 00:20:22.113 "rw_ios_per_sec": 0, 00:20:22.113 "rw_mbytes_per_sec": 0, 00:20:22.113 "r_mbytes_per_sec": 0, 00:20:22.113 "w_mbytes_per_sec": 0 00:20:22.113 }, 00:20:22.113 "claimed": true, 00:20:22.113 "claim_type": "exclusive_write", 00:20:22.113 "zoned": false, 00:20:22.113 "supported_io_types": { 
00:20:22.113 "read": true, 00:20:22.113 "write": true, 00:20:22.113 "unmap": true, 00:20:22.113 "flush": true, 00:20:22.113 "reset": true, 00:20:22.113 "nvme_admin": false, 00:20:22.113 "nvme_io": false, 00:20:22.113 "nvme_io_md": false, 00:20:22.113 "write_zeroes": true, 00:20:22.113 "zcopy": true, 00:20:22.113 "get_zone_info": false, 00:20:22.113 "zone_management": false, 00:20:22.113 "zone_append": false, 00:20:22.113 "compare": false, 00:20:22.113 "compare_and_write": false, 00:20:22.113 "abort": true, 00:20:22.113 "seek_hole": false, 00:20:22.113 "seek_data": false, 00:20:22.113 "copy": true, 00:20:22.113 "nvme_iov_md": false 00:20:22.113 }, 00:20:22.113 "memory_domains": [ 00:20:22.113 { 00:20:22.113 "dma_device_id": "system", 00:20:22.113 "dma_device_type": 1 00:20:22.113 }, 00:20:22.113 { 00:20:22.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.113 "dma_device_type": 2 00:20:22.113 } 00:20:22.113 ], 00:20:22.113 "driver_specific": {} 00:20:22.113 } 00:20:22.113 ] 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.113 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.370 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.370 "name": "Existed_Raid", 00:20:22.370 "uuid": "585cc9b6-1e9e-4167-a64b-75f5191010a8", 00:20:22.370 "strip_size_kb": 64, 00:20:22.370 "state": "online", 00:20:22.370 "raid_level": "raid0", 00:20:22.370 "superblock": false, 00:20:22.370 "num_base_bdevs": 4, 00:20:22.370 "num_base_bdevs_discovered": 4, 00:20:22.370 "num_base_bdevs_operational": 4, 00:20:22.370 "base_bdevs_list": [ 00:20:22.370 { 00:20:22.370 "name": "BaseBdev1", 00:20:22.370 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:22.370 "is_configured": true, 00:20:22.370 "data_offset": 0, 00:20:22.370 "data_size": 65536 00:20:22.370 }, 00:20:22.370 { 00:20:22.370 "name": 
"BaseBdev2", 00:20:22.370 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:22.370 "is_configured": true, 00:20:22.370 "data_offset": 0, 00:20:22.370 "data_size": 65536 00:20:22.370 }, 00:20:22.370 { 00:20:22.370 "name": "BaseBdev3", 00:20:22.370 "uuid": "84552c58-df03-409f-a5a8-eeb67af0d880", 00:20:22.370 "is_configured": true, 00:20:22.370 "data_offset": 0, 00:20:22.370 "data_size": 65536 00:20:22.370 }, 00:20:22.370 { 00:20:22.370 "name": "BaseBdev4", 00:20:22.371 "uuid": "758a90e8-d6a0-43d7-a197-b15301d00228", 00:20:22.371 "is_configured": true, 00:20:22.371 "data_offset": 0, 00:20:22.371 "data_size": 65536 00:20:22.371 } 00:20:22.371 ] 00:20:22.371 }' 00:20:22.371 06:36:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.371 06:36:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:22.935 [2024-07-25 06:36:36.442743] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:22.935 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:22.935 "name": "Existed_Raid", 00:20:22.935 "aliases": [ 00:20:22.935 "585cc9b6-1e9e-4167-a64b-75f5191010a8" 00:20:22.935 ], 00:20:22.935 "product_name": "Raid Volume", 00:20:22.935 "block_size": 512, 00:20:22.935 "num_blocks": 262144, 00:20:22.935 "uuid": "585cc9b6-1e9e-4167-a64b-75f5191010a8", 00:20:22.935 "assigned_rate_limits": { 00:20:22.935 "rw_ios_per_sec": 0, 00:20:22.935 "rw_mbytes_per_sec": 0, 00:20:22.935 "r_mbytes_per_sec": 0, 00:20:22.935 "w_mbytes_per_sec": 0 00:20:22.935 }, 00:20:22.935 "claimed": false, 00:20:22.935 "zoned": false, 00:20:22.935 "supported_io_types": { 00:20:22.935 "read": true, 00:20:22.935 "write": true, 00:20:22.935 "unmap": true, 00:20:22.935 "flush": true, 00:20:22.935 "reset": true, 00:20:22.935 "nvme_admin": false, 00:20:22.935 "nvme_io": false, 00:20:22.935 "nvme_io_md": false, 00:20:22.935 "write_zeroes": true, 00:20:22.935 "zcopy": false, 00:20:22.935 "get_zone_info": false, 00:20:22.935 "zone_management": false, 00:20:22.935 "zone_append": false, 00:20:22.935 "compare": false, 00:20:22.935 "compare_and_write": false, 00:20:22.935 "abort": false, 00:20:22.935 "seek_hole": false, 00:20:22.935 "seek_data": false, 00:20:22.935 "copy": false, 00:20:22.935 "nvme_iov_md": false 00:20:22.935 }, 00:20:22.935 "memory_domains": [ 00:20:22.935 { 00:20:22.935 "dma_device_id": "system", 00:20:22.935 "dma_device_type": 1 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.935 
"dma_device_type": 2 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "system", 00:20:22.935 "dma_device_type": 1 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.935 "dma_device_type": 2 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "system", 00:20:22.935 "dma_device_type": 1 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.935 "dma_device_type": 2 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "system", 00:20:22.935 "dma_device_type": 1 00:20:22.935 }, 00:20:22.935 { 00:20:22.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.935 "dma_device_type": 2 00:20:22.935 } 00:20:22.935 ], 00:20:22.935 "driver_specific": { 00:20:22.935 "raid": { 00:20:22.935 "uuid": "585cc9b6-1e9e-4167-a64b-75f5191010a8", 00:20:22.935 "strip_size_kb": 64, 00:20:22.935 "state": "online", 00:20:22.935 "raid_level": "raid0", 00:20:22.936 "superblock": false, 00:20:22.936 "num_base_bdevs": 4, 00:20:22.936 "num_base_bdevs_discovered": 4, 00:20:22.936 "num_base_bdevs_operational": 4, 00:20:22.936 "base_bdevs_list": [ 00:20:22.936 { 00:20:22.936 "name": "BaseBdev1", 00:20:22.936 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:22.936 "is_configured": true, 00:20:22.936 "data_offset": 0, 00:20:22.936 "data_size": 65536 00:20:22.936 }, 00:20:22.936 { 00:20:22.936 "name": "BaseBdev2", 00:20:22.936 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:22.936 "is_configured": true, 00:20:22.936 "data_offset": 0, 00:20:22.936 "data_size": 65536 00:20:22.936 }, 00:20:22.936 { 00:20:22.936 "name": "BaseBdev3", 00:20:22.936 "uuid": "84552c58-df03-409f-a5a8-eeb67af0d880", 00:20:22.936 "is_configured": true, 00:20:22.936 "data_offset": 0, 00:20:22.936 "data_size": 65536 00:20:22.936 }, 00:20:22.936 { 00:20:22.936 "name": "BaseBdev4", 00:20:22.936 "uuid": "758a90e8-d6a0-43d7-a197-b15301d00228", 00:20:22.936 "is_configured": true, 00:20:22.936 "data_offset": 0, 00:20:22.936 "data_size": 65536 00:20:22.936 } 00:20:22.936 ] 00:20:22.936 } 00:20:22.936 } 00:20:22.936 }' 00:20:22.936 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:23.194 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:23.194 BaseBdev2 00:20:23.194 BaseBdev3 00:20:23.194 BaseBdev4' 00:20:23.194 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.194 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:23.194 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.194 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.194 "name": "BaseBdev1", 00:20:23.194 "aliases": [ 00:20:23.194 "96f43f26-e8fa-410c-9edb-b03cfc206dd7" 00:20:23.194 ], 00:20:23.194 "product_name": "Malloc disk", 00:20:23.194 "block_size": 512, 00:20:23.194 "num_blocks": 65536, 00:20:23.194 "uuid": "96f43f26-e8fa-410c-9edb-b03cfc206dd7", 00:20:23.194 "assigned_rate_limits": { 00:20:23.194 "rw_ios_per_sec": 0, 00:20:23.194 "rw_mbytes_per_sec": 0, 00:20:23.194 "r_mbytes_per_sec": 0, 00:20:23.194 "w_mbytes_per_sec": 0 00:20:23.194 }, 00:20:23.194 "claimed": true, 00:20:23.194 "claim_type": "exclusive_write", 
00:20:23.194 "zoned": false, 00:20:23.194 "supported_io_types": { 00:20:23.194 "read": true, 00:20:23.194 "write": true, 00:20:23.194 "unmap": true, 00:20:23.194 "flush": true, 00:20:23.194 "reset": true, 00:20:23.194 "nvme_admin": false, 00:20:23.194 "nvme_io": false, 00:20:23.194 "nvme_io_md": false, 00:20:23.194 "write_zeroes": true, 00:20:23.194 "zcopy": true, 00:20:23.194 "get_zone_info": false, 00:20:23.194 "zone_management": false, 00:20:23.194 "zone_append": false, 00:20:23.194 "compare": false, 00:20:23.194 "compare_and_write": false, 00:20:23.194 "abort": true, 00:20:23.194 "seek_hole": false, 00:20:23.194 "seek_data": false, 00:20:23.194 "copy": true, 00:20:23.194 "nvme_iov_md": false 00:20:23.194 }, 00:20:23.194 "memory_domains": [ 00:20:23.194 { 00:20:23.194 "dma_device_id": "system", 00:20:23.194 "dma_device_type": 1 00:20:23.194 }, 00:20:23.194 { 00:20:23.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.194 "dma_device_type": 2 00:20:23.194 } 00:20:23.194 ], 00:20:23.194 "driver_specific": {} 00:20:23.194 }' 00:20:23.194 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.451 06:36:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.451 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.451 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.709 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.709 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.709 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.709 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:23.709 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.967 "name": "BaseBdev2", 00:20:23.967 "aliases": [ 00:20:23.967 "b63b48fa-4351-4cfe-8e92-a8e31da49aeb" 00:20:23.967 ], 00:20:23.967 "product_name": "Malloc disk", 00:20:23.967 "block_size": 512, 00:20:23.967 "num_blocks": 65536, 00:20:23.967 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:23.967 "assigned_rate_limits": { 00:20:23.967 "rw_ios_per_sec": 0, 00:20:23.967 "rw_mbytes_per_sec": 0, 00:20:23.967 "r_mbytes_per_sec": 0, 00:20:23.967 "w_mbytes_per_sec": 0 00:20:23.967 }, 00:20:23.967 "claimed": true, 00:20:23.967 "claim_type": "exclusive_write", 00:20:23.967 "zoned": false, 00:20:23.967 "supported_io_types": { 00:20:23.967 "read": true, 00:20:23.967 "write": true, 00:20:23.967 "unmap": true, 00:20:23.967 "flush": true, 
00:20:23.967 "reset": true, 00:20:23.967 "nvme_admin": false, 00:20:23.967 "nvme_io": false, 00:20:23.967 "nvme_io_md": false, 00:20:23.967 "write_zeroes": true, 00:20:23.967 "zcopy": true, 00:20:23.967 "get_zone_info": false, 00:20:23.967 "zone_management": false, 00:20:23.967 "zone_append": false, 00:20:23.967 "compare": false, 00:20:23.967 "compare_and_write": false, 00:20:23.967 "abort": true, 00:20:23.967 "seek_hole": false, 00:20:23.967 "seek_data": false, 00:20:23.967 "copy": true, 00:20:23.967 "nvme_iov_md": false 00:20:23.967 }, 00:20:23.967 "memory_domains": [ 00:20:23.967 { 00:20:23.967 "dma_device_id": "system", 00:20:23.967 "dma_device_type": 1 00:20:23.967 }, 00:20:23.967 { 00:20:23.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.967 "dma_device_type": 2 00:20:23.967 } 00:20:23.967 ], 00:20:23.967 "driver_specific": {} 00:20:23.967 }' 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.967 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:24.224 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.482 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.482 "name": "BaseBdev3", 00:20:24.482 "aliases": [ 00:20:24.482 "84552c58-df03-409f-a5a8-eeb67af0d880" 00:20:24.482 ], 00:20:24.482 "product_name": "Malloc disk", 00:20:24.482 "block_size": 512, 00:20:24.482 "num_blocks": 65536, 00:20:24.482 "uuid": "84552c58-df03-409f-a5a8-eeb67af0d880", 00:20:24.482 "assigned_rate_limits": { 00:20:24.482 "rw_ios_per_sec": 0, 00:20:24.482 "rw_mbytes_per_sec": 0, 00:20:24.482 "r_mbytes_per_sec": 0, 00:20:24.482 "w_mbytes_per_sec": 0 00:20:24.482 }, 00:20:24.482 "claimed": true, 00:20:24.482 "claim_type": "exclusive_write", 00:20:24.482 "zoned": false, 00:20:24.482 "supported_io_types": { 00:20:24.482 "read": true, 00:20:24.482 "write": true, 00:20:24.482 "unmap": true, 00:20:24.482 "flush": true, 00:20:24.482 "reset": true, 00:20:24.482 "nvme_admin": false, 00:20:24.482 "nvme_io": false, 00:20:24.482 "nvme_io_md": false, 00:20:24.482 "write_zeroes": true, 00:20:24.482 
"zcopy": true, 00:20:24.482 "get_zone_info": false, 00:20:24.482 "zone_management": false, 00:20:24.482 "zone_append": false, 00:20:24.482 "compare": false, 00:20:24.482 "compare_and_write": false, 00:20:24.482 "abort": true, 00:20:24.482 "seek_hole": false, 00:20:24.482 "seek_data": false, 00:20:24.482 "copy": true, 00:20:24.482 "nvme_iov_md": false 00:20:24.482 }, 00:20:24.482 "memory_domains": [ 00:20:24.482 { 00:20:24.482 "dma_device_id": "system", 00:20:24.482 "dma_device_type": 1 00:20:24.482 }, 00:20:24.482 { 00:20:24.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.482 "dma_device_type": 2 00:20:24.482 } 00:20:24.482 ], 00:20:24.482 "driver_specific": {} 00:20:24.482 }' 00:20:24.482 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.482 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.482 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.482 06:36:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.482 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.739 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.740 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.740 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:24.740 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.997 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.997 "name": "BaseBdev4", 00:20:24.997 "aliases": [ 00:20:24.997 "758a90e8-d6a0-43d7-a197-b15301d00228" 00:20:24.997 ], 00:20:24.997 "product_name": "Malloc disk", 00:20:24.997 "block_size": 512, 00:20:24.997 "num_blocks": 65536, 00:20:24.997 "uuid": "758a90e8-d6a0-43d7-a197-b15301d00228", 00:20:24.997 "assigned_rate_limits": { 00:20:24.997 "rw_ios_per_sec": 0, 00:20:24.997 "rw_mbytes_per_sec": 0, 00:20:24.997 "r_mbytes_per_sec": 0, 00:20:24.997 "w_mbytes_per_sec": 0 00:20:24.997 }, 00:20:24.997 "claimed": true, 00:20:24.997 "claim_type": "exclusive_write", 00:20:24.997 "zoned": false, 00:20:24.997 "supported_io_types": { 00:20:24.997 "read": true, 00:20:24.997 "write": true, 00:20:24.997 "unmap": true, 00:20:24.997 "flush": true, 00:20:24.997 "reset": true, 00:20:24.997 "nvme_admin": false, 00:20:24.997 "nvme_io": false, 00:20:24.997 "nvme_io_md": false, 00:20:24.997 "write_zeroes": true, 00:20:24.997 "zcopy": true, 00:20:24.997 "get_zone_info": false, 00:20:24.997 "zone_management": false, 00:20:24.997 "zone_append": false, 00:20:24.997 "compare": false, 00:20:24.997 
"compare_and_write": false, 00:20:24.997 "abort": true, 00:20:24.997 "seek_hole": false, 00:20:24.997 "seek_data": false, 00:20:24.997 "copy": true, 00:20:24.997 "nvme_iov_md": false 00:20:24.997 }, 00:20:24.997 "memory_domains": [ 00:20:24.997 { 00:20:24.997 "dma_device_id": "system", 00:20:24.997 "dma_device_type": 1 00:20:24.997 }, 00:20:24.997 { 00:20:24.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.997 "dma_device_type": 2 00:20:24.997 } 00:20:24.997 ], 00:20:24.997 "driver_specific": {} 00:20:24.997 }' 00:20:24.997 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.997 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.997 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.997 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:25.255 06:36:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:25.513 [2024-07-25 06:36:39.005252] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:25.513 [2024-07-25 06:36:39.005279] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:25.513 [2024-07-25 06:36:39.005325] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.513 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.771 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.771 "name": "Existed_Raid", 00:20:25.771 "uuid": "585cc9b6-1e9e-4167-a64b-75f5191010a8", 00:20:25.771 "strip_size_kb": 64, 00:20:25.771 "state": "offline", 00:20:25.771 "raid_level": "raid0", 00:20:25.771 "superblock": false, 00:20:25.771 "num_base_bdevs": 4, 00:20:25.771 "num_base_bdevs_discovered": 3, 00:20:25.771 "num_base_bdevs_operational": 3, 00:20:25.771 "base_bdevs_list": [ 00:20:25.771 { 00:20:25.771 "name": null, 00:20:25.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.771 "is_configured": false, 00:20:25.771 "data_offset": 0, 00:20:25.771 "data_size": 65536 00:20:25.771 }, 00:20:25.771 { 00:20:25.771 "name": "BaseBdev2", 00:20:25.771 "uuid": "b63b48fa-4351-4cfe-8e92-a8e31da49aeb", 00:20:25.771 "is_configured": true, 00:20:25.771 "data_offset": 0, 00:20:25.771 "data_size": 65536 00:20:25.771 }, 00:20:25.771 { 00:20:25.771 "name": "BaseBdev3", 00:20:25.771 "uuid": "84552c58-df03-409f-a5a8-eeb67af0d880", 00:20:25.771 "is_configured": true, 00:20:25.771 "data_offset": 0, 00:20:25.771 "data_size": 65536 00:20:25.771 }, 00:20:25.771 { 00:20:25.771 "name": "BaseBdev4", 00:20:25.771 "uuid": "758a90e8-d6a0-43d7-a197-b15301d00228", 00:20:25.771 "is_configured": true, 00:20:25.771 "data_offset": 0, 00:20:25.771 "data_size": 65536 00:20:25.771 } 00:20:25.771 ] 00:20:25.771 }' 00:20:25.771 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.771 06:36:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.336 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:26.336 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:26.336 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:26.336 06:36:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.594 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:26.594 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:26.594 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:26.851 [2024-07-25 06:36:40.233558] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:26.851 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ 
)) 00:20:26.851 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:26.851 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.851 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:27.109 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:27.109 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:27.109 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:27.367 [2024-07-25 06:36:40.696770] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:27.367 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:27.367 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:27.367 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.367 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:27.625 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:27.625 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:27.625 06:36:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:27.625 [2024-07-25 06:36:41.160153] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:27.625 [2024-07-25 06:36:41.160196] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x132d250 name Existed_Raid, state offline 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:27.883 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:28.152 BaseBdev2 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 
-- # waitforbdev BaseBdev2 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:28.152 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:28.461 06:36:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:28.719 [ 00:20:28.719 { 00:20:28.719 "name": "BaseBdev2", 00:20:28.719 "aliases": [ 00:20:28.719 "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3" 00:20:28.719 ], 00:20:28.719 "product_name": "Malloc disk", 00:20:28.719 "block_size": 512, 00:20:28.719 "num_blocks": 65536, 00:20:28.719 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:28.719 "assigned_rate_limits": { 00:20:28.719 "rw_ios_per_sec": 0, 00:20:28.719 "rw_mbytes_per_sec": 0, 00:20:28.719 "r_mbytes_per_sec": 0, 00:20:28.719 "w_mbytes_per_sec": 0 00:20:28.719 }, 00:20:28.719 "claimed": false, 00:20:28.719 "zoned": false, 00:20:28.719 "supported_io_types": { 00:20:28.719 "read": true, 00:20:28.719 "write": true, 00:20:28.719 "unmap": true, 00:20:28.719 "flush": true, 00:20:28.719 "reset": true, 00:20:28.719 "nvme_admin": false, 00:20:28.719 "nvme_io": false, 00:20:28.719 "nvme_io_md": false, 00:20:28.719 "write_zeroes": true, 00:20:28.719 "zcopy": true, 00:20:28.719 "get_zone_info": false, 00:20:28.719 "zone_management": false, 00:20:28.719 "zone_append": false, 00:20:28.719 "compare": false, 00:20:28.719 "compare_and_write": false, 00:20:28.719 "abort": true, 00:20:28.719 "seek_hole": false, 00:20:28.719 "seek_data": false, 00:20:28.719 "copy": true, 00:20:28.719 "nvme_iov_md": false 00:20:28.719 }, 00:20:28.719 "memory_domains": [ 00:20:28.719 { 00:20:28.719 "dma_device_id": "system", 00:20:28.719 "dma_device_type": 1 00:20:28.719 }, 00:20:28.719 { 00:20:28.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.719 "dma_device_type": 2 00:20:28.719 } 00:20:28.719 ], 00:20:28.719 "driver_specific": {} 00:20:28.719 } 00:20:28.719 ] 00:20:28.719 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:28.719 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:28.719 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:28.719 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:28.977 BaseBdev3 00:20:28.977 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:28.977 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:28.977 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:28.977 06:36:42 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:28.977 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:28.977 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:28.977 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:29.235 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:29.235 [ 00:20:29.235 { 00:20:29.235 "name": "BaseBdev3", 00:20:29.235 "aliases": [ 00:20:29.235 "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9" 00:20:29.235 ], 00:20:29.235 "product_name": "Malloc disk", 00:20:29.235 "block_size": 512, 00:20:29.235 "num_blocks": 65536, 00:20:29.235 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:29.235 "assigned_rate_limits": { 00:20:29.235 "rw_ios_per_sec": 0, 00:20:29.235 "rw_mbytes_per_sec": 0, 00:20:29.235 "r_mbytes_per_sec": 0, 00:20:29.235 "w_mbytes_per_sec": 0 00:20:29.235 }, 00:20:29.235 "claimed": false, 00:20:29.235 "zoned": false, 00:20:29.235 "supported_io_types": { 00:20:29.235 "read": true, 00:20:29.235 "write": true, 00:20:29.235 "unmap": true, 00:20:29.235 "flush": true, 00:20:29.235 "reset": true, 00:20:29.235 "nvme_admin": false, 00:20:29.235 "nvme_io": false, 00:20:29.235 "nvme_io_md": false, 00:20:29.235 "write_zeroes": true, 00:20:29.235 "zcopy": true, 00:20:29.235 "get_zone_info": false, 00:20:29.235 "zone_management": false, 00:20:29.235 "zone_append": false, 00:20:29.235 "compare": false, 00:20:29.235 "compare_and_write": false, 00:20:29.235 "abort": true, 00:20:29.235 "seek_hole": false, 00:20:29.235 "seek_data": false, 00:20:29.235 "copy": true, 00:20:29.235 "nvme_iov_md": false 00:20:29.235 }, 00:20:29.235 "memory_domains": [ 00:20:29.235 { 00:20:29.235 "dma_device_id": "system", 00:20:29.235 "dma_device_type": 1 00:20:29.235 }, 00:20:29.235 { 00:20:29.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.235 "dma_device_type": 2 00:20:29.235 } 00:20:29.235 ], 00:20:29.235 "driver_specific": {} 00:20:29.235 } 00:20:29.235 ] 00:20:29.235 06:36:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:29.235 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:29.492 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:29.493 06:36:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:29.493 BaseBdev4 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:20:29.493 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:29.750 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:30.019 [ 00:20:30.019 { 00:20:30.019 "name": "BaseBdev4", 00:20:30.019 "aliases": [ 00:20:30.019 "3fb28383-efc3-44df-bf9e-17685ea01193" 00:20:30.019 ], 00:20:30.019 "product_name": "Malloc disk", 00:20:30.019 "block_size": 512, 00:20:30.019 "num_blocks": 65536, 00:20:30.019 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:30.019 "assigned_rate_limits": { 00:20:30.019 "rw_ios_per_sec": 0, 00:20:30.019 "rw_mbytes_per_sec": 0, 00:20:30.019 "r_mbytes_per_sec": 0, 00:20:30.019 "w_mbytes_per_sec": 0 00:20:30.019 }, 00:20:30.019 "claimed": false, 00:20:30.019 "zoned": false, 00:20:30.019 "supported_io_types": { 00:20:30.019 "read": true, 00:20:30.019 "write": true, 00:20:30.019 "unmap": true, 00:20:30.019 "flush": true, 00:20:30.019 "reset": true, 00:20:30.019 "nvme_admin": false, 00:20:30.019 "nvme_io": false, 00:20:30.019 "nvme_io_md": false, 00:20:30.019 "write_zeroes": true, 00:20:30.019 "zcopy": true, 00:20:30.019 "get_zone_info": false, 00:20:30.019 "zone_management": false, 00:20:30.019 "zone_append": false, 00:20:30.019 "compare": false, 00:20:30.019 "compare_and_write": false, 00:20:30.019 "abort": true, 00:20:30.019 "seek_hole": false, 00:20:30.019 "seek_data": false, 00:20:30.019 "copy": true, 00:20:30.019 "nvme_iov_md": false 00:20:30.019 }, 00:20:30.019 "memory_domains": [ 00:20:30.019 { 00:20:30.019 "dma_device_id": "system", 00:20:30.019 "dma_device_type": 1 00:20:30.019 }, 00:20:30.019 { 00:20:30.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.019 "dma_device_type": 2 00:20:30.019 } 00:20:30.019 ], 00:20:30.019 "driver_specific": {} 00:20:30.019 } 00:20:30.019 ] 00:20:30.019 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:30.019 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:30.019 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:30.019 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:30.275 [2024-07-25 06:36:43.677718] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:30.275 [2024-07-25 06:36:43.677756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:30.275 [2024-07-25 06:36:43.677774] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:30.275 [2024-07-25 06:36:43.678975] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:30.275 [2024-07-25 06:36:43.679014] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:30.275 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.276 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:30.532 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.532 "name": "Existed_Raid", 00:20:30.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.532 "strip_size_kb": 64, 00:20:30.532 "state": "configuring", 00:20:30.533 "raid_level": "raid0", 00:20:30.533 "superblock": false, 00:20:30.533 "num_base_bdevs": 4, 00:20:30.533 "num_base_bdevs_discovered": 3, 00:20:30.533 "num_base_bdevs_operational": 4, 00:20:30.533 "base_bdevs_list": [ 00:20:30.533 { 00:20:30.533 "name": "BaseBdev1", 00:20:30.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.533 "is_configured": false, 00:20:30.533 "data_offset": 0, 00:20:30.533 "data_size": 0 00:20:30.533 }, 00:20:30.533 { 00:20:30.533 "name": "BaseBdev2", 00:20:30.533 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:30.533 "is_configured": true, 00:20:30.533 "data_offset": 0, 00:20:30.533 "data_size": 65536 00:20:30.533 }, 00:20:30.533 { 00:20:30.533 "name": "BaseBdev3", 00:20:30.533 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:30.533 "is_configured": true, 00:20:30.533 "data_offset": 0, 00:20:30.533 "data_size": 65536 00:20:30.533 }, 00:20:30.533 { 00:20:30.533 "name": "BaseBdev4", 00:20:30.533 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:30.533 "is_configured": true, 00:20:30.533 "data_offset": 0, 00:20:30.533 "data_size": 65536 00:20:30.533 } 00:20:30.533 ] 00:20:30.533 }' 00:20:30.533 06:36:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.533 06:36:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:31.097 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:31.355 [2024-07-25 06:36:44.692371] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.355 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:31.613 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.613 "name": "Existed_Raid", 00:20:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.613 "strip_size_kb": 64, 00:20:31.613 "state": "configuring", 00:20:31.613 "raid_level": "raid0", 00:20:31.613 "superblock": false, 00:20:31.613 "num_base_bdevs": 4, 00:20:31.613 "num_base_bdevs_discovered": 2, 00:20:31.613 "num_base_bdevs_operational": 4, 00:20:31.613 "base_bdevs_list": [ 00:20:31.613 { 00:20:31.613 "name": "BaseBdev1", 00:20:31.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.613 "is_configured": false, 00:20:31.613 "data_offset": 0, 00:20:31.613 "data_size": 0 00:20:31.613 }, 00:20:31.613 { 00:20:31.613 "name": null, 00:20:31.613 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:31.613 "is_configured": false, 00:20:31.613 "data_offset": 0, 00:20:31.613 "data_size": 65536 00:20:31.613 }, 00:20:31.613 { 00:20:31.613 "name": "BaseBdev3", 00:20:31.613 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:31.613 "is_configured": true, 00:20:31.613 "data_offset": 0, 00:20:31.613 "data_size": 65536 00:20:31.613 }, 00:20:31.613 { 00:20:31.613 "name": "BaseBdev4", 00:20:31.613 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:31.613 "is_configured": true, 00:20:31.613 "data_offset": 0, 00:20:31.613 "data_size": 65536 00:20:31.613 } 00:20:31.613 ] 00:20:31.613 }' 00:20:31.613 06:36:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.613 06:36:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.179 06:36:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.179 06:36:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:32.179 06:36:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:32.179 06:36:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:32.436 [2024-07-25 06:36:45.886627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 
00:20:32.436 BaseBdev1 00:20:32.436 06:36:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:32.436 06:36:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:32.436 06:36:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:32.436 06:36:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:32.436 06:36:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:32.436 06:36:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:32.437 06:36:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:32.694 06:36:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:32.952 [ 00:20:32.952 { 00:20:32.952 "name": "BaseBdev1", 00:20:32.952 "aliases": [ 00:20:32.952 "14777652-b4f4-4114-ac4f-d916afa5e6a9" 00:20:32.952 ], 00:20:32.952 "product_name": "Malloc disk", 00:20:32.952 "block_size": 512, 00:20:32.952 "num_blocks": 65536, 00:20:32.952 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:32.952 "assigned_rate_limits": { 00:20:32.952 "rw_ios_per_sec": 0, 00:20:32.952 "rw_mbytes_per_sec": 0, 00:20:32.952 "r_mbytes_per_sec": 0, 00:20:32.952 "w_mbytes_per_sec": 0 00:20:32.952 }, 00:20:32.952 "claimed": true, 00:20:32.952 "claim_type": "exclusive_write", 00:20:32.952 "zoned": false, 00:20:32.952 "supported_io_types": { 00:20:32.952 "read": true, 00:20:32.952 "write": true, 00:20:32.952 "unmap": true, 00:20:32.952 "flush": true, 00:20:32.952 "reset": true, 00:20:32.952 "nvme_admin": false, 00:20:32.952 "nvme_io": false, 00:20:32.952 "nvme_io_md": false, 00:20:32.952 "write_zeroes": true, 00:20:32.952 "zcopy": true, 00:20:32.952 "get_zone_info": false, 00:20:32.952 "zone_management": false, 00:20:32.952 "zone_append": false, 00:20:32.952 "compare": false, 00:20:32.952 "compare_and_write": false, 00:20:32.952 "abort": true, 00:20:32.952 "seek_hole": false, 00:20:32.952 "seek_data": false, 00:20:32.952 "copy": true, 00:20:32.952 "nvme_iov_md": false 00:20:32.952 }, 00:20:32.952 "memory_domains": [ 00:20:32.952 { 00:20:32.952 "dma_device_id": "system", 00:20:32.952 "dma_device_type": 1 00:20:32.952 }, 00:20:32.952 { 00:20:32.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.952 "dma_device_type": 2 00:20:32.952 } 00:20:32.952 ], 00:20:32.952 "driver_specific": {} 00:20:32.952 } 00:20:32.952 ] 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.952 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.210 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.210 "name": "Existed_Raid", 00:20:33.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.210 "strip_size_kb": 64, 00:20:33.210 "state": "configuring", 00:20:33.210 "raid_level": "raid0", 00:20:33.210 "superblock": false, 00:20:33.210 "num_base_bdevs": 4, 00:20:33.210 "num_base_bdevs_discovered": 3, 00:20:33.210 "num_base_bdevs_operational": 4, 00:20:33.210 "base_bdevs_list": [ 00:20:33.210 { 00:20:33.210 "name": "BaseBdev1", 00:20:33.210 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:33.210 "is_configured": true, 00:20:33.210 "data_offset": 0, 00:20:33.210 "data_size": 65536 00:20:33.210 }, 00:20:33.210 { 00:20:33.210 "name": null, 00:20:33.210 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:33.210 "is_configured": false, 00:20:33.210 "data_offset": 0, 00:20:33.210 "data_size": 65536 00:20:33.210 }, 00:20:33.210 { 00:20:33.210 "name": "BaseBdev3", 00:20:33.210 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:33.210 "is_configured": true, 00:20:33.210 "data_offset": 0, 00:20:33.210 "data_size": 65536 00:20:33.210 }, 00:20:33.210 { 00:20:33.210 "name": "BaseBdev4", 00:20:33.210 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:33.210 "is_configured": true, 00:20:33.210 "data_offset": 0, 00:20:33.210 "data_size": 65536 00:20:33.210 } 00:20:33.210 ] 00:20:33.210 }' 00:20:33.210 06:36:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.210 06:36:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.776 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.776 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:34.034 [2024-07-25 06:36:47.510952] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.034 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:34.292 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.292 "name": "Existed_Raid", 00:20:34.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.292 "strip_size_kb": 64, 00:20:34.292 "state": "configuring", 00:20:34.292 "raid_level": "raid0", 00:20:34.292 "superblock": false, 00:20:34.292 "num_base_bdevs": 4, 00:20:34.292 "num_base_bdevs_discovered": 2, 00:20:34.292 "num_base_bdevs_operational": 4, 00:20:34.292 "base_bdevs_list": [ 00:20:34.292 { 00:20:34.292 "name": "BaseBdev1", 00:20:34.292 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:34.292 "is_configured": true, 00:20:34.292 "data_offset": 0, 00:20:34.292 "data_size": 65536 00:20:34.292 }, 00:20:34.292 { 00:20:34.292 "name": null, 00:20:34.292 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:34.292 "is_configured": false, 00:20:34.292 "data_offset": 0, 00:20:34.292 "data_size": 65536 00:20:34.292 }, 00:20:34.292 { 00:20:34.292 "name": null, 00:20:34.292 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:34.292 "is_configured": false, 00:20:34.292 "data_offset": 0, 00:20:34.292 "data_size": 65536 00:20:34.292 }, 00:20:34.292 { 00:20:34.292 "name": "BaseBdev4", 00:20:34.292 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:34.292 "is_configured": true, 00:20:34.292 "data_offset": 0, 00:20:34.292 "data_size": 65536 00:20:34.292 } 00:20:34.292 ] 00:20:34.292 }' 00:20:34.292 06:36:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.292 06:36:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.857 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.857 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:35.115 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:35.115 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:35.373 [2024-07-25 06:36:48.734202] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev3 is claimed 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.373 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:35.630 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.630 "name": "Existed_Raid", 00:20:35.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:35.630 "strip_size_kb": 64, 00:20:35.630 "state": "configuring", 00:20:35.630 "raid_level": "raid0", 00:20:35.630 "superblock": false, 00:20:35.630 "num_base_bdevs": 4, 00:20:35.631 "num_base_bdevs_discovered": 3, 00:20:35.631 "num_base_bdevs_operational": 4, 00:20:35.631 "base_bdevs_list": [ 00:20:35.631 { 00:20:35.631 "name": "BaseBdev1", 00:20:35.631 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:35.631 "is_configured": true, 00:20:35.631 "data_offset": 0, 00:20:35.631 "data_size": 65536 00:20:35.631 }, 00:20:35.631 { 00:20:35.631 "name": null, 00:20:35.631 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:35.631 "is_configured": false, 00:20:35.631 "data_offset": 0, 00:20:35.631 "data_size": 65536 00:20:35.631 }, 00:20:35.631 { 00:20:35.631 "name": "BaseBdev3", 00:20:35.631 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:35.631 "is_configured": true, 00:20:35.631 "data_offset": 0, 00:20:35.631 "data_size": 65536 00:20:35.631 }, 00:20:35.631 { 00:20:35.631 "name": "BaseBdev4", 00:20:35.631 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:35.631 "is_configured": true, 00:20:35.631 "data_offset": 0, 00:20:35.631 "data_size": 65536 00:20:35.631 } 00:20:35.631 ] 00:20:35.631 }' 00:20:35.631 06:36:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.631 06:36:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.195 06:36:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.195 06:36:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:36.453 06:36:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:36.453 06:36:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:36.453 [2024-07-25 06:36:49.989522] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.712 "name": "Existed_Raid", 00:20:36.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.712 "strip_size_kb": 64, 00:20:36.712 "state": "configuring", 00:20:36.712 "raid_level": "raid0", 00:20:36.712 "superblock": false, 00:20:36.712 "num_base_bdevs": 4, 00:20:36.712 "num_base_bdevs_discovered": 2, 00:20:36.712 "num_base_bdevs_operational": 4, 00:20:36.712 "base_bdevs_list": [ 00:20:36.712 { 00:20:36.712 "name": null, 00:20:36.712 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:36.712 "is_configured": false, 00:20:36.712 "data_offset": 0, 00:20:36.712 "data_size": 65536 00:20:36.712 }, 00:20:36.712 { 00:20:36.712 "name": null, 00:20:36.712 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:36.712 "is_configured": false, 00:20:36.712 "data_offset": 0, 00:20:36.712 "data_size": 65536 00:20:36.712 }, 00:20:36.712 { 00:20:36.712 "name": "BaseBdev3", 00:20:36.712 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:36.712 "is_configured": true, 00:20:36.712 "data_offset": 0, 00:20:36.712 "data_size": 65536 00:20:36.712 }, 00:20:36.712 { 00:20:36.712 "name": "BaseBdev4", 00:20:36.712 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:36.712 "is_configured": true, 00:20:36.712 "data_offset": 0, 00:20:36.712 "data_size": 65536 00:20:36.712 } 00:20:36.712 ] 00:20:36.712 }' 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.712 06:36:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:37.279 06:36:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.279 06:36:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:37.537 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:37.537 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:37.796 [2024-07-25 06:36:51.267055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.796 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:38.055 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.055 "name": "Existed_Raid", 00:20:38.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.055 "strip_size_kb": 64, 00:20:38.055 "state": "configuring", 00:20:38.055 "raid_level": "raid0", 00:20:38.055 "superblock": false, 00:20:38.055 "num_base_bdevs": 4, 00:20:38.055 "num_base_bdevs_discovered": 3, 00:20:38.055 "num_base_bdevs_operational": 4, 00:20:38.055 "base_bdevs_list": [ 00:20:38.055 { 00:20:38.055 "name": null, 00:20:38.055 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:38.055 "is_configured": false, 00:20:38.055 "data_offset": 0, 00:20:38.055 "data_size": 65536 00:20:38.055 }, 00:20:38.055 { 00:20:38.055 "name": "BaseBdev2", 00:20:38.055 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:38.055 "is_configured": true, 00:20:38.055 "data_offset": 0, 00:20:38.055 "data_size": 65536 00:20:38.055 }, 00:20:38.055 { 00:20:38.055 "name": "BaseBdev3", 00:20:38.055 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:38.055 "is_configured": true, 00:20:38.055 "data_offset": 0, 00:20:38.055 "data_size": 65536 00:20:38.055 }, 00:20:38.055 { 00:20:38.055 "name": "BaseBdev4", 00:20:38.055 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:38.055 "is_configured": true, 
00:20:38.055 "data_offset": 0, 00:20:38.055 "data_size": 65536 00:20:38.055 } 00:20:38.055 ] 00:20:38.055 }' 00:20:38.055 06:36:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.055 06:36:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.621 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:38.621 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.880 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:38.880 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.880 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:39.138 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 14777652-b4f4-4114-ac4f-d916afa5e6a9 00:20:39.396 [2024-07-25 06:36:52.778396] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:39.396 [2024-07-25 06:36:52.778431] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x117b710 00:20:39.396 [2024-07-25 06:36:52.778439] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:39.396 [2024-07-25 06:36:52.778613] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1323620 00:20:39.396 [2024-07-25 06:36:52.778718] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x117b710 00:20:39.396 [2024-07-25 06:36:52.778727] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x117b710 00:20:39.396 [2024-07-25 06:36:52.778876] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.396 NewBaseBdev 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:39.396 06:36:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:39.655 06:36:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:39.926 [ 00:20:39.926 { 00:20:39.926 "name": "NewBaseBdev", 00:20:39.926 "aliases": [ 00:20:39.926 "14777652-b4f4-4114-ac4f-d916afa5e6a9" 00:20:39.926 ], 00:20:39.926 "product_name": "Malloc disk", 
00:20:39.926 "block_size": 512, 00:20:39.926 "num_blocks": 65536, 00:20:39.926 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:39.926 "assigned_rate_limits": { 00:20:39.926 "rw_ios_per_sec": 0, 00:20:39.926 "rw_mbytes_per_sec": 0, 00:20:39.926 "r_mbytes_per_sec": 0, 00:20:39.926 "w_mbytes_per_sec": 0 00:20:39.926 }, 00:20:39.926 "claimed": true, 00:20:39.926 "claim_type": "exclusive_write", 00:20:39.926 "zoned": false, 00:20:39.926 "supported_io_types": { 00:20:39.926 "read": true, 00:20:39.926 "write": true, 00:20:39.926 "unmap": true, 00:20:39.926 "flush": true, 00:20:39.926 "reset": true, 00:20:39.926 "nvme_admin": false, 00:20:39.926 "nvme_io": false, 00:20:39.926 "nvme_io_md": false, 00:20:39.926 "write_zeroes": true, 00:20:39.926 "zcopy": true, 00:20:39.926 "get_zone_info": false, 00:20:39.926 "zone_management": false, 00:20:39.926 "zone_append": false, 00:20:39.926 "compare": false, 00:20:39.926 "compare_and_write": false, 00:20:39.926 "abort": true, 00:20:39.926 "seek_hole": false, 00:20:39.926 "seek_data": false, 00:20:39.926 "copy": true, 00:20:39.926 "nvme_iov_md": false 00:20:39.926 }, 00:20:39.926 "memory_domains": [ 00:20:39.926 { 00:20:39.926 "dma_device_id": "system", 00:20:39.926 "dma_device_type": 1 00:20:39.926 }, 00:20:39.926 { 00:20:39.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.926 "dma_device_type": 2 00:20:39.926 } 00:20:39.926 ], 00:20:39.926 "driver_specific": {} 00:20:39.926 } 00:20:39.926 ] 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.926 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.183 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.183 "name": "Existed_Raid", 00:20:40.183 "uuid": "580320ef-027e-4d45-8ac3-26b7dfb40403", 00:20:40.183 "strip_size_kb": 64, 00:20:40.183 "state": "online", 00:20:40.183 "raid_level": "raid0", 00:20:40.183 "superblock": false, 00:20:40.183 "num_base_bdevs": 4, 00:20:40.183 "num_base_bdevs_discovered": 4, 00:20:40.183 "num_base_bdevs_operational": 4, 00:20:40.183 "base_bdevs_list": [ 
00:20:40.183 { 00:20:40.183 "name": "NewBaseBdev", 00:20:40.183 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:40.183 "is_configured": true, 00:20:40.183 "data_offset": 0, 00:20:40.183 "data_size": 65536 00:20:40.183 }, 00:20:40.183 { 00:20:40.183 "name": "BaseBdev2", 00:20:40.183 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:40.183 "is_configured": true, 00:20:40.183 "data_offset": 0, 00:20:40.183 "data_size": 65536 00:20:40.183 }, 00:20:40.183 { 00:20:40.183 "name": "BaseBdev3", 00:20:40.183 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:40.183 "is_configured": true, 00:20:40.183 "data_offset": 0, 00:20:40.183 "data_size": 65536 00:20:40.183 }, 00:20:40.183 { 00:20:40.183 "name": "BaseBdev4", 00:20:40.183 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:40.183 "is_configured": true, 00:20:40.183 "data_offset": 0, 00:20:40.183 "data_size": 65536 00:20:40.183 } 00:20:40.183 ] 00:20:40.183 }' 00:20:40.183 06:36:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.183 06:36:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:40.748 [2024-07-25 06:36:54.262625] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:40.748 "name": "Existed_Raid", 00:20:40.748 "aliases": [ 00:20:40.748 "580320ef-027e-4d45-8ac3-26b7dfb40403" 00:20:40.748 ], 00:20:40.748 "product_name": "Raid Volume", 00:20:40.748 "block_size": 512, 00:20:40.748 "num_blocks": 262144, 00:20:40.748 "uuid": "580320ef-027e-4d45-8ac3-26b7dfb40403", 00:20:40.748 "assigned_rate_limits": { 00:20:40.748 "rw_ios_per_sec": 0, 00:20:40.748 "rw_mbytes_per_sec": 0, 00:20:40.748 "r_mbytes_per_sec": 0, 00:20:40.748 "w_mbytes_per_sec": 0 00:20:40.748 }, 00:20:40.748 "claimed": false, 00:20:40.748 "zoned": false, 00:20:40.748 "supported_io_types": { 00:20:40.748 "read": true, 00:20:40.748 "write": true, 00:20:40.748 "unmap": true, 00:20:40.748 "flush": true, 00:20:40.748 "reset": true, 00:20:40.748 "nvme_admin": false, 00:20:40.748 "nvme_io": false, 00:20:40.748 "nvme_io_md": false, 00:20:40.748 "write_zeroes": true, 00:20:40.748 "zcopy": false, 00:20:40.748 "get_zone_info": false, 00:20:40.748 "zone_management": false, 00:20:40.748 "zone_append": false, 00:20:40.748 "compare": false, 00:20:40.748 "compare_and_write": false, 00:20:40.748 "abort": false, 00:20:40.748 "seek_hole": false, 00:20:40.748 "seek_data": false, 00:20:40.748 "copy": false, 00:20:40.748 
"nvme_iov_md": false 00:20:40.748 }, 00:20:40.748 "memory_domains": [ 00:20:40.748 { 00:20:40.748 "dma_device_id": "system", 00:20:40.748 "dma_device_type": 1 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.748 "dma_device_type": 2 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "system", 00:20:40.748 "dma_device_type": 1 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.748 "dma_device_type": 2 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "system", 00:20:40.748 "dma_device_type": 1 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.748 "dma_device_type": 2 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "system", 00:20:40.748 "dma_device_type": 1 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.748 "dma_device_type": 2 00:20:40.748 } 00:20:40.748 ], 00:20:40.748 "driver_specific": { 00:20:40.748 "raid": { 00:20:40.748 "uuid": "580320ef-027e-4d45-8ac3-26b7dfb40403", 00:20:40.748 "strip_size_kb": 64, 00:20:40.748 "state": "online", 00:20:40.748 "raid_level": "raid0", 00:20:40.748 "superblock": false, 00:20:40.748 "num_base_bdevs": 4, 00:20:40.748 "num_base_bdevs_discovered": 4, 00:20:40.748 "num_base_bdevs_operational": 4, 00:20:40.748 "base_bdevs_list": [ 00:20:40.748 { 00:20:40.748 "name": "NewBaseBdev", 00:20:40.748 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 00:20:40.748 "is_configured": true, 00:20:40.748 "data_offset": 0, 00:20:40.748 "data_size": 65536 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "name": "BaseBdev2", 00:20:40.748 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:40.748 "is_configured": true, 00:20:40.748 "data_offset": 0, 00:20:40.748 "data_size": 65536 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "name": "BaseBdev3", 00:20:40.748 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:40.748 "is_configured": true, 00:20:40.748 "data_offset": 0, 00:20:40.748 "data_size": 65536 00:20:40.748 }, 00:20:40.748 { 00:20:40.748 "name": "BaseBdev4", 00:20:40.748 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:40.748 "is_configured": true, 00:20:40.748 "data_offset": 0, 00:20:40.748 "data_size": 65536 00:20:40.748 } 00:20:40.748 ] 00:20:40.748 } 00:20:40.748 } 00:20:40.748 }' 00:20:40.748 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:41.008 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:41.008 BaseBdev2 00:20:41.008 BaseBdev3 00:20:41.008 BaseBdev4' 00:20:41.008 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:41.008 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:41.008 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:41.008 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:41.008 "name": "NewBaseBdev", 00:20:41.008 "aliases": [ 00:20:41.008 "14777652-b4f4-4114-ac4f-d916afa5e6a9" 00:20:41.008 ], 00:20:41.008 "product_name": "Malloc disk", 00:20:41.008 "block_size": 512, 00:20:41.008 "num_blocks": 65536, 00:20:41.008 "uuid": "14777652-b4f4-4114-ac4f-d916afa5e6a9", 
00:20:41.008 "assigned_rate_limits": { 00:20:41.008 "rw_ios_per_sec": 0, 00:20:41.008 "rw_mbytes_per_sec": 0, 00:20:41.008 "r_mbytes_per_sec": 0, 00:20:41.008 "w_mbytes_per_sec": 0 00:20:41.008 }, 00:20:41.008 "claimed": true, 00:20:41.008 "claim_type": "exclusive_write", 00:20:41.008 "zoned": false, 00:20:41.008 "supported_io_types": { 00:20:41.008 "read": true, 00:20:41.008 "write": true, 00:20:41.008 "unmap": true, 00:20:41.008 "flush": true, 00:20:41.008 "reset": true, 00:20:41.008 "nvme_admin": false, 00:20:41.008 "nvme_io": false, 00:20:41.008 "nvme_io_md": false, 00:20:41.008 "write_zeroes": true, 00:20:41.008 "zcopy": true, 00:20:41.008 "get_zone_info": false, 00:20:41.008 "zone_management": false, 00:20:41.008 "zone_append": false, 00:20:41.008 "compare": false, 00:20:41.008 "compare_and_write": false, 00:20:41.008 "abort": true, 00:20:41.008 "seek_hole": false, 00:20:41.008 "seek_data": false, 00:20:41.008 "copy": true, 00:20:41.008 "nvme_iov_md": false 00:20:41.008 }, 00:20:41.008 "memory_domains": [ 00:20:41.008 { 00:20:41.008 "dma_device_id": "system", 00:20:41.008 "dma_device_type": 1 00:20:41.008 }, 00:20:41.008 { 00:20:41.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.008 "dma_device_type": 2 00:20:41.008 } 00:20:41.008 ], 00:20:41.008 "driver_specific": {} 00:20:41.008 }' 00:20:41.008 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:41.267 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.525 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.525 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:41.525 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:41.525 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:41.525 06:36:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:41.783 "name": "BaseBdev2", 00:20:41.783 "aliases": [ 00:20:41.783 "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3" 00:20:41.783 ], 00:20:41.783 "product_name": "Malloc disk", 00:20:41.783 "block_size": 512, 00:20:41.783 "num_blocks": 65536, 00:20:41.783 "uuid": "b2aa700e-6948-4f57-b9a9-f0bf47ff55a3", 00:20:41.783 "assigned_rate_limits": { 00:20:41.783 "rw_ios_per_sec": 0, 00:20:41.783 "rw_mbytes_per_sec": 0, 00:20:41.783 "r_mbytes_per_sec": 0, 00:20:41.783 "w_mbytes_per_sec": 0 
00:20:41.783 }, 00:20:41.783 "claimed": true, 00:20:41.783 "claim_type": "exclusive_write", 00:20:41.783 "zoned": false, 00:20:41.783 "supported_io_types": { 00:20:41.783 "read": true, 00:20:41.783 "write": true, 00:20:41.783 "unmap": true, 00:20:41.783 "flush": true, 00:20:41.783 "reset": true, 00:20:41.783 "nvme_admin": false, 00:20:41.783 "nvme_io": false, 00:20:41.783 "nvme_io_md": false, 00:20:41.783 "write_zeroes": true, 00:20:41.783 "zcopy": true, 00:20:41.783 "get_zone_info": false, 00:20:41.783 "zone_management": false, 00:20:41.783 "zone_append": false, 00:20:41.783 "compare": false, 00:20:41.783 "compare_and_write": false, 00:20:41.783 "abort": true, 00:20:41.783 "seek_hole": false, 00:20:41.783 "seek_data": false, 00:20:41.783 "copy": true, 00:20:41.783 "nvme_iov_md": false 00:20:41.783 }, 00:20:41.783 "memory_domains": [ 00:20:41.783 { 00:20:41.783 "dma_device_id": "system", 00:20:41.783 "dma_device_type": 1 00:20:41.783 }, 00:20:41.783 { 00:20:41.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.783 "dma_device_type": 2 00:20:41.783 } 00:20:41.783 ], 00:20:41.783 "driver_specific": {} 00:20:41.783 }' 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.783 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:42.041 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:42.359 "name": "BaseBdev3", 00:20:42.359 "aliases": [ 00:20:42.359 "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9" 00:20:42.359 ], 00:20:42.359 "product_name": "Malloc disk", 00:20:42.359 "block_size": 512, 00:20:42.359 "num_blocks": 65536, 00:20:42.359 "uuid": "5b812e8f-4c21-4b82-a0ad-b5ddf1f1a1d9", 00:20:42.359 "assigned_rate_limits": { 00:20:42.359 "rw_ios_per_sec": 0, 00:20:42.359 "rw_mbytes_per_sec": 0, 00:20:42.359 "r_mbytes_per_sec": 0, 00:20:42.359 "w_mbytes_per_sec": 0 00:20:42.359 }, 00:20:42.359 "claimed": true, 00:20:42.359 "claim_type": "exclusive_write", 00:20:42.359 "zoned": false, 00:20:42.359 "supported_io_types": { 00:20:42.359 "read": 
true, 00:20:42.359 "write": true, 00:20:42.359 "unmap": true, 00:20:42.359 "flush": true, 00:20:42.359 "reset": true, 00:20:42.359 "nvme_admin": false, 00:20:42.359 "nvme_io": false, 00:20:42.359 "nvme_io_md": false, 00:20:42.359 "write_zeroes": true, 00:20:42.359 "zcopy": true, 00:20:42.359 "get_zone_info": false, 00:20:42.359 "zone_management": false, 00:20:42.359 "zone_append": false, 00:20:42.359 "compare": false, 00:20:42.359 "compare_and_write": false, 00:20:42.359 "abort": true, 00:20:42.359 "seek_hole": false, 00:20:42.359 "seek_data": false, 00:20:42.359 "copy": true, 00:20:42.359 "nvme_iov_md": false 00:20:42.359 }, 00:20:42.359 "memory_domains": [ 00:20:42.359 { 00:20:42.359 "dma_device_id": "system", 00:20:42.359 "dma_device_type": 1 00:20:42.359 }, 00:20:42.359 { 00:20:42.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.359 "dma_device_type": 2 00:20:42.359 } 00:20:42.359 ], 00:20:42.359 "driver_specific": {} 00:20:42.359 }' 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.359 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.618 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:42.618 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.618 06:36:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.618 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:42.618 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:42.618 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:42.618 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:42.876 "name": "BaseBdev4", 00:20:42.876 "aliases": [ 00:20:42.876 "3fb28383-efc3-44df-bf9e-17685ea01193" 00:20:42.876 ], 00:20:42.876 "product_name": "Malloc disk", 00:20:42.876 "block_size": 512, 00:20:42.876 "num_blocks": 65536, 00:20:42.876 "uuid": "3fb28383-efc3-44df-bf9e-17685ea01193", 00:20:42.876 "assigned_rate_limits": { 00:20:42.876 "rw_ios_per_sec": 0, 00:20:42.876 "rw_mbytes_per_sec": 0, 00:20:42.876 "r_mbytes_per_sec": 0, 00:20:42.876 "w_mbytes_per_sec": 0 00:20:42.876 }, 00:20:42.876 "claimed": true, 00:20:42.876 "claim_type": "exclusive_write", 00:20:42.876 "zoned": false, 00:20:42.876 "supported_io_types": { 00:20:42.876 "read": true, 00:20:42.876 "write": true, 00:20:42.876 "unmap": true, 00:20:42.876 "flush": true, 00:20:42.876 "reset": true, 00:20:42.876 "nvme_admin": false, 00:20:42.876 "nvme_io": 
false, 00:20:42.876 "nvme_io_md": false, 00:20:42.876 "write_zeroes": true, 00:20:42.876 "zcopy": true, 00:20:42.876 "get_zone_info": false, 00:20:42.876 "zone_management": false, 00:20:42.876 "zone_append": false, 00:20:42.876 "compare": false, 00:20:42.876 "compare_and_write": false, 00:20:42.876 "abort": true, 00:20:42.876 "seek_hole": false, 00:20:42.876 "seek_data": false, 00:20:42.876 "copy": true, 00:20:42.876 "nvme_iov_md": false 00:20:42.876 }, 00:20:42.876 "memory_domains": [ 00:20:42.876 { 00:20:42.876 "dma_device_id": "system", 00:20:42.876 "dma_device_type": 1 00:20:42.876 }, 00:20:42.876 { 00:20:42.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.876 "dma_device_type": 2 00:20:42.876 } 00:20:42.876 ], 00:20:42.876 "driver_specific": {} 00:20:42.876 }' 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:42.876 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.134 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.134 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:43.134 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.134 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.134 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:43.134 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:43.393 [2024-07-25 06:36:56.797002] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:43.393 [2024-07-25 06:36:56.797026] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:43.393 [2024-07-25 06:36:56.797077] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:43.393 [2024-07-25 06:36:56.797132] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:43.393 [2024-07-25 06:36:56.797148] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x117b710 name Existed_Raid, state offline 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1167863 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1167863 ']' 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1167863 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps 
--no-headers -o comm= 1167863 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1167863' 00:20:43.393 killing process with pid 1167863 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1167863 00:20:43.393 [2024-07-25 06:36:56.873358] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:43.393 06:36:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1167863 00:20:43.393 [2024-07-25 06:36:56.904814] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:43.653 00:20:43.653 real 0m30.259s 00:20:43.653 user 0m55.441s 00:20:43.653 sys 0m5.588s 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.653 ************************************ 00:20:43.653 END TEST raid_state_function_test 00:20:43.653 ************************************ 00:20:43.653 06:36:57 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:20:43.653 06:36:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:43.653 06:36:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:43.653 06:36:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:43.653 ************************************ 00:20:43.653 START TEST raid_state_function_test_sb 00:20:43.653 ************************************ 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 true 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:43.653 06:36:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1173559 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1173559' 00:20:43.653 Process raid pid: 1173559 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1173559 /var/tmp/spdk-raid.sock 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1173559 ']' 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:43.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:43.653 06:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.912 [2024-07-25 06:36:57.242419] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:20:43.912 [2024-07-25 06:36:57.242484] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.0 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.1 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.2 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.3 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.4 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.5 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.6 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:01.7 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.0 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.1 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.2 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.3 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.4 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.5 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.6 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3d:02.7 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.0 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.1 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.2 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.3 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.4 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.5 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.6 cannot be used 00:20:43.912 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:01.7 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.0 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.1 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.2 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.3 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.4 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.5 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.6 cannot be used 00:20:43.912 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:20:43.912 EAL: Requested device 0000:3f:02.7 cannot be used 00:20:43.912 [2024-07-25 06:36:57.380938] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:43.912 [2024-07-25 06:36:57.424323] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:44.171 [2024-07-25 06:36:57.486457] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:44.171 [2024-07-25 06:36:57.486483] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:44.737 06:36:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:44.737 06:36:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:20:44.737 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:44.995 [2024-07-25 06:36:58.346434] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:44.995 [2024-07-25 06:36:58.346470] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:44.995 [2024-07-25 06:36:58.346480] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:44.995 [2024-07-25 06:36:58.346491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:44.995 [2024-07-25 06:36:58.346499] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:44.995 [2024-07-25 06:36:58.346509] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:44.995 [2024-07-25 06:36:58.346518] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:44.995 [2024-07-25 06:36:58.346529] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.995 06:36:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.995 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.254 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.254 "name": "Existed_Raid", 00:20:45.254 "uuid": "564545f7-81e0-48c4-aa81-7723395bdc23", 00:20:45.254 "strip_size_kb": 64, 00:20:45.254 "state": "configuring", 00:20:45.254 "raid_level": "raid0", 00:20:45.254 "superblock": true, 00:20:45.254 "num_base_bdevs": 4, 00:20:45.254 "num_base_bdevs_discovered": 0, 00:20:45.254 "num_base_bdevs_operational": 4, 00:20:45.254 "base_bdevs_list": [ 00:20:45.254 { 00:20:45.254 "name": "BaseBdev1", 00:20:45.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.254 "is_configured": false, 00:20:45.254 "data_offset": 0, 00:20:45.254 "data_size": 0 00:20:45.254 }, 00:20:45.254 { 00:20:45.254 "name": "BaseBdev2", 00:20:45.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.254 "is_configured": false, 00:20:45.254 "data_offset": 0, 00:20:45.254 "data_size": 0 00:20:45.254 }, 00:20:45.254 { 00:20:45.254 "name": "BaseBdev3", 00:20:45.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.254 "is_configured": false, 00:20:45.254 "data_offset": 0, 00:20:45.254 "data_size": 0 00:20:45.254 }, 00:20:45.254 { 00:20:45.254 "name": "BaseBdev4", 00:20:45.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.254 "is_configured": false, 00:20:45.254 "data_offset": 0, 00:20:45.254 "data_size": 0 00:20:45.254 } 00:20:45.254 ] 00:20:45.254 }' 00:20:45.254 06:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.254 06:36:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.820 06:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:46.079 [2024-07-25 06:36:59.389030] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:46.079 [2024-07-25 06:36:59.389057] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd7470 name Existed_Raid, state configuring 00:20:46.079 06:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r 
raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:46.079 [2024-07-25 06:36:59.613651] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:46.079 [2024-07-25 06:36:59.613680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:46.079 [2024-07-25 06:36:59.613689] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:46.079 [2024-07-25 06:36:59.613700] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:46.079 [2024-07-25 06:36:59.613708] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:46.079 [2024-07-25 06:36:59.613718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:46.079 [2024-07-25 06:36:59.613726] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:46.079 [2024-07-25 06:36:59.613736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:46.079 06:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:46.338 [2024-07-25 06:36:59.847659] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:46.338 BaseBdev1 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:46.338 06:36:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:46.596 06:37:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:46.855 [ 00:20:46.855 { 00:20:46.855 "name": "BaseBdev1", 00:20:46.855 "aliases": [ 00:20:46.855 "def98640-ccf2-49f5-82b2-be57dcf52a57" 00:20:46.855 ], 00:20:46.855 "product_name": "Malloc disk", 00:20:46.855 "block_size": 512, 00:20:46.855 "num_blocks": 65536, 00:20:46.855 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:46.855 "assigned_rate_limits": { 00:20:46.855 "rw_ios_per_sec": 0, 00:20:46.855 "rw_mbytes_per_sec": 0, 00:20:46.855 "r_mbytes_per_sec": 0, 00:20:46.855 "w_mbytes_per_sec": 0 00:20:46.855 }, 00:20:46.855 "claimed": true, 00:20:46.855 "claim_type": "exclusive_write", 00:20:46.855 "zoned": false, 00:20:46.855 "supported_io_types": { 00:20:46.855 "read": true, 00:20:46.855 "write": true, 00:20:46.855 "unmap": true, 00:20:46.855 "flush": true, 00:20:46.855 "reset": true, 00:20:46.855 "nvme_admin": false, 00:20:46.855 "nvme_io": false, 00:20:46.855 "nvme_io_md": false, 00:20:46.855 "write_zeroes": true, 
00:20:46.855 "zcopy": true, 00:20:46.855 "get_zone_info": false, 00:20:46.855 "zone_management": false, 00:20:46.855 "zone_append": false, 00:20:46.855 "compare": false, 00:20:46.855 "compare_and_write": false, 00:20:46.855 "abort": true, 00:20:46.855 "seek_hole": false, 00:20:46.855 "seek_data": false, 00:20:46.855 "copy": true, 00:20:46.855 "nvme_iov_md": false 00:20:46.855 }, 00:20:46.855 "memory_domains": [ 00:20:46.855 { 00:20:46.855 "dma_device_id": "system", 00:20:46.855 "dma_device_type": 1 00:20:46.855 }, 00:20:46.855 { 00:20:46.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.855 "dma_device_type": 2 00:20:46.855 } 00:20:46.855 ], 00:20:46.855 "driver_specific": {} 00:20:46.855 } 00:20:46.855 ] 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.855 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.114 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.114 "name": "Existed_Raid", 00:20:47.114 "uuid": "97d77248-09c6-4ade-9714-b20e699a0cf9", 00:20:47.114 "strip_size_kb": 64, 00:20:47.114 "state": "configuring", 00:20:47.114 "raid_level": "raid0", 00:20:47.114 "superblock": true, 00:20:47.114 "num_base_bdevs": 4, 00:20:47.114 "num_base_bdevs_discovered": 1, 00:20:47.114 "num_base_bdevs_operational": 4, 00:20:47.114 "base_bdevs_list": [ 00:20:47.114 { 00:20:47.114 "name": "BaseBdev1", 00:20:47.114 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:47.114 "is_configured": true, 00:20:47.114 "data_offset": 2048, 00:20:47.114 "data_size": 63488 00:20:47.114 }, 00:20:47.114 { 00:20:47.114 "name": "BaseBdev2", 00:20:47.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.114 "is_configured": false, 00:20:47.114 "data_offset": 0, 00:20:47.114 "data_size": 0 00:20:47.114 }, 00:20:47.114 { 00:20:47.114 "name": "BaseBdev3", 00:20:47.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.114 "is_configured": false, 00:20:47.114 "data_offset": 0, 00:20:47.114 "data_size": 0 00:20:47.114 }, 00:20:47.114 { 
00:20:47.114 "name": "BaseBdev4", 00:20:47.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.114 "is_configured": false, 00:20:47.114 "data_offset": 0, 00:20:47.114 "data_size": 0 00:20:47.114 } 00:20:47.114 ] 00:20:47.114 }' 00:20:47.114 06:37:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.114 06:37:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:47.681 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:47.939 [2024-07-25 06:37:01.331656] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:47.939 [2024-07-25 06:37:01.331694] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd6ce0 name Existed_Raid, state configuring 00:20:47.939 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:48.197 [2024-07-25 06:37:01.560298] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:48.197 [2024-07-25 06:37:01.561650] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:48.197 [2024-07-25 06:37:01.561682] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:48.197 [2024-07-25 06:37:01.561691] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:48.197 [2024-07-25 06:37:01.561701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:48.197 [2024-07-25 06:37:01.561710] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:48.197 [2024-07-25 06:37:01.561720] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:48.197 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:48.197 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:48.197 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.198 06:37:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.198 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:48.456 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.456 "name": "Existed_Raid", 00:20:48.456 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:48.456 "strip_size_kb": 64, 00:20:48.456 "state": "configuring", 00:20:48.456 "raid_level": "raid0", 00:20:48.456 "superblock": true, 00:20:48.456 "num_base_bdevs": 4, 00:20:48.456 "num_base_bdevs_discovered": 1, 00:20:48.456 "num_base_bdevs_operational": 4, 00:20:48.456 "base_bdevs_list": [ 00:20:48.456 { 00:20:48.456 "name": "BaseBdev1", 00:20:48.456 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:48.456 "is_configured": true, 00:20:48.456 "data_offset": 2048, 00:20:48.456 "data_size": 63488 00:20:48.456 }, 00:20:48.456 { 00:20:48.456 "name": "BaseBdev2", 00:20:48.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.456 "is_configured": false, 00:20:48.456 "data_offset": 0, 00:20:48.456 "data_size": 0 00:20:48.456 }, 00:20:48.456 { 00:20:48.456 "name": "BaseBdev3", 00:20:48.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.456 "is_configured": false, 00:20:48.456 "data_offset": 0, 00:20:48.456 "data_size": 0 00:20:48.456 }, 00:20:48.456 { 00:20:48.456 "name": "BaseBdev4", 00:20:48.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.456 "is_configured": false, 00:20:48.456 "data_offset": 0, 00:20:48.456 "data_size": 0 00:20:48.456 } 00:20:48.456 ] 00:20:48.456 }' 00:20:48.456 06:37:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.456 06:37:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.023 06:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:49.282 [2024-07-25 06:37:02.614513] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:49.282 BaseBdev2 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:49.282 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:49.541 06:37:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:49.541 [ 00:20:49.541 { 00:20:49.541 "name": "BaseBdev2", 00:20:49.541 "aliases": [ 00:20:49.541 
"a9589d84-914e-438f-99fb-0a129ca915c7" 00:20:49.541 ], 00:20:49.541 "product_name": "Malloc disk", 00:20:49.541 "block_size": 512, 00:20:49.541 "num_blocks": 65536, 00:20:49.541 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:49.541 "assigned_rate_limits": { 00:20:49.541 "rw_ios_per_sec": 0, 00:20:49.541 "rw_mbytes_per_sec": 0, 00:20:49.541 "r_mbytes_per_sec": 0, 00:20:49.541 "w_mbytes_per_sec": 0 00:20:49.541 }, 00:20:49.541 "claimed": true, 00:20:49.541 "claim_type": "exclusive_write", 00:20:49.541 "zoned": false, 00:20:49.541 "supported_io_types": { 00:20:49.541 "read": true, 00:20:49.541 "write": true, 00:20:49.541 "unmap": true, 00:20:49.541 "flush": true, 00:20:49.541 "reset": true, 00:20:49.541 "nvme_admin": false, 00:20:49.541 "nvme_io": false, 00:20:49.541 "nvme_io_md": false, 00:20:49.541 "write_zeroes": true, 00:20:49.541 "zcopy": true, 00:20:49.541 "get_zone_info": false, 00:20:49.541 "zone_management": false, 00:20:49.541 "zone_append": false, 00:20:49.541 "compare": false, 00:20:49.541 "compare_and_write": false, 00:20:49.541 "abort": true, 00:20:49.541 "seek_hole": false, 00:20:49.541 "seek_data": false, 00:20:49.541 "copy": true, 00:20:49.541 "nvme_iov_md": false 00:20:49.541 }, 00:20:49.541 "memory_domains": [ 00:20:49.541 { 00:20:49.541 "dma_device_id": "system", 00:20:49.541 "dma_device_type": 1 00:20:49.541 }, 00:20:49.541 { 00:20:49.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.541 "dma_device_type": 2 00:20:49.541 } 00:20:49.541 ], 00:20:49.541 "driver_specific": {} 00:20:49.541 } 00:20:49.541 ] 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.541 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:49.800 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.800 "name": "Existed_Raid", 
00:20:49.800 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:49.800 "strip_size_kb": 64, 00:20:49.800 "state": "configuring", 00:20:49.800 "raid_level": "raid0", 00:20:49.800 "superblock": true, 00:20:49.800 "num_base_bdevs": 4, 00:20:49.800 "num_base_bdevs_discovered": 2, 00:20:49.800 "num_base_bdevs_operational": 4, 00:20:49.800 "base_bdevs_list": [ 00:20:49.800 { 00:20:49.800 "name": "BaseBdev1", 00:20:49.800 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:49.800 "is_configured": true, 00:20:49.800 "data_offset": 2048, 00:20:49.800 "data_size": 63488 00:20:49.800 }, 00:20:49.800 { 00:20:49.800 "name": "BaseBdev2", 00:20:49.800 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:49.800 "is_configured": true, 00:20:49.800 "data_offset": 2048, 00:20:49.800 "data_size": 63488 00:20:49.800 }, 00:20:49.800 { 00:20:49.800 "name": "BaseBdev3", 00:20:49.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.800 "is_configured": false, 00:20:49.800 "data_offset": 0, 00:20:49.800 "data_size": 0 00:20:49.800 }, 00:20:49.801 { 00:20:49.801 "name": "BaseBdev4", 00:20:49.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.801 "is_configured": false, 00:20:49.801 "data_offset": 0, 00:20:49.801 "data_size": 0 00:20:49.801 } 00:20:49.801 ] 00:20:49.801 }' 00:20:49.801 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.801 06:37:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:50.365 06:37:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:50.623 [2024-07-25 06:37:04.113638] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:50.623 BaseBdev3 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:50.623 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:50.881 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:51.140 [ 00:20:51.140 { 00:20:51.140 "name": "BaseBdev3", 00:20:51.140 "aliases": [ 00:20:51.140 "d75bbaf8-e709-4379-9de0-1d37bfbab923" 00:20:51.140 ], 00:20:51.140 "product_name": "Malloc disk", 00:20:51.140 "block_size": 512, 00:20:51.140 "num_blocks": 65536, 00:20:51.140 "uuid": "d75bbaf8-e709-4379-9de0-1d37bfbab923", 00:20:51.140 "assigned_rate_limits": { 00:20:51.140 "rw_ios_per_sec": 0, 00:20:51.140 "rw_mbytes_per_sec": 0, 00:20:51.140 "r_mbytes_per_sec": 0, 00:20:51.140 "w_mbytes_per_sec": 0 00:20:51.140 }, 00:20:51.140 "claimed": true, 00:20:51.140 
"claim_type": "exclusive_write", 00:20:51.140 "zoned": false, 00:20:51.140 "supported_io_types": { 00:20:51.140 "read": true, 00:20:51.140 "write": true, 00:20:51.140 "unmap": true, 00:20:51.140 "flush": true, 00:20:51.140 "reset": true, 00:20:51.140 "nvme_admin": false, 00:20:51.140 "nvme_io": false, 00:20:51.140 "nvme_io_md": false, 00:20:51.140 "write_zeroes": true, 00:20:51.140 "zcopy": true, 00:20:51.140 "get_zone_info": false, 00:20:51.140 "zone_management": false, 00:20:51.140 "zone_append": false, 00:20:51.140 "compare": false, 00:20:51.140 "compare_and_write": false, 00:20:51.140 "abort": true, 00:20:51.140 "seek_hole": false, 00:20:51.140 "seek_data": false, 00:20:51.140 "copy": true, 00:20:51.140 "nvme_iov_md": false 00:20:51.140 }, 00:20:51.140 "memory_domains": [ 00:20:51.140 { 00:20:51.140 "dma_device_id": "system", 00:20:51.140 "dma_device_type": 1 00:20:51.140 }, 00:20:51.140 { 00:20:51.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.140 "dma_device_type": 2 00:20:51.140 } 00:20:51.140 ], 00:20:51.140 "driver_specific": {} 00:20:51.140 } 00:20:51.140 ] 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.140 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:51.398 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.398 "name": "Existed_Raid", 00:20:51.398 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:51.398 "strip_size_kb": 64, 00:20:51.398 "state": "configuring", 00:20:51.398 "raid_level": "raid0", 00:20:51.398 "superblock": true, 00:20:51.398 "num_base_bdevs": 4, 00:20:51.398 "num_base_bdevs_discovered": 3, 00:20:51.398 "num_base_bdevs_operational": 4, 00:20:51.398 "base_bdevs_list": [ 00:20:51.398 { 00:20:51.398 "name": "BaseBdev1", 00:20:51.398 "uuid": 
"def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:51.398 "is_configured": true, 00:20:51.398 "data_offset": 2048, 00:20:51.398 "data_size": 63488 00:20:51.398 }, 00:20:51.398 { 00:20:51.398 "name": "BaseBdev2", 00:20:51.398 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:51.398 "is_configured": true, 00:20:51.398 "data_offset": 2048, 00:20:51.398 "data_size": 63488 00:20:51.398 }, 00:20:51.398 { 00:20:51.398 "name": "BaseBdev3", 00:20:51.398 "uuid": "d75bbaf8-e709-4379-9de0-1d37bfbab923", 00:20:51.398 "is_configured": true, 00:20:51.398 "data_offset": 2048, 00:20:51.398 "data_size": 63488 00:20:51.398 }, 00:20:51.398 { 00:20:51.398 "name": "BaseBdev4", 00:20:51.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.399 "is_configured": false, 00:20:51.399 "data_offset": 0, 00:20:51.399 "data_size": 0 00:20:51.399 } 00:20:51.399 ] 00:20:51.399 }' 00:20:51.399 06:37:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.399 06:37:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:51.965 06:37:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:52.224 [2024-07-25 06:37:05.560649] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:52.224 [2024-07-25 06:37:05.560803] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf8a250 00:20:52.224 [2024-07-25 06:37:05.560816] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:52.224 [2024-07-25 06:37:05.560979] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd6030 00:20:52.224 [2024-07-25 06:37:05.561099] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf8a250 00:20:52.224 [2024-07-25 06:37:05.561108] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf8a250 00:20:52.224 [2024-07-25 06:37:05.561203] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:52.224 BaseBdev4 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:52.224 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.482 06:37:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:52.482 [ 00:20:52.482 { 00:20:52.482 "name": "BaseBdev4", 00:20:52.482 "aliases": [ 00:20:52.482 "6dc83562-d4cd-4766-8356-14177251ebfe" 00:20:52.482 ], 00:20:52.482 "product_name": "Malloc disk", 00:20:52.482 "block_size": 512, 
00:20:52.482 "num_blocks": 65536, 00:20:52.482 "uuid": "6dc83562-d4cd-4766-8356-14177251ebfe", 00:20:52.482 "assigned_rate_limits": { 00:20:52.482 "rw_ios_per_sec": 0, 00:20:52.482 "rw_mbytes_per_sec": 0, 00:20:52.482 "r_mbytes_per_sec": 0, 00:20:52.482 "w_mbytes_per_sec": 0 00:20:52.482 }, 00:20:52.482 "claimed": true, 00:20:52.482 "claim_type": "exclusive_write", 00:20:52.482 "zoned": false, 00:20:52.482 "supported_io_types": { 00:20:52.482 "read": true, 00:20:52.482 "write": true, 00:20:52.482 "unmap": true, 00:20:52.482 "flush": true, 00:20:52.482 "reset": true, 00:20:52.482 "nvme_admin": false, 00:20:52.482 "nvme_io": false, 00:20:52.482 "nvme_io_md": false, 00:20:52.482 "write_zeroes": true, 00:20:52.482 "zcopy": true, 00:20:52.482 "get_zone_info": false, 00:20:52.482 "zone_management": false, 00:20:52.482 "zone_append": false, 00:20:52.482 "compare": false, 00:20:52.483 "compare_and_write": false, 00:20:52.483 "abort": true, 00:20:52.483 "seek_hole": false, 00:20:52.483 "seek_data": false, 00:20:52.483 "copy": true, 00:20:52.483 "nvme_iov_md": false 00:20:52.483 }, 00:20:52.483 "memory_domains": [ 00:20:52.483 { 00:20:52.483 "dma_device_id": "system", 00:20:52.483 "dma_device_type": 1 00:20:52.483 }, 00:20:52.483 { 00:20:52.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.483 "dma_device_type": 2 00:20:52.483 } 00:20:52.483 ], 00:20:52.483 "driver_specific": {} 00:20:52.483 } 00:20:52.483 ] 00:20:52.483 06:37:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:52.483 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:52.483 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.741 "name": "Existed_Raid", 00:20:52.741 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:52.741 "strip_size_kb": 64, 00:20:52.741 "state": "online", 00:20:52.741 
"raid_level": "raid0", 00:20:52.741 "superblock": true, 00:20:52.741 "num_base_bdevs": 4, 00:20:52.741 "num_base_bdevs_discovered": 4, 00:20:52.741 "num_base_bdevs_operational": 4, 00:20:52.741 "base_bdevs_list": [ 00:20:52.741 { 00:20:52.741 "name": "BaseBdev1", 00:20:52.741 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:52.741 "is_configured": true, 00:20:52.741 "data_offset": 2048, 00:20:52.741 "data_size": 63488 00:20:52.741 }, 00:20:52.741 { 00:20:52.741 "name": "BaseBdev2", 00:20:52.741 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:52.741 "is_configured": true, 00:20:52.741 "data_offset": 2048, 00:20:52.741 "data_size": 63488 00:20:52.741 }, 00:20:52.741 { 00:20:52.741 "name": "BaseBdev3", 00:20:52.741 "uuid": "d75bbaf8-e709-4379-9de0-1d37bfbab923", 00:20:52.741 "is_configured": true, 00:20:52.741 "data_offset": 2048, 00:20:52.741 "data_size": 63488 00:20:52.741 }, 00:20:52.741 { 00:20:52.741 "name": "BaseBdev4", 00:20:52.741 "uuid": "6dc83562-d4cd-4766-8356-14177251ebfe", 00:20:52.741 "is_configured": true, 00:20:52.741 "data_offset": 2048, 00:20:52.741 "data_size": 63488 00:20:52.741 } 00:20:52.741 ] 00:20:52.741 }' 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.741 06:37:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:53.308 06:37:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:53.567 [2024-07-25 06:37:07.044915] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:53.567 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:53.567 "name": "Existed_Raid", 00:20:53.567 "aliases": [ 00:20:53.567 "253c5c4b-d952-4e14-bb52-ae5edebe993a" 00:20:53.567 ], 00:20:53.567 "product_name": "Raid Volume", 00:20:53.567 "block_size": 512, 00:20:53.567 "num_blocks": 253952, 00:20:53.567 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:53.567 "assigned_rate_limits": { 00:20:53.567 "rw_ios_per_sec": 0, 00:20:53.567 "rw_mbytes_per_sec": 0, 00:20:53.567 "r_mbytes_per_sec": 0, 00:20:53.567 "w_mbytes_per_sec": 0 00:20:53.567 }, 00:20:53.567 "claimed": false, 00:20:53.567 "zoned": false, 00:20:53.567 "supported_io_types": { 00:20:53.567 "read": true, 00:20:53.567 "write": true, 00:20:53.567 "unmap": true, 00:20:53.567 "flush": true, 00:20:53.567 "reset": true, 00:20:53.567 "nvme_admin": false, 00:20:53.567 "nvme_io": false, 00:20:53.567 "nvme_io_md": false, 00:20:53.567 "write_zeroes": true, 00:20:53.567 "zcopy": false, 00:20:53.567 "get_zone_info": false, 00:20:53.567 
"zone_management": false, 00:20:53.567 "zone_append": false, 00:20:53.567 "compare": false, 00:20:53.567 "compare_and_write": false, 00:20:53.567 "abort": false, 00:20:53.567 "seek_hole": false, 00:20:53.567 "seek_data": false, 00:20:53.567 "copy": false, 00:20:53.567 "nvme_iov_md": false 00:20:53.567 }, 00:20:53.567 "memory_domains": [ 00:20:53.567 { 00:20:53.567 "dma_device_id": "system", 00:20:53.567 "dma_device_type": 1 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.567 "dma_device_type": 2 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "system", 00:20:53.567 "dma_device_type": 1 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.567 "dma_device_type": 2 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "system", 00:20:53.567 "dma_device_type": 1 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.567 "dma_device_type": 2 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "system", 00:20:53.567 "dma_device_type": 1 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.567 "dma_device_type": 2 00:20:53.567 } 00:20:53.567 ], 00:20:53.567 "driver_specific": { 00:20:53.567 "raid": { 00:20:53.567 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:53.567 "strip_size_kb": 64, 00:20:53.567 "state": "online", 00:20:53.567 "raid_level": "raid0", 00:20:53.567 "superblock": true, 00:20:53.567 "num_base_bdevs": 4, 00:20:53.567 "num_base_bdevs_discovered": 4, 00:20:53.567 "num_base_bdevs_operational": 4, 00:20:53.567 "base_bdevs_list": [ 00:20:53.567 { 00:20:53.567 "name": "BaseBdev1", 00:20:53.567 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:53.567 "is_configured": true, 00:20:53.567 "data_offset": 2048, 00:20:53.567 "data_size": 63488 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "name": "BaseBdev2", 00:20:53.567 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:53.567 "is_configured": true, 00:20:53.567 "data_offset": 2048, 00:20:53.567 "data_size": 63488 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "name": "BaseBdev3", 00:20:53.567 "uuid": "d75bbaf8-e709-4379-9de0-1d37bfbab923", 00:20:53.567 "is_configured": true, 00:20:53.567 "data_offset": 2048, 00:20:53.567 "data_size": 63488 00:20:53.567 }, 00:20:53.567 { 00:20:53.567 "name": "BaseBdev4", 00:20:53.567 "uuid": "6dc83562-d4cd-4766-8356-14177251ebfe", 00:20:53.567 "is_configured": true, 00:20:53.567 "data_offset": 2048, 00:20:53.567 "data_size": 63488 00:20:53.567 } 00:20:53.567 ] 00:20:53.567 } 00:20:53.567 } 00:20:53.567 }' 00:20:53.567 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:53.826 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:53.826 BaseBdev2 00:20:53.826 BaseBdev3 00:20:53.826 BaseBdev4' 00:20:53.826 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:53.826 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:53.826 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:53.826 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:53.826 
"name": "BaseBdev1", 00:20:53.826 "aliases": [ 00:20:53.826 "def98640-ccf2-49f5-82b2-be57dcf52a57" 00:20:53.826 ], 00:20:53.826 "product_name": "Malloc disk", 00:20:53.826 "block_size": 512, 00:20:53.826 "num_blocks": 65536, 00:20:53.826 "uuid": "def98640-ccf2-49f5-82b2-be57dcf52a57", 00:20:53.826 "assigned_rate_limits": { 00:20:53.826 "rw_ios_per_sec": 0, 00:20:53.826 "rw_mbytes_per_sec": 0, 00:20:53.826 "r_mbytes_per_sec": 0, 00:20:53.826 "w_mbytes_per_sec": 0 00:20:53.826 }, 00:20:53.826 "claimed": true, 00:20:53.826 "claim_type": "exclusive_write", 00:20:53.826 "zoned": false, 00:20:53.826 "supported_io_types": { 00:20:53.826 "read": true, 00:20:53.826 "write": true, 00:20:53.826 "unmap": true, 00:20:53.826 "flush": true, 00:20:53.826 "reset": true, 00:20:53.826 "nvme_admin": false, 00:20:53.826 "nvme_io": false, 00:20:53.826 "nvme_io_md": false, 00:20:53.826 "write_zeroes": true, 00:20:53.826 "zcopy": true, 00:20:53.826 "get_zone_info": false, 00:20:53.826 "zone_management": false, 00:20:53.826 "zone_append": false, 00:20:53.826 "compare": false, 00:20:53.826 "compare_and_write": false, 00:20:53.826 "abort": true, 00:20:53.826 "seek_hole": false, 00:20:53.826 "seek_data": false, 00:20:53.826 "copy": true, 00:20:53.826 "nvme_iov_md": false 00:20:53.826 }, 00:20:53.826 "memory_domains": [ 00:20:53.826 { 00:20:53.826 "dma_device_id": "system", 00:20:53.826 "dma_device_type": 1 00:20:53.826 }, 00:20:53.826 { 00:20:53.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.826 "dma_device_type": 2 00:20:53.826 } 00:20:53.826 ], 00:20:53.826 "driver_specific": {} 00:20:53.826 }' 00:20:53.826 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.085 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.343 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.344 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.344 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.344 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:54.344 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.602 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.602 "name": "BaseBdev2", 00:20:54.602 "aliases": [ 00:20:54.602 "a9589d84-914e-438f-99fb-0a129ca915c7" 00:20:54.602 ], 00:20:54.602 
"product_name": "Malloc disk", 00:20:54.602 "block_size": 512, 00:20:54.602 "num_blocks": 65536, 00:20:54.602 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:54.602 "assigned_rate_limits": { 00:20:54.602 "rw_ios_per_sec": 0, 00:20:54.602 "rw_mbytes_per_sec": 0, 00:20:54.602 "r_mbytes_per_sec": 0, 00:20:54.602 "w_mbytes_per_sec": 0 00:20:54.602 }, 00:20:54.602 "claimed": true, 00:20:54.602 "claim_type": "exclusive_write", 00:20:54.602 "zoned": false, 00:20:54.602 "supported_io_types": { 00:20:54.602 "read": true, 00:20:54.602 "write": true, 00:20:54.602 "unmap": true, 00:20:54.602 "flush": true, 00:20:54.602 "reset": true, 00:20:54.602 "nvme_admin": false, 00:20:54.602 "nvme_io": false, 00:20:54.602 "nvme_io_md": false, 00:20:54.602 "write_zeroes": true, 00:20:54.602 "zcopy": true, 00:20:54.602 "get_zone_info": false, 00:20:54.602 "zone_management": false, 00:20:54.602 "zone_append": false, 00:20:54.602 "compare": false, 00:20:54.602 "compare_and_write": false, 00:20:54.602 "abort": true, 00:20:54.602 "seek_hole": false, 00:20:54.602 "seek_data": false, 00:20:54.602 "copy": true, 00:20:54.602 "nvme_iov_md": false 00:20:54.602 }, 00:20:54.602 "memory_domains": [ 00:20:54.602 { 00:20:54.602 "dma_device_id": "system", 00:20:54.602 "dma_device_type": 1 00:20:54.602 }, 00:20:54.602 { 00:20:54.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.602 "dma_device_type": 2 00:20:54.602 } 00:20:54.602 ], 00:20:54.602 "driver_specific": {} 00:20:54.602 }' 00:20:54.602 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.602 06:37:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.602 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.602 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.602 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.602 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.602 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.602 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:54.860 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.119 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.119 "name": "BaseBdev3", 00:20:55.119 "aliases": [ 00:20:55.119 "d75bbaf8-e709-4379-9de0-1d37bfbab923" 00:20:55.119 ], 00:20:55.119 "product_name": "Malloc disk", 00:20:55.119 "block_size": 512, 00:20:55.119 "num_blocks": 65536, 00:20:55.119 "uuid": 
"d75bbaf8-e709-4379-9de0-1d37bfbab923", 00:20:55.119 "assigned_rate_limits": { 00:20:55.119 "rw_ios_per_sec": 0, 00:20:55.119 "rw_mbytes_per_sec": 0, 00:20:55.119 "r_mbytes_per_sec": 0, 00:20:55.119 "w_mbytes_per_sec": 0 00:20:55.119 }, 00:20:55.119 "claimed": true, 00:20:55.119 "claim_type": "exclusive_write", 00:20:55.119 "zoned": false, 00:20:55.119 "supported_io_types": { 00:20:55.119 "read": true, 00:20:55.119 "write": true, 00:20:55.119 "unmap": true, 00:20:55.119 "flush": true, 00:20:55.119 "reset": true, 00:20:55.119 "nvme_admin": false, 00:20:55.119 "nvme_io": false, 00:20:55.119 "nvme_io_md": false, 00:20:55.119 "write_zeroes": true, 00:20:55.119 "zcopy": true, 00:20:55.119 "get_zone_info": false, 00:20:55.119 "zone_management": false, 00:20:55.119 "zone_append": false, 00:20:55.119 "compare": false, 00:20:55.119 "compare_and_write": false, 00:20:55.119 "abort": true, 00:20:55.119 "seek_hole": false, 00:20:55.119 "seek_data": false, 00:20:55.119 "copy": true, 00:20:55.119 "nvme_iov_md": false 00:20:55.119 }, 00:20:55.119 "memory_domains": [ 00:20:55.119 { 00:20:55.119 "dma_device_id": "system", 00:20:55.119 "dma_device_type": 1 00:20:55.119 }, 00:20:55.119 { 00:20:55.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.119 "dma_device_type": 2 00:20:55.119 } 00:20:55.119 ], 00:20:55.119 "driver_specific": {} 00:20:55.119 }' 00:20:55.119 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.119 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.119 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.119 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.119 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.377 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:55.378 06:37:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.636 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.636 "name": "BaseBdev4", 00:20:55.636 "aliases": [ 00:20:55.636 "6dc83562-d4cd-4766-8356-14177251ebfe" 00:20:55.636 ], 00:20:55.636 "product_name": "Malloc disk", 00:20:55.636 "block_size": 512, 00:20:55.636 "num_blocks": 65536, 00:20:55.636 "uuid": "6dc83562-d4cd-4766-8356-14177251ebfe", 00:20:55.636 "assigned_rate_limits": { 00:20:55.636 "rw_ios_per_sec": 0, 00:20:55.636 
"rw_mbytes_per_sec": 0, 00:20:55.636 "r_mbytes_per_sec": 0, 00:20:55.636 "w_mbytes_per_sec": 0 00:20:55.636 }, 00:20:55.636 "claimed": true, 00:20:55.636 "claim_type": "exclusive_write", 00:20:55.636 "zoned": false, 00:20:55.636 "supported_io_types": { 00:20:55.636 "read": true, 00:20:55.636 "write": true, 00:20:55.636 "unmap": true, 00:20:55.636 "flush": true, 00:20:55.636 "reset": true, 00:20:55.636 "nvme_admin": false, 00:20:55.636 "nvme_io": false, 00:20:55.636 "nvme_io_md": false, 00:20:55.636 "write_zeroes": true, 00:20:55.636 "zcopy": true, 00:20:55.636 "get_zone_info": false, 00:20:55.636 "zone_management": false, 00:20:55.636 "zone_append": false, 00:20:55.636 "compare": false, 00:20:55.636 "compare_and_write": false, 00:20:55.636 "abort": true, 00:20:55.636 "seek_hole": false, 00:20:55.636 "seek_data": false, 00:20:55.636 "copy": true, 00:20:55.636 "nvme_iov_md": false 00:20:55.636 }, 00:20:55.636 "memory_domains": [ 00:20:55.636 { 00:20:55.636 "dma_device_id": "system", 00:20:55.636 "dma_device_type": 1 00:20:55.636 }, 00:20:55.636 { 00:20:55.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.636 "dma_device_type": 2 00:20:55.636 } 00:20:55.636 ], 00:20:55.636 "driver_specific": {} 00:20:55.636 }' 00:20:55.636 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.636 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.637 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.637 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.895 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:56.154 [2024-07-25 06:37:09.631455] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:56.154 [2024-07-25 06:37:09.631481] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:56.154 [2024-07-25 06:37:09.631526] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.154 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:56.421 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.421 "name": "Existed_Raid", 00:20:56.421 "uuid": "253c5c4b-d952-4e14-bb52-ae5edebe993a", 00:20:56.421 "strip_size_kb": 64, 00:20:56.421 "state": "offline", 00:20:56.421 "raid_level": "raid0", 00:20:56.421 "superblock": true, 00:20:56.421 "num_base_bdevs": 4, 00:20:56.421 "num_base_bdevs_discovered": 3, 00:20:56.421 "num_base_bdevs_operational": 3, 00:20:56.421 "base_bdevs_list": [ 00:20:56.421 { 00:20:56.421 "name": null, 00:20:56.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.421 "is_configured": false, 00:20:56.421 "data_offset": 2048, 00:20:56.421 "data_size": 63488 00:20:56.421 }, 00:20:56.421 { 00:20:56.421 "name": "BaseBdev2", 00:20:56.421 "uuid": "a9589d84-914e-438f-99fb-0a129ca915c7", 00:20:56.421 "is_configured": true, 00:20:56.421 "data_offset": 2048, 00:20:56.422 "data_size": 63488 00:20:56.422 }, 00:20:56.422 { 00:20:56.422 "name": "BaseBdev3", 00:20:56.422 "uuid": "d75bbaf8-e709-4379-9de0-1d37bfbab923", 00:20:56.422 "is_configured": true, 00:20:56.422 "data_offset": 2048, 00:20:56.422 "data_size": 63488 00:20:56.422 }, 00:20:56.422 { 00:20:56.422 "name": "BaseBdev4", 00:20:56.422 "uuid": "6dc83562-d4cd-4766-8356-14177251ebfe", 00:20:56.422 "is_configured": true, 00:20:56.422 "data_offset": 2048, 00:20:56.422 "data_size": 63488 00:20:56.422 } 00:20:56.422 ] 00:20:56.422 }' 00:20:56.422 06:37:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.422 06:37:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.001 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:57.001 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:57.001 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.001 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:57.259 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:57.259 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:57.259 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:57.518 [2024-07-25 06:37:10.895823] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:57.518 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:57.518 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:57.518 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.518 06:37:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:57.777 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:57.777 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:57.777 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:58.035 [2024-07-25 06:37:11.363126] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:58.035 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:58.035 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:58.035 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.035 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:58.294 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:58.294 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:58.294 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:58.294 [2024-07-25 06:37:11.834517] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:58.294 [2024-07-25 06:37:11.834555] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf8a250 name Existed_Raid, state offline 00:20:58.552 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:58.552 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:58.552 06:37:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.552 06:37:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:58.552 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:58.552 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:58.553 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:58.553 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:58.553 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:58.553 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:58.811 BaseBdev2 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:58.811 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.069 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:59.327 [ 00:20:59.327 { 00:20:59.327 "name": "BaseBdev2", 00:20:59.327 "aliases": [ 00:20:59.327 "14d1144c-7903-40a2-b623-1195137c8240" 00:20:59.327 ], 00:20:59.327 "product_name": "Malloc disk", 00:20:59.327 "block_size": 512, 00:20:59.327 "num_blocks": 65536, 00:20:59.327 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:20:59.327 "assigned_rate_limits": { 00:20:59.327 "rw_ios_per_sec": 0, 00:20:59.327 "rw_mbytes_per_sec": 0, 00:20:59.327 "r_mbytes_per_sec": 0, 00:20:59.327 "w_mbytes_per_sec": 0 00:20:59.327 }, 00:20:59.327 "claimed": false, 00:20:59.327 "zoned": false, 00:20:59.327 "supported_io_types": { 00:20:59.327 "read": true, 00:20:59.327 "write": true, 00:20:59.327 "unmap": true, 00:20:59.327 "flush": true, 00:20:59.327 "reset": true, 00:20:59.327 "nvme_admin": false, 00:20:59.327 "nvme_io": false, 00:20:59.327 "nvme_io_md": false, 00:20:59.327 "write_zeroes": true, 00:20:59.327 "zcopy": true, 00:20:59.327 "get_zone_info": false, 00:20:59.327 "zone_management": false, 00:20:59.327 "zone_append": false, 00:20:59.327 "compare": false, 00:20:59.327 "compare_and_write": false, 00:20:59.327 "abort": true, 00:20:59.327 "seek_hole": false, 00:20:59.327 "seek_data": false, 00:20:59.327 "copy": true, 00:20:59.327 "nvme_iov_md": false 00:20:59.327 }, 00:20:59.327 "memory_domains": [ 00:20:59.327 { 00:20:59.327 "dma_device_id": "system", 00:20:59.327 "dma_device_type": 1 00:20:59.327 }, 00:20:59.327 { 00:20:59.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.327 "dma_device_type": 2 00:20:59.327 } 00:20:59.328 ], 00:20:59.328 
"driver_specific": {} 00:20:59.328 } 00:20:59.328 ] 00:20:59.328 06:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:59.328 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:59.328 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:59.328 06:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:59.586 BaseBdev3 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:59.586 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.844 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:00.103 [ 00:21:00.103 { 00:21:00.103 "name": "BaseBdev3", 00:21:00.103 "aliases": [ 00:21:00.103 "7d8108ea-deb7-44aa-b33d-011b7c956944" 00:21:00.103 ], 00:21:00.103 "product_name": "Malloc disk", 00:21:00.103 "block_size": 512, 00:21:00.103 "num_blocks": 65536, 00:21:00.103 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:00.103 "assigned_rate_limits": { 00:21:00.103 "rw_ios_per_sec": 0, 00:21:00.103 "rw_mbytes_per_sec": 0, 00:21:00.103 "r_mbytes_per_sec": 0, 00:21:00.103 "w_mbytes_per_sec": 0 00:21:00.103 }, 00:21:00.103 "claimed": false, 00:21:00.103 "zoned": false, 00:21:00.103 "supported_io_types": { 00:21:00.103 "read": true, 00:21:00.103 "write": true, 00:21:00.103 "unmap": true, 00:21:00.103 "flush": true, 00:21:00.103 "reset": true, 00:21:00.103 "nvme_admin": false, 00:21:00.103 "nvme_io": false, 00:21:00.103 "nvme_io_md": false, 00:21:00.103 "write_zeroes": true, 00:21:00.103 "zcopy": true, 00:21:00.103 "get_zone_info": false, 00:21:00.103 "zone_management": false, 00:21:00.103 "zone_append": false, 00:21:00.103 "compare": false, 00:21:00.103 "compare_and_write": false, 00:21:00.103 "abort": true, 00:21:00.103 "seek_hole": false, 00:21:00.103 "seek_data": false, 00:21:00.103 "copy": true, 00:21:00.103 "nvme_iov_md": false 00:21:00.103 }, 00:21:00.103 "memory_domains": [ 00:21:00.103 { 00:21:00.103 "dma_device_id": "system", 00:21:00.103 "dma_device_type": 1 00:21:00.103 }, 00:21:00.103 { 00:21:00.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.103 "dma_device_type": 2 00:21:00.103 } 00:21:00.103 ], 00:21:00.103 "driver_specific": {} 00:21:00.103 } 00:21:00.103 ] 00:21:00.103 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:00.103 06:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( 
i++ )) 00:21:00.103 06:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:00.103 06:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:00.361 BaseBdev4 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:00.361 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:00.619 06:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:00.619 [ 00:21:00.619 { 00:21:00.619 "name": "BaseBdev4", 00:21:00.619 "aliases": [ 00:21:00.619 "f3ce824a-a598-45af-8a2b-be40ee184ebe" 00:21:00.619 ], 00:21:00.619 "product_name": "Malloc disk", 00:21:00.619 "block_size": 512, 00:21:00.619 "num_blocks": 65536, 00:21:00.619 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:00.619 "assigned_rate_limits": { 00:21:00.619 "rw_ios_per_sec": 0, 00:21:00.619 "rw_mbytes_per_sec": 0, 00:21:00.619 "r_mbytes_per_sec": 0, 00:21:00.619 "w_mbytes_per_sec": 0 00:21:00.619 }, 00:21:00.619 "claimed": false, 00:21:00.619 "zoned": false, 00:21:00.619 "supported_io_types": { 00:21:00.619 "read": true, 00:21:00.619 "write": true, 00:21:00.619 "unmap": true, 00:21:00.619 "flush": true, 00:21:00.619 "reset": true, 00:21:00.619 "nvme_admin": false, 00:21:00.619 "nvme_io": false, 00:21:00.619 "nvme_io_md": false, 00:21:00.619 "write_zeroes": true, 00:21:00.619 "zcopy": true, 00:21:00.619 "get_zone_info": false, 00:21:00.619 "zone_management": false, 00:21:00.619 "zone_append": false, 00:21:00.619 "compare": false, 00:21:00.619 "compare_and_write": false, 00:21:00.619 "abort": true, 00:21:00.619 "seek_hole": false, 00:21:00.619 "seek_data": false, 00:21:00.619 "copy": true, 00:21:00.619 "nvme_iov_md": false 00:21:00.619 }, 00:21:00.619 "memory_domains": [ 00:21:00.619 { 00:21:00.619 "dma_device_id": "system", 00:21:00.619 "dma_device_type": 1 00:21:00.619 }, 00:21:00.619 { 00:21:00.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.620 "dma_device_type": 2 00:21:00.620 } 00:21:00.620 ], 00:21:00.620 "driver_specific": {} 00:21:00.620 } 00:21:00.620 ] 00:21:00.620 06:37:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:00.620 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:00.620 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:00.620 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:00.878 [2024-07-25 06:37:14.368278] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:00.878 [2024-07-25 06:37:14.368316] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:00.878 [2024-07-25 06:37:14.368335] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:00.878 [2024-07-25 06:37:14.369539] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:00.878 [2024-07-25 06:37:14.369582] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.878 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:01.137 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.137 "name": "Existed_Raid", 00:21:01.137 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:01.137 "strip_size_kb": 64, 00:21:01.137 "state": "configuring", 00:21:01.137 "raid_level": "raid0", 00:21:01.137 "superblock": true, 00:21:01.137 "num_base_bdevs": 4, 00:21:01.137 "num_base_bdevs_discovered": 3, 00:21:01.137 "num_base_bdevs_operational": 4, 00:21:01.137 "base_bdevs_list": [ 00:21:01.137 { 00:21:01.137 "name": "BaseBdev1", 00:21:01.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.137 "is_configured": false, 00:21:01.137 "data_offset": 0, 00:21:01.137 "data_size": 0 00:21:01.137 }, 00:21:01.137 { 00:21:01.137 "name": "BaseBdev2", 00:21:01.137 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:01.137 "is_configured": true, 00:21:01.137 "data_offset": 2048, 00:21:01.137 "data_size": 63488 00:21:01.137 }, 00:21:01.137 { 00:21:01.137 "name": "BaseBdev3", 00:21:01.137 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:01.137 "is_configured": true, 00:21:01.137 "data_offset": 2048, 00:21:01.137 "data_size": 63488 00:21:01.137 }, 00:21:01.137 { 
00:21:01.137 "name": "BaseBdev4", 00:21:01.137 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:01.137 "is_configured": true, 00:21:01.137 "data_offset": 2048, 00:21:01.137 "data_size": 63488 00:21:01.137 } 00:21:01.137 ] 00:21:01.137 }' 00:21:01.137 06:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.137 06:37:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.703 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:01.962 [2024-07-25 06:37:15.390930] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.962 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.221 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.221 "name": "Existed_Raid", 00:21:02.221 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:02.221 "strip_size_kb": 64, 00:21:02.221 "state": "configuring", 00:21:02.221 "raid_level": "raid0", 00:21:02.221 "superblock": true, 00:21:02.221 "num_base_bdevs": 4, 00:21:02.221 "num_base_bdevs_discovered": 2, 00:21:02.221 "num_base_bdevs_operational": 4, 00:21:02.221 "base_bdevs_list": [ 00:21:02.221 { 00:21:02.221 "name": "BaseBdev1", 00:21:02.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.221 "is_configured": false, 00:21:02.221 "data_offset": 0, 00:21:02.221 "data_size": 0 00:21:02.221 }, 00:21:02.221 { 00:21:02.221 "name": null, 00:21:02.221 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:02.221 "is_configured": false, 00:21:02.221 "data_offset": 2048, 00:21:02.221 "data_size": 63488 00:21:02.221 }, 00:21:02.221 { 00:21:02.221 "name": "BaseBdev3", 00:21:02.221 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:02.221 "is_configured": true, 00:21:02.221 "data_offset": 2048, 00:21:02.221 "data_size": 63488 00:21:02.221 }, 00:21:02.221 { 00:21:02.221 "name": "BaseBdev4", 00:21:02.221 "uuid": 
"f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:02.221 "is_configured": true, 00:21:02.221 "data_offset": 2048, 00:21:02.221 "data_size": 63488 00:21:02.221 } 00:21:02.221 ] 00:21:02.221 }' 00:21:02.221 06:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.221 06:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:02.787 06:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.787 06:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:03.045 06:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:03.045 06:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:03.303 [2024-07-25 06:37:16.653390] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:03.303 BaseBdev1 00:21:03.303 06:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:03.304 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:03.304 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:03.304 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:03.304 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:03.304 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:03.304 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:03.562 06:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:03.562 [ 00:21:03.562 { 00:21:03.562 "name": "BaseBdev1", 00:21:03.562 "aliases": [ 00:21:03.562 "d08cafa3-cc36-4ef9-8cad-35af3160815d" 00:21:03.562 ], 00:21:03.562 "product_name": "Malloc disk", 00:21:03.562 "block_size": 512, 00:21:03.562 "num_blocks": 65536, 00:21:03.562 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:03.562 "assigned_rate_limits": { 00:21:03.562 "rw_ios_per_sec": 0, 00:21:03.562 "rw_mbytes_per_sec": 0, 00:21:03.562 "r_mbytes_per_sec": 0, 00:21:03.562 "w_mbytes_per_sec": 0 00:21:03.562 }, 00:21:03.562 "claimed": true, 00:21:03.562 "claim_type": "exclusive_write", 00:21:03.562 "zoned": false, 00:21:03.562 "supported_io_types": { 00:21:03.562 "read": true, 00:21:03.562 "write": true, 00:21:03.562 "unmap": true, 00:21:03.562 "flush": true, 00:21:03.562 "reset": true, 00:21:03.562 "nvme_admin": false, 00:21:03.562 "nvme_io": false, 00:21:03.562 "nvme_io_md": false, 00:21:03.562 "write_zeroes": true, 00:21:03.562 "zcopy": true, 00:21:03.562 "get_zone_info": false, 00:21:03.562 "zone_management": false, 00:21:03.562 "zone_append": false, 00:21:03.562 "compare": false, 00:21:03.562 "compare_and_write": false, 00:21:03.562 "abort": true, 00:21:03.562 "seek_hole": false, 
00:21:03.562 "seek_data": false, 00:21:03.562 "copy": true, 00:21:03.562 "nvme_iov_md": false 00:21:03.562 }, 00:21:03.562 "memory_domains": [ 00:21:03.562 { 00:21:03.562 "dma_device_id": "system", 00:21:03.562 "dma_device_type": 1 00:21:03.562 }, 00:21:03.562 { 00:21:03.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.562 "dma_device_type": 2 00:21:03.562 } 00:21:03.562 ], 00:21:03.562 "driver_specific": {} 00:21:03.562 } 00:21:03.562 ] 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.562 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.820 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.820 "name": "Existed_Raid", 00:21:03.820 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:03.820 "strip_size_kb": 64, 00:21:03.820 "state": "configuring", 00:21:03.820 "raid_level": "raid0", 00:21:03.820 "superblock": true, 00:21:03.820 "num_base_bdevs": 4, 00:21:03.820 "num_base_bdevs_discovered": 3, 00:21:03.820 "num_base_bdevs_operational": 4, 00:21:03.820 "base_bdevs_list": [ 00:21:03.820 { 00:21:03.820 "name": "BaseBdev1", 00:21:03.820 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:03.820 "is_configured": true, 00:21:03.820 "data_offset": 2048, 00:21:03.820 "data_size": 63488 00:21:03.820 }, 00:21:03.820 { 00:21:03.820 "name": null, 00:21:03.820 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:03.820 "is_configured": false, 00:21:03.820 "data_offset": 2048, 00:21:03.820 "data_size": 63488 00:21:03.820 }, 00:21:03.820 { 00:21:03.820 "name": "BaseBdev3", 00:21:03.820 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:03.820 "is_configured": true, 00:21:03.820 "data_offset": 2048, 00:21:03.820 "data_size": 63488 00:21:03.820 }, 00:21:03.820 { 00:21:03.820 "name": "BaseBdev4", 00:21:03.820 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:03.820 "is_configured": true, 00:21:03.820 "data_offset": 2048, 00:21:03.820 "data_size": 63488 00:21:03.820 } 00:21:03.820 ] 00:21:03.820 }' 00:21:03.820 06:37:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.820 06:37:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:04.386 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:04.386 06:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.644 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:04.645 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:04.903 [2024-07-25 06:37:18.233661] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.903 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:05.161 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.161 "name": "Existed_Raid", 00:21:05.161 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:05.161 "strip_size_kb": 64, 00:21:05.161 "state": "configuring", 00:21:05.161 "raid_level": "raid0", 00:21:05.161 "superblock": true, 00:21:05.161 "num_base_bdevs": 4, 00:21:05.161 "num_base_bdevs_discovered": 2, 00:21:05.161 "num_base_bdevs_operational": 4, 00:21:05.161 "base_bdevs_list": [ 00:21:05.161 { 00:21:05.161 "name": "BaseBdev1", 00:21:05.161 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:05.161 "is_configured": true, 00:21:05.161 "data_offset": 2048, 00:21:05.161 "data_size": 63488 00:21:05.161 }, 00:21:05.161 { 00:21:05.161 "name": null, 00:21:05.161 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:05.161 "is_configured": false, 00:21:05.161 "data_offset": 2048, 00:21:05.161 "data_size": 63488 00:21:05.161 }, 00:21:05.161 { 00:21:05.161 "name": null, 00:21:05.161 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 
00:21:05.161 "is_configured": false, 00:21:05.161 "data_offset": 2048, 00:21:05.161 "data_size": 63488 00:21:05.161 }, 00:21:05.161 { 00:21:05.161 "name": "BaseBdev4", 00:21:05.161 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:05.161 "is_configured": true, 00:21:05.161 "data_offset": 2048, 00:21:05.161 "data_size": 63488 00:21:05.161 } 00:21:05.161 ] 00:21:05.161 }' 00:21:05.161 06:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.161 06:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:05.726 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:05.726 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.726 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:05.726 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:05.985 [2024-07-25 06:37:19.472942] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.985 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.244 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.244 "name": "Existed_Raid", 00:21:06.244 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:06.244 "strip_size_kb": 64, 00:21:06.244 "state": "configuring", 00:21:06.244 "raid_level": "raid0", 00:21:06.244 "superblock": true, 00:21:06.244 "num_base_bdevs": 4, 00:21:06.244 "num_base_bdevs_discovered": 3, 00:21:06.244 "num_base_bdevs_operational": 4, 00:21:06.244 "base_bdevs_list": [ 00:21:06.244 { 00:21:06.244 "name": "BaseBdev1", 00:21:06.244 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:06.244 
"is_configured": true, 00:21:06.244 "data_offset": 2048, 00:21:06.244 "data_size": 63488 00:21:06.244 }, 00:21:06.244 { 00:21:06.244 "name": null, 00:21:06.244 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:06.244 "is_configured": false, 00:21:06.244 "data_offset": 2048, 00:21:06.244 "data_size": 63488 00:21:06.244 }, 00:21:06.244 { 00:21:06.244 "name": "BaseBdev3", 00:21:06.244 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:06.244 "is_configured": true, 00:21:06.244 "data_offset": 2048, 00:21:06.244 "data_size": 63488 00:21:06.244 }, 00:21:06.244 { 00:21:06.244 "name": "BaseBdev4", 00:21:06.244 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:06.244 "is_configured": true, 00:21:06.244 "data_offset": 2048, 00:21:06.244 "data_size": 63488 00:21:06.244 } 00:21:06.244 ] 00:21:06.244 }' 00:21:06.244 06:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.244 06:37:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:06.811 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.811 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:07.069 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:07.069 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:07.327 [2024-07-25 06:37:20.696174] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.327 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.328 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.328 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.328 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.328 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:07.586 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.586 "name": "Existed_Raid", 00:21:07.586 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:07.586 "strip_size_kb": 
64, 00:21:07.586 "state": "configuring", 00:21:07.586 "raid_level": "raid0", 00:21:07.586 "superblock": true, 00:21:07.586 "num_base_bdevs": 4, 00:21:07.586 "num_base_bdevs_discovered": 2, 00:21:07.586 "num_base_bdevs_operational": 4, 00:21:07.586 "base_bdevs_list": [ 00:21:07.586 { 00:21:07.586 "name": null, 00:21:07.586 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:07.586 "is_configured": false, 00:21:07.586 "data_offset": 2048, 00:21:07.586 "data_size": 63488 00:21:07.586 }, 00:21:07.586 { 00:21:07.586 "name": null, 00:21:07.586 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:07.586 "is_configured": false, 00:21:07.586 "data_offset": 2048, 00:21:07.586 "data_size": 63488 00:21:07.586 }, 00:21:07.586 { 00:21:07.586 "name": "BaseBdev3", 00:21:07.586 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:07.586 "is_configured": true, 00:21:07.586 "data_offset": 2048, 00:21:07.586 "data_size": 63488 00:21:07.586 }, 00:21:07.586 { 00:21:07.586 "name": "BaseBdev4", 00:21:07.586 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:07.586 "is_configured": true, 00:21:07.586 "data_offset": 2048, 00:21:07.586 "data_size": 63488 00:21:07.586 } 00:21:07.586 ] 00:21:07.586 }' 00:21:07.586 06:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.586 06:37:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:08.153 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.153 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:08.412 [2024-07-25 06:37:21.937381] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.412 06:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:08.670 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.670 "name": "Existed_Raid", 00:21:08.670 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:08.670 "strip_size_kb": 64, 00:21:08.670 "state": "configuring", 00:21:08.670 "raid_level": "raid0", 00:21:08.670 "superblock": true, 00:21:08.670 "num_base_bdevs": 4, 00:21:08.670 "num_base_bdevs_discovered": 3, 00:21:08.670 "num_base_bdevs_operational": 4, 00:21:08.670 "base_bdevs_list": [ 00:21:08.670 { 00:21:08.670 "name": null, 00:21:08.670 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:08.670 "is_configured": false, 00:21:08.670 "data_offset": 2048, 00:21:08.670 "data_size": 63488 00:21:08.670 }, 00:21:08.670 { 00:21:08.670 "name": "BaseBdev2", 00:21:08.670 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:08.670 "is_configured": true, 00:21:08.670 "data_offset": 2048, 00:21:08.670 "data_size": 63488 00:21:08.670 }, 00:21:08.670 { 00:21:08.670 "name": "BaseBdev3", 00:21:08.670 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:08.670 "is_configured": true, 00:21:08.670 "data_offset": 2048, 00:21:08.670 "data_size": 63488 00:21:08.670 }, 00:21:08.670 { 00:21:08.670 "name": "BaseBdev4", 00:21:08.670 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:08.670 "is_configured": true, 00:21:08.670 "data_offset": 2048, 00:21:08.670 "data_size": 63488 00:21:08.670 } 00:21:08.670 ] 00:21:08.670 }' 00:21:08.670 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.670 06:37:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:09.235 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.235 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:09.493 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:09.493 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.493 06:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:09.750 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d08cafa3-cc36-4ef9-8cad-35af3160815d 00:21:10.030 [2024-07-25 06:37:23.396444] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:10.030 [2024-07-25 06:37:23.396590] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xdd8710 00:21:10.030 [2024-07-25 06:37:23.396601] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:10.030 [2024-07-25 06:37:23.396768] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd0000 00:21:10.030 [2024-07-25 06:37:23.396873] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdd8710 00:21:10.030 [2024-07-25 06:37:23.396883] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xdd8710 00:21:10.030 [2024-07-25 06:37:23.396963] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:10.030 NewBaseBdev 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:10.030 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:10.289 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:10.546 [ 00:21:10.546 { 00:21:10.546 "name": "NewBaseBdev", 00:21:10.546 "aliases": [ 00:21:10.546 "d08cafa3-cc36-4ef9-8cad-35af3160815d" 00:21:10.546 ], 00:21:10.546 "product_name": "Malloc disk", 00:21:10.546 "block_size": 512, 00:21:10.546 "num_blocks": 65536, 00:21:10.546 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:10.546 "assigned_rate_limits": { 00:21:10.546 "rw_ios_per_sec": 0, 00:21:10.546 "rw_mbytes_per_sec": 0, 00:21:10.546 "r_mbytes_per_sec": 0, 00:21:10.546 "w_mbytes_per_sec": 0 00:21:10.546 }, 00:21:10.546 "claimed": true, 00:21:10.546 "claim_type": "exclusive_write", 00:21:10.546 "zoned": false, 00:21:10.546 "supported_io_types": { 00:21:10.546 "read": true, 00:21:10.546 "write": true, 00:21:10.546 "unmap": true, 00:21:10.546 "flush": true, 00:21:10.546 "reset": true, 00:21:10.546 "nvme_admin": false, 00:21:10.546 "nvme_io": false, 00:21:10.546 "nvme_io_md": false, 00:21:10.546 "write_zeroes": true, 00:21:10.546 "zcopy": true, 00:21:10.546 "get_zone_info": false, 00:21:10.546 "zone_management": false, 00:21:10.546 "zone_append": false, 00:21:10.546 "compare": false, 00:21:10.546 "compare_and_write": false, 00:21:10.546 "abort": true, 00:21:10.546 "seek_hole": false, 00:21:10.546 "seek_data": false, 00:21:10.546 "copy": true, 00:21:10.546 "nvme_iov_md": false 00:21:10.546 }, 00:21:10.546 "memory_domains": [ 00:21:10.546 { 00:21:10.546 "dma_device_id": "system", 00:21:10.546 "dma_device_type": 1 00:21:10.546 }, 00:21:10.546 { 00:21:10.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:10.546 "dma_device_type": 2 00:21:10.546 } 00:21:10.546 ], 00:21:10.546 "driver_specific": {} 00:21:10.546 } 00:21:10.546 ] 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.546 
06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.546 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.547 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.547 06:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.804 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.804 "name": "Existed_Raid", 00:21:10.804 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:10.804 "strip_size_kb": 64, 00:21:10.804 "state": "online", 00:21:10.804 "raid_level": "raid0", 00:21:10.804 "superblock": true, 00:21:10.804 "num_base_bdevs": 4, 00:21:10.804 "num_base_bdevs_discovered": 4, 00:21:10.804 "num_base_bdevs_operational": 4, 00:21:10.804 "base_bdevs_list": [ 00:21:10.804 { 00:21:10.804 "name": "NewBaseBdev", 00:21:10.804 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:10.804 "is_configured": true, 00:21:10.804 "data_offset": 2048, 00:21:10.804 "data_size": 63488 00:21:10.804 }, 00:21:10.804 { 00:21:10.804 "name": "BaseBdev2", 00:21:10.804 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:10.804 "is_configured": true, 00:21:10.804 "data_offset": 2048, 00:21:10.804 "data_size": 63488 00:21:10.804 }, 00:21:10.804 { 00:21:10.804 "name": "BaseBdev3", 00:21:10.804 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:10.804 "is_configured": true, 00:21:10.804 "data_offset": 2048, 00:21:10.804 "data_size": 63488 00:21:10.804 }, 00:21:10.804 { 00:21:10.804 "name": "BaseBdev4", 00:21:10.804 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:10.804 "is_configured": true, 00:21:10.804 "data_offset": 2048, 00:21:10.804 "data_size": 63488 00:21:10.804 } 00:21:10.804 ] 00:21:10.804 }' 00:21:10.804 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.804 06:37:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:11.370 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:11.627 [2024-07-25 06:37:24.936815] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:11.627 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:11.627 "name": "Existed_Raid", 00:21:11.627 "aliases": [ 00:21:11.627 "654aad63-4591-408d-8145-2898f4f2bab3" 00:21:11.627 ], 00:21:11.627 "product_name": "Raid Volume", 00:21:11.627 "block_size": 512, 00:21:11.627 "num_blocks": 253952, 00:21:11.627 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:11.627 "assigned_rate_limits": { 00:21:11.627 "rw_ios_per_sec": 0, 00:21:11.627 "rw_mbytes_per_sec": 0, 00:21:11.627 "r_mbytes_per_sec": 0, 00:21:11.627 "w_mbytes_per_sec": 0 00:21:11.627 }, 00:21:11.627 "claimed": false, 00:21:11.627 "zoned": false, 00:21:11.627 "supported_io_types": { 00:21:11.627 "read": true, 00:21:11.627 "write": true, 00:21:11.627 "unmap": true, 00:21:11.627 "flush": true, 00:21:11.627 "reset": true, 00:21:11.627 "nvme_admin": false, 00:21:11.627 "nvme_io": false, 00:21:11.627 "nvme_io_md": false, 00:21:11.627 "write_zeroes": true, 00:21:11.627 "zcopy": false, 00:21:11.627 "get_zone_info": false, 00:21:11.627 "zone_management": false, 00:21:11.627 "zone_append": false, 00:21:11.627 "compare": false, 00:21:11.627 "compare_and_write": false, 00:21:11.627 "abort": false, 00:21:11.627 "seek_hole": false, 00:21:11.627 "seek_data": false, 00:21:11.627 "copy": false, 00:21:11.627 "nvme_iov_md": false 00:21:11.627 }, 00:21:11.627 "memory_domains": [ 00:21:11.627 { 00:21:11.627 "dma_device_id": "system", 00:21:11.627 "dma_device_type": 1 00:21:11.627 }, 00:21:11.627 { 00:21:11.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.627 "dma_device_type": 2 00:21:11.627 }, 00:21:11.627 { 00:21:11.627 "dma_device_id": "system", 00:21:11.627 "dma_device_type": 1 00:21:11.627 }, 00:21:11.627 { 00:21:11.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.627 "dma_device_type": 2 00:21:11.627 }, 00:21:11.627 { 00:21:11.628 "dma_device_id": "system", 00:21:11.628 "dma_device_type": 1 00:21:11.628 }, 00:21:11.628 { 00:21:11.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.628 "dma_device_type": 2 00:21:11.628 }, 00:21:11.628 { 00:21:11.628 "dma_device_id": "system", 00:21:11.628 "dma_device_type": 1 00:21:11.628 }, 00:21:11.628 { 00:21:11.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.628 "dma_device_type": 2 00:21:11.628 } 00:21:11.628 ], 00:21:11.628 "driver_specific": { 00:21:11.628 "raid": { 00:21:11.628 "uuid": "654aad63-4591-408d-8145-2898f4f2bab3", 00:21:11.628 "strip_size_kb": 64, 00:21:11.628 "state": "online", 00:21:11.628 "raid_level": "raid0", 00:21:11.628 "superblock": true, 00:21:11.628 "num_base_bdevs": 4, 00:21:11.628 "num_base_bdevs_discovered": 4, 00:21:11.628 "num_base_bdevs_operational": 4, 00:21:11.628 "base_bdevs_list": [ 00:21:11.628 { 00:21:11.628 "name": "NewBaseBdev", 00:21:11.628 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:11.628 "is_configured": true, 00:21:11.628 "data_offset": 2048, 00:21:11.628 "data_size": 63488 00:21:11.628 }, 00:21:11.628 { 00:21:11.628 "name": "BaseBdev2", 00:21:11.628 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:11.628 "is_configured": true, 00:21:11.628 "data_offset": 2048, 00:21:11.628 "data_size": 63488 00:21:11.628 }, 00:21:11.628 { 00:21:11.628 
"name": "BaseBdev3", 00:21:11.628 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:11.628 "is_configured": true, 00:21:11.628 "data_offset": 2048, 00:21:11.628 "data_size": 63488 00:21:11.628 }, 00:21:11.628 { 00:21:11.628 "name": "BaseBdev4", 00:21:11.628 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:11.628 "is_configured": true, 00:21:11.628 "data_offset": 2048, 00:21:11.628 "data_size": 63488 00:21:11.628 } 00:21:11.628 ] 00:21:11.628 } 00:21:11.628 } 00:21:11.628 }' 00:21:11.628 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:11.628 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:11.628 BaseBdev2 00:21:11.628 BaseBdev3 00:21:11.628 BaseBdev4' 00:21:11.628 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:11.628 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:11.628 06:37:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:11.885 "name": "NewBaseBdev", 00:21:11.885 "aliases": [ 00:21:11.885 "d08cafa3-cc36-4ef9-8cad-35af3160815d" 00:21:11.885 ], 00:21:11.885 "product_name": "Malloc disk", 00:21:11.885 "block_size": 512, 00:21:11.885 "num_blocks": 65536, 00:21:11.885 "uuid": "d08cafa3-cc36-4ef9-8cad-35af3160815d", 00:21:11.885 "assigned_rate_limits": { 00:21:11.885 "rw_ios_per_sec": 0, 00:21:11.885 "rw_mbytes_per_sec": 0, 00:21:11.885 "r_mbytes_per_sec": 0, 00:21:11.885 "w_mbytes_per_sec": 0 00:21:11.885 }, 00:21:11.885 "claimed": true, 00:21:11.885 "claim_type": "exclusive_write", 00:21:11.885 "zoned": false, 00:21:11.885 "supported_io_types": { 00:21:11.885 "read": true, 00:21:11.885 "write": true, 00:21:11.885 "unmap": true, 00:21:11.885 "flush": true, 00:21:11.885 "reset": true, 00:21:11.885 "nvme_admin": false, 00:21:11.885 "nvme_io": false, 00:21:11.885 "nvme_io_md": false, 00:21:11.885 "write_zeroes": true, 00:21:11.885 "zcopy": true, 00:21:11.885 "get_zone_info": false, 00:21:11.885 "zone_management": false, 00:21:11.885 "zone_append": false, 00:21:11.885 "compare": false, 00:21:11.885 "compare_and_write": false, 00:21:11.885 "abort": true, 00:21:11.885 "seek_hole": false, 00:21:11.885 "seek_data": false, 00:21:11.885 "copy": true, 00:21:11.885 "nvme_iov_md": false 00:21:11.885 }, 00:21:11.885 "memory_domains": [ 00:21:11.885 { 00:21:11.885 "dma_device_id": "system", 00:21:11.885 "dma_device_type": 1 00:21:11.885 }, 00:21:11.885 { 00:21:11.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.885 "dma_device_type": 2 00:21:11.885 } 00:21:11.885 ], 00:21:11.885 "driver_specific": {} 00:21:11.885 }' 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:11.885 06:37:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:11.885 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:12.142 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:12.399 "name": "BaseBdev2", 00:21:12.399 "aliases": [ 00:21:12.399 "14d1144c-7903-40a2-b623-1195137c8240" 00:21:12.399 ], 00:21:12.399 "product_name": "Malloc disk", 00:21:12.399 "block_size": 512, 00:21:12.399 "num_blocks": 65536, 00:21:12.399 "uuid": "14d1144c-7903-40a2-b623-1195137c8240", 00:21:12.399 "assigned_rate_limits": { 00:21:12.399 "rw_ios_per_sec": 0, 00:21:12.399 "rw_mbytes_per_sec": 0, 00:21:12.399 "r_mbytes_per_sec": 0, 00:21:12.399 "w_mbytes_per_sec": 0 00:21:12.399 }, 00:21:12.399 "claimed": true, 00:21:12.399 "claim_type": "exclusive_write", 00:21:12.399 "zoned": false, 00:21:12.399 "supported_io_types": { 00:21:12.399 "read": true, 00:21:12.399 "write": true, 00:21:12.399 "unmap": true, 00:21:12.399 "flush": true, 00:21:12.399 "reset": true, 00:21:12.399 "nvme_admin": false, 00:21:12.399 "nvme_io": false, 00:21:12.399 "nvme_io_md": false, 00:21:12.399 "write_zeroes": true, 00:21:12.399 "zcopy": true, 00:21:12.399 "get_zone_info": false, 00:21:12.399 "zone_management": false, 00:21:12.399 "zone_append": false, 00:21:12.399 "compare": false, 00:21:12.399 "compare_and_write": false, 00:21:12.399 "abort": true, 00:21:12.399 "seek_hole": false, 00:21:12.399 "seek_data": false, 00:21:12.399 "copy": true, 00:21:12.399 "nvme_iov_md": false 00:21:12.399 }, 00:21:12.399 "memory_domains": [ 00:21:12.399 { 00:21:12.399 "dma_device_id": "system", 00:21:12.399 "dma_device_type": 1 00:21:12.399 }, 00:21:12.399 { 00:21:12.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.399 "dma_device_type": 2 00:21:12.399 } 00:21:12.399 ], 00:21:12.399 "driver_specific": {} 00:21:12.399 }' 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:12.399 06:37:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.399 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.656 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:12.656 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.656 06:37:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.656 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:12.656 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:12.656 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:12.656 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.219 "name": "BaseBdev3", 00:21:13.219 "aliases": [ 00:21:13.219 "7d8108ea-deb7-44aa-b33d-011b7c956944" 00:21:13.219 ], 00:21:13.219 "product_name": "Malloc disk", 00:21:13.219 "block_size": 512, 00:21:13.219 "num_blocks": 65536, 00:21:13.219 "uuid": "7d8108ea-deb7-44aa-b33d-011b7c956944", 00:21:13.219 "assigned_rate_limits": { 00:21:13.219 "rw_ios_per_sec": 0, 00:21:13.219 "rw_mbytes_per_sec": 0, 00:21:13.219 "r_mbytes_per_sec": 0, 00:21:13.219 "w_mbytes_per_sec": 0 00:21:13.219 }, 00:21:13.219 "claimed": true, 00:21:13.219 "claim_type": "exclusive_write", 00:21:13.219 "zoned": false, 00:21:13.219 "supported_io_types": { 00:21:13.219 "read": true, 00:21:13.219 "write": true, 00:21:13.219 "unmap": true, 00:21:13.219 "flush": true, 00:21:13.219 "reset": true, 00:21:13.219 "nvme_admin": false, 00:21:13.219 "nvme_io": false, 00:21:13.219 "nvme_io_md": false, 00:21:13.219 "write_zeroes": true, 00:21:13.219 "zcopy": true, 00:21:13.219 "get_zone_info": false, 00:21:13.219 "zone_management": false, 00:21:13.219 "zone_append": false, 00:21:13.219 "compare": false, 00:21:13.219 "compare_and_write": false, 00:21:13.219 "abort": true, 00:21:13.219 "seek_hole": false, 00:21:13.219 "seek_data": false, 00:21:13.219 "copy": true, 00:21:13.219 "nvme_iov_md": false 00:21:13.219 }, 00:21:13.219 "memory_domains": [ 00:21:13.219 { 00:21:13.219 "dma_device_id": "system", 00:21:13.219 "dma_device_type": 1 00:21:13.219 }, 00:21:13.219 { 00:21:13.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.219 "dma_device_type": 2 00:21:13.219 } 00:21:13.219 ], 00:21:13.219 "driver_specific": {} 00:21:13.219 }' 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.219 06:37:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.219 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.476 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.476 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.476 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:13.476 06:37:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:14.041 "name": "BaseBdev4", 00:21:14.041 "aliases": [ 00:21:14.041 "f3ce824a-a598-45af-8a2b-be40ee184ebe" 00:21:14.041 ], 00:21:14.041 "product_name": "Malloc disk", 00:21:14.041 "block_size": 512, 00:21:14.041 "num_blocks": 65536, 00:21:14.041 "uuid": "f3ce824a-a598-45af-8a2b-be40ee184ebe", 00:21:14.041 "assigned_rate_limits": { 00:21:14.041 "rw_ios_per_sec": 0, 00:21:14.041 "rw_mbytes_per_sec": 0, 00:21:14.041 "r_mbytes_per_sec": 0, 00:21:14.041 "w_mbytes_per_sec": 0 00:21:14.041 }, 00:21:14.041 "claimed": true, 00:21:14.041 "claim_type": "exclusive_write", 00:21:14.041 "zoned": false, 00:21:14.041 "supported_io_types": { 00:21:14.041 "read": true, 00:21:14.041 "write": true, 00:21:14.041 "unmap": true, 00:21:14.041 "flush": true, 00:21:14.041 "reset": true, 00:21:14.041 "nvme_admin": false, 00:21:14.041 "nvme_io": false, 00:21:14.041 "nvme_io_md": false, 00:21:14.041 "write_zeroes": true, 00:21:14.041 "zcopy": true, 00:21:14.041 "get_zone_info": false, 00:21:14.041 "zone_management": false, 00:21:14.041 "zone_append": false, 00:21:14.041 "compare": false, 00:21:14.041 "compare_and_write": false, 00:21:14.041 "abort": true, 00:21:14.041 "seek_hole": false, 00:21:14.041 "seek_data": false, 00:21:14.041 "copy": true, 00:21:14.041 "nvme_iov_md": false 00:21:14.041 }, 00:21:14.041 "memory_domains": [ 00:21:14.041 { 00:21:14.041 "dma_device_id": "system", 00:21:14.041 "dma_device_type": 1 00:21:14.041 }, 00:21:14.041 { 00:21:14.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.041 "dma_device_type": 2 00:21:14.041 } 00:21:14.041 ], 00:21:14.041 "driver_specific": {} 00:21:14.041 }' 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.041 06:37:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:14.041 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:14.298 [2024-07-25 06:37:27.771974] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:14.298 [2024-07-25 06:37:27.772000] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:14.298 [2024-07-25 06:37:27.772056] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:14.298 [2024-07-25 06:37:27.772113] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:14.298 [2024-07-25 06:37:27.772124] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd8710 name Existed_Raid, state offline 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1173559 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1173559 ']' 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1173559 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:14.298 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1173559 00:21:14.556 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:14.556 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:14.556 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1173559' 00:21:14.556 killing process with pid 1173559 00:21:14.556 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1173559 00:21:14.556 [2024-07-25 06:37:27.872557] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:14.556 06:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1173559 00:21:14.556 [2024-07-25 06:37:27.904222] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:14.556 06:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:14.556 00:21:14.556 real 0m30.912s 00:21:14.556 user 0m56.611s 00:21:14.556 sys 0m5.710s 00:21:14.556 06:37:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:14.556 06:37:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:14.556 ************************************ 00:21:14.556 END TEST raid_state_function_test_sb 00:21:14.556 ************************************ 00:21:14.813 06:37:28 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:21:14.813 06:37:28 bdev_raid 
-- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:21:14.813 06:37:28 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:14.813 06:37:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:14.813 ************************************ 00:21:14.813 START TEST raid_superblock_test 00:21:14.813 ************************************ 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 4 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1179510 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1179510 /var/tmp/spdk-raid.sock 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1179510 ']' 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:14.813 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
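[editor's note] For orientation before the raid_superblock_test trace continues: the run below reduces to a short JSON-RPC sequence against the dedicated /var/tmp/spdk-raid.sock socket. The sketch here is a hand-condensed outline of the calls visible in the trace, not the bdev_raid.sh script itself; the rpc() wrapper function, the for loop, and the inline "expected" comments are condensations added for readability, while the RPC method names, arguments, and paths are taken verbatim from the log lines that follow.

# condensed sketch of the RPC sequence exercised below (assumes bdev_svc is already listening on the socket)
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

# 1) four 32 MiB malloc bdevs (512-byte blocks, so 65536 blocks each), each wrapped
#    in a passthru bdev pt1..pt4 with a fixed UUID
for i in 1 2 3 4; do
    rpc bdev_malloc_create 32 512 -b "malloc$i"
    rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
done

# 2) raid0 volume over the passthru bdevs, 64 KiB strip, with on-disk superblock (-s)
rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

# 3) state and per-base-bdev property checks driven through jq
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
rpc bdev_get_bdevs -b pt1 | jq '.[].block_size'    # expected: 512

# 4) tear down the raid and the passthru bdevs; the superblock written in step 2 stays
#    on malloc1..malloc4, so creating a raid directly from them must be rejected
rpc bdev_raid_delete raid_bdev1
for i in 1 2 3 4; do rpc bdev_passthru_delete "pt$i"; done
rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1

The JSON-RPC error -17 ("Failed to create RAID bdev raid_bdev1: File exists") that appears later in this section is the expected outcome of that last call, after which the test re-registers pt1..pt4 so the existing superblocks are examined and the raid is re-assembled in the "configuring" state. Resuming the trace: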
00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:14.813 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.813 [2024-07-25 06:37:28.216863] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:21:14.813 [2024-07-25 06:37:28.216917] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1179510 ] 00:21:14.813 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.813 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:14.813 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.813 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:14.813 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.813 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 
0000:3f:01.4 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:14.814 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:14.814 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:14.814 [2024-07-25 06:37:28.353329] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:15.071 [2024-07-25 06:37:28.398406] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:15.071 [2024-07-25 06:37:28.456635] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:15.071 [2024-07-25 06:37:28.456659] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:15.328 malloc1 00:21:15.328 06:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:15.586 [2024-07-25 06:37:29.068795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:15.586 [2024-07-25 06:37:29.068837] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:15.586 [2024-07-25 06:37:29.068856] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1438d70 00:21:15.586 [2024-07-25 06:37:29.068867] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:15.586 [2024-07-25 06:37:29.070355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:15.586 [2024-07-25 06:37:29.070384] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:15.586 pt1 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:15.586 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:15.842 malloc2 00:21:15.843 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:16.099 [2024-07-25 06:37:29.530297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:16.099 [2024-07-25 06:37:29.530336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.099 [2024-07-25 06:37:29.530351] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1287790 00:21:16.099 [2024-07-25 06:37:29.530362] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.099 [2024-07-25 06:37:29.531686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:16.099 [2024-07-25 06:37:29.531712] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:16.099 pt2 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:16.099 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:16.357 malloc3 00:21:16.357 06:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:16.614 [2024-07-25 06:37:29.995860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:16.614 [2024-07-25 06:37:29.995899] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.614 [2024-07-25 06:37:29.995915] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x142c8c0 00:21:16.614 [2024-07-25 06:37:29.995926] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.614 [2024-07-25 06:37:29.997251] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:16.614 [2024-07-25 06:37:29.997276] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:16.614 pt3 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:16.614 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:16.871 malloc4 00:21:16.871 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:17.129 [2024-07-25 06:37:30.433374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:17.129 [2024-07-25 06:37:30.433415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.129 [2024-07-25 06:37:30.433430] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x142f300 00:21:17.129 [2024-07-25 06:37:30.433442] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.129 [2024-07-25 06:37:30.434726] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.129 [2024-07-25 06:37:30.434751] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: pt4 00:21:17.129 pt4 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:17.129 [2024-07-25 06:37:30.657980] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:17.129 [2024-07-25 06:37:30.659100] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:17.129 [2024-07-25 06:37:30.659166] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:17.129 [2024-07-25 06:37:30.659208] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:17.129 [2024-07-25 06:37:30.659361] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x127f770 00:21:17.129 [2024-07-25 06:37:30.659371] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:17.129 [2024-07-25 06:37:30.659544] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x142b9f0 00:21:17.129 [2024-07-25 06:37:30.659676] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x127f770 00:21:17.129 [2024-07-25 06:37:30.659685] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x127f770 00:21:17.129 [2024-07-25 06:37:30.659768] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.129 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.387 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.387 "name": "raid_bdev1", 00:21:17.387 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:17.387 "strip_size_kb": 64, 00:21:17.387 "state": "online", 00:21:17.387 "raid_level": "raid0", 00:21:17.387 "superblock": true, 00:21:17.387 "num_base_bdevs": 4, 00:21:17.387 
"num_base_bdevs_discovered": 4, 00:21:17.387 "num_base_bdevs_operational": 4, 00:21:17.387 "base_bdevs_list": [ 00:21:17.387 { 00:21:17.387 "name": "pt1", 00:21:17.387 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:17.387 "is_configured": true, 00:21:17.387 "data_offset": 2048, 00:21:17.387 "data_size": 63488 00:21:17.387 }, 00:21:17.387 { 00:21:17.387 "name": "pt2", 00:21:17.387 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:17.387 "is_configured": true, 00:21:17.387 "data_offset": 2048, 00:21:17.387 "data_size": 63488 00:21:17.387 }, 00:21:17.387 { 00:21:17.387 "name": "pt3", 00:21:17.387 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:17.387 "is_configured": true, 00:21:17.387 "data_offset": 2048, 00:21:17.387 "data_size": 63488 00:21:17.387 }, 00:21:17.387 { 00:21:17.387 "name": "pt4", 00:21:17.387 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:17.387 "is_configured": true, 00:21:17.387 "data_offset": 2048, 00:21:17.387 "data_size": 63488 00:21:17.387 } 00:21:17.387 ] 00:21:17.387 }' 00:21:17.387 06:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.387 06:37:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:17.952 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:18.209 [2024-07-25 06:37:31.636963] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:18.209 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:18.209 "name": "raid_bdev1", 00:21:18.209 "aliases": [ 00:21:18.209 "85a83b9f-9940-4b49-af25-65a337f0c470" 00:21:18.209 ], 00:21:18.209 "product_name": "Raid Volume", 00:21:18.209 "block_size": 512, 00:21:18.209 "num_blocks": 253952, 00:21:18.209 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:18.209 "assigned_rate_limits": { 00:21:18.209 "rw_ios_per_sec": 0, 00:21:18.209 "rw_mbytes_per_sec": 0, 00:21:18.209 "r_mbytes_per_sec": 0, 00:21:18.209 "w_mbytes_per_sec": 0 00:21:18.209 }, 00:21:18.209 "claimed": false, 00:21:18.209 "zoned": false, 00:21:18.209 "supported_io_types": { 00:21:18.209 "read": true, 00:21:18.209 "write": true, 00:21:18.209 "unmap": true, 00:21:18.209 "flush": true, 00:21:18.209 "reset": true, 00:21:18.209 "nvme_admin": false, 00:21:18.209 "nvme_io": false, 00:21:18.209 "nvme_io_md": false, 00:21:18.209 "write_zeroes": true, 00:21:18.209 "zcopy": false, 00:21:18.209 "get_zone_info": false, 00:21:18.209 "zone_management": false, 00:21:18.209 "zone_append": false, 00:21:18.209 "compare": false, 00:21:18.209 "compare_and_write": false, 00:21:18.209 "abort": false, 00:21:18.209 "seek_hole": false, 00:21:18.209 "seek_data": 
false, 00:21:18.209 "copy": false, 00:21:18.209 "nvme_iov_md": false 00:21:18.209 }, 00:21:18.209 "memory_domains": [ 00:21:18.209 { 00:21:18.209 "dma_device_id": "system", 00:21:18.209 "dma_device_type": 1 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.209 "dma_device_type": 2 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "system", 00:21:18.209 "dma_device_type": 1 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.209 "dma_device_type": 2 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "system", 00:21:18.209 "dma_device_type": 1 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.209 "dma_device_type": 2 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "system", 00:21:18.209 "dma_device_type": 1 00:21:18.209 }, 00:21:18.209 { 00:21:18.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.209 "dma_device_type": 2 00:21:18.209 } 00:21:18.209 ], 00:21:18.209 "driver_specific": { 00:21:18.209 "raid": { 00:21:18.210 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:18.210 "strip_size_kb": 64, 00:21:18.210 "state": "online", 00:21:18.210 "raid_level": "raid0", 00:21:18.210 "superblock": true, 00:21:18.210 "num_base_bdevs": 4, 00:21:18.210 "num_base_bdevs_discovered": 4, 00:21:18.210 "num_base_bdevs_operational": 4, 00:21:18.210 "base_bdevs_list": [ 00:21:18.210 { 00:21:18.210 "name": "pt1", 00:21:18.210 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:18.210 "is_configured": true, 00:21:18.210 "data_offset": 2048, 00:21:18.210 "data_size": 63488 00:21:18.210 }, 00:21:18.210 { 00:21:18.210 "name": "pt2", 00:21:18.210 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.210 "is_configured": true, 00:21:18.210 "data_offset": 2048, 00:21:18.210 "data_size": 63488 00:21:18.210 }, 00:21:18.210 { 00:21:18.210 "name": "pt3", 00:21:18.210 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:18.210 "is_configured": true, 00:21:18.210 "data_offset": 2048, 00:21:18.210 "data_size": 63488 00:21:18.210 }, 00:21:18.210 { 00:21:18.210 "name": "pt4", 00:21:18.210 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:18.210 "is_configured": true, 00:21:18.210 "data_offset": 2048, 00:21:18.210 "data_size": 63488 00:21:18.210 } 00:21:18.210 ] 00:21:18.210 } 00:21:18.210 } 00:21:18.210 }' 00:21:18.210 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:18.210 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:18.210 pt2 00:21:18.210 pt3 00:21:18.210 pt4' 00:21:18.210 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:18.210 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:18.210 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.468 "name": "pt1", 00:21:18.468 "aliases": [ 00:21:18.468 "00000000-0000-0000-0000-000000000001" 00:21:18.468 ], 00:21:18.468 "product_name": "passthru", 00:21:18.468 "block_size": 512, 00:21:18.468 "num_blocks": 65536, 00:21:18.468 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:18.468 "assigned_rate_limits": { 
00:21:18.468 "rw_ios_per_sec": 0, 00:21:18.468 "rw_mbytes_per_sec": 0, 00:21:18.468 "r_mbytes_per_sec": 0, 00:21:18.468 "w_mbytes_per_sec": 0 00:21:18.468 }, 00:21:18.468 "claimed": true, 00:21:18.468 "claim_type": "exclusive_write", 00:21:18.468 "zoned": false, 00:21:18.468 "supported_io_types": { 00:21:18.468 "read": true, 00:21:18.468 "write": true, 00:21:18.468 "unmap": true, 00:21:18.468 "flush": true, 00:21:18.468 "reset": true, 00:21:18.468 "nvme_admin": false, 00:21:18.468 "nvme_io": false, 00:21:18.468 "nvme_io_md": false, 00:21:18.468 "write_zeroes": true, 00:21:18.468 "zcopy": true, 00:21:18.468 "get_zone_info": false, 00:21:18.468 "zone_management": false, 00:21:18.468 "zone_append": false, 00:21:18.468 "compare": false, 00:21:18.468 "compare_and_write": false, 00:21:18.468 "abort": true, 00:21:18.468 "seek_hole": false, 00:21:18.468 "seek_data": false, 00:21:18.468 "copy": true, 00:21:18.468 "nvme_iov_md": false 00:21:18.468 }, 00:21:18.468 "memory_domains": [ 00:21:18.468 { 00:21:18.468 "dma_device_id": "system", 00:21:18.468 "dma_device_type": 1 00:21:18.468 }, 00:21:18.468 { 00:21:18.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.468 "dma_device_type": 2 00:21:18.468 } 00:21:18.468 ], 00:21:18.468 "driver_specific": { 00:21:18.468 "passthru": { 00:21:18.468 "name": "pt1", 00:21:18.468 "base_bdev_name": "malloc1" 00:21:18.468 } 00:21:18.468 } 00:21:18.468 }' 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.468 06:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.468 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:18.725 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.983 "name": "pt2", 00:21:18.983 "aliases": [ 00:21:18.983 "00000000-0000-0000-0000-000000000002" 00:21:18.983 ], 00:21:18.983 "product_name": "passthru", 00:21:18.983 "block_size": 512, 00:21:18.983 "num_blocks": 65536, 00:21:18.983 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.983 "assigned_rate_limits": { 00:21:18.983 "rw_ios_per_sec": 0, 00:21:18.983 "rw_mbytes_per_sec": 0, 00:21:18.983 "r_mbytes_per_sec": 0, 00:21:18.983 
"w_mbytes_per_sec": 0 00:21:18.983 }, 00:21:18.983 "claimed": true, 00:21:18.983 "claim_type": "exclusive_write", 00:21:18.983 "zoned": false, 00:21:18.983 "supported_io_types": { 00:21:18.983 "read": true, 00:21:18.983 "write": true, 00:21:18.983 "unmap": true, 00:21:18.983 "flush": true, 00:21:18.983 "reset": true, 00:21:18.983 "nvme_admin": false, 00:21:18.983 "nvme_io": false, 00:21:18.983 "nvme_io_md": false, 00:21:18.983 "write_zeroes": true, 00:21:18.983 "zcopy": true, 00:21:18.983 "get_zone_info": false, 00:21:18.983 "zone_management": false, 00:21:18.983 "zone_append": false, 00:21:18.983 "compare": false, 00:21:18.983 "compare_and_write": false, 00:21:18.983 "abort": true, 00:21:18.983 "seek_hole": false, 00:21:18.983 "seek_data": false, 00:21:18.983 "copy": true, 00:21:18.983 "nvme_iov_md": false 00:21:18.983 }, 00:21:18.983 "memory_domains": [ 00:21:18.983 { 00:21:18.983 "dma_device_id": "system", 00:21:18.983 "dma_device_type": 1 00:21:18.983 }, 00:21:18.983 { 00:21:18.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.983 "dma_device_type": 2 00:21:18.983 } 00:21:18.983 ], 00:21:18.983 "driver_specific": { 00:21:18.983 "passthru": { 00:21:18.983 "name": "pt2", 00:21:18.983 "base_bdev_name": "malloc2" 00:21:18.983 } 00:21:18.983 } 00:21:18.983 }' 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.983 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:19.241 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:19.498 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:19.498 "name": "pt3", 00:21:19.498 "aliases": [ 00:21:19.498 "00000000-0000-0000-0000-000000000003" 00:21:19.498 ], 00:21:19.498 "product_name": "passthru", 00:21:19.498 "block_size": 512, 00:21:19.498 "num_blocks": 65536, 00:21:19.498 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:19.498 "assigned_rate_limits": { 00:21:19.498 "rw_ios_per_sec": 0, 00:21:19.498 "rw_mbytes_per_sec": 0, 00:21:19.498 "r_mbytes_per_sec": 0, 00:21:19.498 "w_mbytes_per_sec": 0 00:21:19.498 }, 00:21:19.498 "claimed": true, 00:21:19.498 "claim_type": "exclusive_write", 00:21:19.498 "zoned": 
false, 00:21:19.498 "supported_io_types": { 00:21:19.498 "read": true, 00:21:19.498 "write": true, 00:21:19.498 "unmap": true, 00:21:19.498 "flush": true, 00:21:19.498 "reset": true, 00:21:19.498 "nvme_admin": false, 00:21:19.498 "nvme_io": false, 00:21:19.498 "nvme_io_md": false, 00:21:19.498 "write_zeroes": true, 00:21:19.498 "zcopy": true, 00:21:19.498 "get_zone_info": false, 00:21:19.498 "zone_management": false, 00:21:19.498 "zone_append": false, 00:21:19.498 "compare": false, 00:21:19.498 "compare_and_write": false, 00:21:19.498 "abort": true, 00:21:19.498 "seek_hole": false, 00:21:19.498 "seek_data": false, 00:21:19.498 "copy": true, 00:21:19.498 "nvme_iov_md": false 00:21:19.498 }, 00:21:19.498 "memory_domains": [ 00:21:19.498 { 00:21:19.498 "dma_device_id": "system", 00:21:19.498 "dma_device_type": 1 00:21:19.498 }, 00:21:19.498 { 00:21:19.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.498 "dma_device_type": 2 00:21:19.498 } 00:21:19.498 ], 00:21:19.498 "driver_specific": { 00:21:19.498 "passthru": { 00:21:19.498 "name": "pt3", 00:21:19.498 "base_bdev_name": "malloc3" 00:21:19.498 } 00:21:19.498 } 00:21:19.498 }' 00:21:19.498 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.498 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.498 06:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:19.498 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.498 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:19.756 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:20.016 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:20.016 "name": "pt4", 00:21:20.016 "aliases": [ 00:21:20.016 "00000000-0000-0000-0000-000000000004" 00:21:20.016 ], 00:21:20.016 "product_name": "passthru", 00:21:20.016 "block_size": 512, 00:21:20.016 "num_blocks": 65536, 00:21:20.016 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:20.016 "assigned_rate_limits": { 00:21:20.016 "rw_ios_per_sec": 0, 00:21:20.016 "rw_mbytes_per_sec": 0, 00:21:20.016 "r_mbytes_per_sec": 0, 00:21:20.016 "w_mbytes_per_sec": 0 00:21:20.016 }, 00:21:20.016 "claimed": true, 00:21:20.016 "claim_type": "exclusive_write", 00:21:20.016 "zoned": false, 00:21:20.016 "supported_io_types": { 00:21:20.016 "read": true, 00:21:20.016 "write": true, 00:21:20.016 "unmap": true, 
00:21:20.016 "flush": true, 00:21:20.016 "reset": true, 00:21:20.016 "nvme_admin": false, 00:21:20.016 "nvme_io": false, 00:21:20.016 "nvme_io_md": false, 00:21:20.016 "write_zeroes": true, 00:21:20.016 "zcopy": true, 00:21:20.016 "get_zone_info": false, 00:21:20.016 "zone_management": false, 00:21:20.016 "zone_append": false, 00:21:20.016 "compare": false, 00:21:20.016 "compare_and_write": false, 00:21:20.016 "abort": true, 00:21:20.016 "seek_hole": false, 00:21:20.016 "seek_data": false, 00:21:20.016 "copy": true, 00:21:20.016 "nvme_iov_md": false 00:21:20.016 }, 00:21:20.016 "memory_domains": [ 00:21:20.016 { 00:21:20.016 "dma_device_id": "system", 00:21:20.016 "dma_device_type": 1 00:21:20.016 }, 00:21:20.016 { 00:21:20.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.016 "dma_device_type": 2 00:21:20.016 } 00:21:20.016 ], 00:21:20.016 "driver_specific": { 00:21:20.016 "passthru": { 00:21:20.016 "name": "pt4", 00:21:20.016 "base_bdev_name": "malloc4" 00:21:20.016 } 00:21:20.016 } 00:21:20.016 }' 00:21:20.016 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.016 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.016 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:20.016 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.281 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.281 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.281 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.281 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.281 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:20.282 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.282 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.282 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:20.282 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:20.282 06:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:21:20.539 [2024-07-25 06:37:33.987136] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:20.539 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=85a83b9f-9940-4b49-af25-65a337f0c470 00:21:20.539 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 85a83b9f-9940-4b49-af25-65a337f0c470 ']' 00:21:20.539 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:20.797 [2024-07-25 06:37:34.219656] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:20.797 [2024-07-25 06:37:34.219673] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:20.797 [2024-07-25 06:37:34.219719] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:20.797 [2024-07-25 06:37:34.219779] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:20.797 [2024-07-25 06:37:34.219790] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x127f770 name raid_bdev1, state offline 00:21:20.797 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.797 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:21:21.054 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:21:21.054 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:21:21.054 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:21.054 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:21.312 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:21.312 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:21.569 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:21.569 06:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:21.569 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:21.569 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:21.827 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:21.827 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:22.084 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:22.085 06:37:35 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:22.085 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:22.085 [2024-07-25 06:37:35.635325] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:22.085 [2024-07-25 06:37:35.636563] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:22.085 [2024-07-25 06:37:35.636602] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:22.085 [2024-07-25 06:37:35.636633] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:22.085 [2024-07-25 06:37:35.636673] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:22.085 [2024-07-25 06:37:35.636708] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:22.085 [2024-07-25 06:37:35.636729] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:22.085 [2024-07-25 06:37:35.636749] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:22.085 [2024-07-25 06:37:35.636771] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:22.085 [2024-07-25 06:37:35.636781] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x127f140 name raid_bdev1, state configuring 00:21:22.085 request: 00:21:22.085 { 00:21:22.085 "name": "raid_bdev1", 00:21:22.085 "raid_level": "raid0", 00:21:22.085 "base_bdevs": [ 00:21:22.085 "malloc1", 00:21:22.085 "malloc2", 00:21:22.085 "malloc3", 00:21:22.085 "malloc4" 00:21:22.085 ], 00:21:22.085 "strip_size_kb": 64, 00:21:22.085 "superblock": false, 00:21:22.085 "method": "bdev_raid_create", 00:21:22.085 "req_id": 1 00:21:22.085 } 00:21:22.085 Got JSON-RPC error response 00:21:22.085 response: 00:21:22.085 { 00:21:22.085 "code": -17, 00:21:22.085 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:22.085 } 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:21:22.343 06:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:22.601 [2024-07-25 06:37:36.076432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:22.601 [2024-07-25 06:37:36.076467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.601 [2024-07-25 06:37:36.076483] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14386b0 00:21:22.601 [2024-07-25 06:37:36.076494] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.601 [2024-07-25 06:37:36.077943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.601 [2024-07-25 06:37:36.077969] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:22.601 [2024-07-25 06:37:36.078027] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:22.601 [2024-07-25 06:37:36.078050] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:22.601 pt1 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.601 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.858 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.858 "name": "raid_bdev1", 00:21:22.858 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:22.858 "strip_size_kb": 64, 00:21:22.858 "state": "configuring", 00:21:22.858 "raid_level": "raid0", 00:21:22.858 "superblock": true, 00:21:22.858 "num_base_bdevs": 4, 00:21:22.858 "num_base_bdevs_discovered": 1, 00:21:22.858 "num_base_bdevs_operational": 4, 00:21:22.858 "base_bdevs_list": [ 00:21:22.858 { 00:21:22.858 "name": "pt1", 
00:21:22.858 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:22.858 "is_configured": true, 00:21:22.858 "data_offset": 2048, 00:21:22.858 "data_size": 63488 00:21:22.858 }, 00:21:22.858 { 00:21:22.858 "name": null, 00:21:22.858 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:22.858 "is_configured": false, 00:21:22.858 "data_offset": 2048, 00:21:22.858 "data_size": 63488 00:21:22.858 }, 00:21:22.858 { 00:21:22.858 "name": null, 00:21:22.859 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:22.859 "is_configured": false, 00:21:22.859 "data_offset": 2048, 00:21:22.859 "data_size": 63488 00:21:22.859 }, 00:21:22.859 { 00:21:22.859 "name": null, 00:21:22.859 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:22.859 "is_configured": false, 00:21:22.859 "data_offset": 2048, 00:21:22.859 "data_size": 63488 00:21:22.859 } 00:21:22.859 ] 00:21:22.859 }' 00:21:22.859 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.859 06:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.423 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:21:23.423 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:23.423 [2024-07-25 06:37:36.978801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:23.423 [2024-07-25 06:37:36.978842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:23.423 [2024-07-25 06:37:36.978858] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1429f60 00:21:23.423 [2024-07-25 06:37:36.978869] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:23.423 [2024-07-25 06:37:36.979176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:23.423 [2024-07-25 06:37:36.979193] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:23.423 [2024-07-25 06:37:36.979247] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:23.423 [2024-07-25 06:37:36.979263] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:23.682 pt2 00:21:23.682 06:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:23.682 [2024-07-25 06:37:37.211546] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:23.682 06:37:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:23.682 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.683 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.954 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.954 "name": "raid_bdev1", 00:21:23.954 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:23.954 "strip_size_kb": 64, 00:21:23.954 "state": "configuring", 00:21:23.954 "raid_level": "raid0", 00:21:23.954 "superblock": true, 00:21:23.954 "num_base_bdevs": 4, 00:21:23.954 "num_base_bdevs_discovered": 1, 00:21:23.954 "num_base_bdevs_operational": 4, 00:21:23.954 "base_bdevs_list": [ 00:21:23.954 { 00:21:23.954 "name": "pt1", 00:21:23.954 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:23.954 "is_configured": true, 00:21:23.954 "data_offset": 2048, 00:21:23.954 "data_size": 63488 00:21:23.954 }, 00:21:23.954 { 00:21:23.954 "name": null, 00:21:23.954 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:23.954 "is_configured": false, 00:21:23.954 "data_offset": 2048, 00:21:23.954 "data_size": 63488 00:21:23.954 }, 00:21:23.954 { 00:21:23.954 "name": null, 00:21:23.954 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:23.954 "is_configured": false, 00:21:23.954 "data_offset": 2048, 00:21:23.954 "data_size": 63488 00:21:23.954 }, 00:21:23.954 { 00:21:23.954 "name": null, 00:21:23.954 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:23.954 "is_configured": false, 00:21:23.954 "data_offset": 2048, 00:21:23.954 "data_size": 63488 00:21:23.954 } 00:21:23.954 ] 00:21:23.954 }' 00:21:23.954 06:37:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.954 06:37:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.518 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:21:24.518 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:24.518 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:24.776 [2024-07-25 06:37:38.230214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:24.776 [2024-07-25 06:37:38.230260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:24.776 [2024-07-25 06:37:38.230276] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127dbc0 00:21:24.776 [2024-07-25 06:37:38.230287] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:24.776 [2024-07-25 06:37:38.230591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:24.776 [2024-07-25 06:37:38.230606] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:24.776 [2024-07-25 06:37:38.230660] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:24.776 [2024-07-25 
06:37:38.230677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:24.776 pt2 00:21:24.776 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:24.776 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:24.776 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:25.033 [2024-07-25 06:37:38.450800] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:25.033 [2024-07-25 06:37:38.450833] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.033 [2024-07-25 06:37:38.450850] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127fd60 00:21:25.033 [2024-07-25 06:37:38.450861] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.033 [2024-07-25 06:37:38.451118] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.033 [2024-07-25 06:37:38.451133] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:25.033 [2024-07-25 06:37:38.451188] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:25.033 [2024-07-25 06:37:38.451204] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:25.033 pt3 00:21:25.033 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:25.033 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:25.033 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:25.291 [2024-07-25 06:37:38.679406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:25.291 [2024-07-25 06:37:38.679434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.291 [2024-07-25 06:37:38.679455] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x142b060 00:21:25.291 [2024-07-25 06:37:38.679466] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.291 [2024-07-25 06:37:38.679711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.291 [2024-07-25 06:37:38.679726] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:25.291 [2024-07-25 06:37:38.679772] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:25.291 [2024-07-25 06:37:38.679788] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:25.291 [2024-07-25 06:37:38.679890] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x142daf0 00:21:25.291 [2024-07-25 06:37:38.679899] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:25.291 [2024-07-25 06:37:38.680047] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1285db0 00:21:25.291 [2024-07-25 06:37:38.680171] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x142daf0 00:21:25.291 [2024-07-25 06:37:38.680180] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev 
is created with name raid_bdev1, raid_bdev 0x142daf0 00:21:25.291 [2024-07-25 06:37:38.680264] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.291 pt4 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.291 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.548 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.548 "name": "raid_bdev1", 00:21:25.548 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:25.548 "strip_size_kb": 64, 00:21:25.548 "state": "online", 00:21:25.548 "raid_level": "raid0", 00:21:25.548 "superblock": true, 00:21:25.548 "num_base_bdevs": 4, 00:21:25.548 "num_base_bdevs_discovered": 4, 00:21:25.548 "num_base_bdevs_operational": 4, 00:21:25.548 "base_bdevs_list": [ 00:21:25.548 { 00:21:25.548 "name": "pt1", 00:21:25.548 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:25.548 "is_configured": true, 00:21:25.548 "data_offset": 2048, 00:21:25.548 "data_size": 63488 00:21:25.548 }, 00:21:25.548 { 00:21:25.548 "name": "pt2", 00:21:25.548 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:25.549 "is_configured": true, 00:21:25.549 "data_offset": 2048, 00:21:25.549 "data_size": 63488 00:21:25.549 }, 00:21:25.549 { 00:21:25.549 "name": "pt3", 00:21:25.549 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:25.549 "is_configured": true, 00:21:25.549 "data_offset": 2048, 00:21:25.549 "data_size": 63488 00:21:25.549 }, 00:21:25.549 { 00:21:25.549 "name": "pt4", 00:21:25.549 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:25.549 "is_configured": true, 00:21:25.549 "data_offset": 2048, 00:21:25.549 "data_size": 63488 00:21:25.549 } 00:21:25.549 ] 00:21:25.549 }' 00:21:25.549 06:37:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.549 06:37:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:21:26.113 
06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:26.113 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:26.370 [2024-07-25 06:37:39.702387] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:26.370 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:26.370 "name": "raid_bdev1", 00:21:26.370 "aliases": [ 00:21:26.370 "85a83b9f-9940-4b49-af25-65a337f0c470" 00:21:26.370 ], 00:21:26.370 "product_name": "Raid Volume", 00:21:26.370 "block_size": 512, 00:21:26.370 "num_blocks": 253952, 00:21:26.370 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:26.370 "assigned_rate_limits": { 00:21:26.370 "rw_ios_per_sec": 0, 00:21:26.370 "rw_mbytes_per_sec": 0, 00:21:26.370 "r_mbytes_per_sec": 0, 00:21:26.370 "w_mbytes_per_sec": 0 00:21:26.370 }, 00:21:26.370 "claimed": false, 00:21:26.370 "zoned": false, 00:21:26.370 "supported_io_types": { 00:21:26.370 "read": true, 00:21:26.370 "write": true, 00:21:26.370 "unmap": true, 00:21:26.370 "flush": true, 00:21:26.370 "reset": true, 00:21:26.370 "nvme_admin": false, 00:21:26.370 "nvme_io": false, 00:21:26.370 "nvme_io_md": false, 00:21:26.370 "write_zeroes": true, 00:21:26.370 "zcopy": false, 00:21:26.370 "get_zone_info": false, 00:21:26.370 "zone_management": false, 00:21:26.370 "zone_append": false, 00:21:26.370 "compare": false, 00:21:26.370 "compare_and_write": false, 00:21:26.370 "abort": false, 00:21:26.370 "seek_hole": false, 00:21:26.370 "seek_data": false, 00:21:26.370 "copy": false, 00:21:26.370 "nvme_iov_md": false 00:21:26.370 }, 00:21:26.370 "memory_domains": [ 00:21:26.370 { 00:21:26.370 "dma_device_id": "system", 00:21:26.370 "dma_device_type": 1 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.370 "dma_device_type": 2 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "system", 00:21:26.370 "dma_device_type": 1 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.370 "dma_device_type": 2 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "system", 00:21:26.370 "dma_device_type": 1 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.370 "dma_device_type": 2 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "system", 00:21:26.370 "dma_device_type": 1 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.370 "dma_device_type": 2 00:21:26.370 } 00:21:26.370 ], 00:21:26.370 "driver_specific": { 00:21:26.370 "raid": { 00:21:26.370 "uuid": "85a83b9f-9940-4b49-af25-65a337f0c470", 00:21:26.370 "strip_size_kb": 64, 00:21:26.370 "state": "online", 00:21:26.370 "raid_level": "raid0", 00:21:26.370 "superblock": true, 00:21:26.370 "num_base_bdevs": 4, 00:21:26.370 "num_base_bdevs_discovered": 
4, 00:21:26.370 "num_base_bdevs_operational": 4, 00:21:26.370 "base_bdevs_list": [ 00:21:26.370 { 00:21:26.370 "name": "pt1", 00:21:26.370 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.370 "is_configured": true, 00:21:26.370 "data_offset": 2048, 00:21:26.370 "data_size": 63488 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "name": "pt2", 00:21:26.370 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:26.370 "is_configured": true, 00:21:26.370 "data_offset": 2048, 00:21:26.370 "data_size": 63488 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "name": "pt3", 00:21:26.370 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:26.370 "is_configured": true, 00:21:26.370 "data_offset": 2048, 00:21:26.370 "data_size": 63488 00:21:26.370 }, 00:21:26.370 { 00:21:26.370 "name": "pt4", 00:21:26.370 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:26.370 "is_configured": true, 00:21:26.370 "data_offset": 2048, 00:21:26.370 "data_size": 63488 00:21:26.370 } 00:21:26.370 ] 00:21:26.370 } 00:21:26.370 } 00:21:26.370 }' 00:21:26.370 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:26.370 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:26.370 pt2 00:21:26.370 pt3 00:21:26.370 pt4' 00:21:26.370 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.370 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:26.370 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.627 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.627 "name": "pt1", 00:21:26.627 "aliases": [ 00:21:26.627 "00000000-0000-0000-0000-000000000001" 00:21:26.627 ], 00:21:26.627 "product_name": "passthru", 00:21:26.627 "block_size": 512, 00:21:26.627 "num_blocks": 65536, 00:21:26.627 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.627 "assigned_rate_limits": { 00:21:26.627 "rw_ios_per_sec": 0, 00:21:26.627 "rw_mbytes_per_sec": 0, 00:21:26.627 "r_mbytes_per_sec": 0, 00:21:26.627 "w_mbytes_per_sec": 0 00:21:26.627 }, 00:21:26.627 "claimed": true, 00:21:26.627 "claim_type": "exclusive_write", 00:21:26.627 "zoned": false, 00:21:26.627 "supported_io_types": { 00:21:26.627 "read": true, 00:21:26.627 "write": true, 00:21:26.627 "unmap": true, 00:21:26.627 "flush": true, 00:21:26.627 "reset": true, 00:21:26.627 "nvme_admin": false, 00:21:26.627 "nvme_io": false, 00:21:26.627 "nvme_io_md": false, 00:21:26.627 "write_zeroes": true, 00:21:26.627 "zcopy": true, 00:21:26.627 "get_zone_info": false, 00:21:26.627 "zone_management": false, 00:21:26.627 "zone_append": false, 00:21:26.627 "compare": false, 00:21:26.627 "compare_and_write": false, 00:21:26.627 "abort": true, 00:21:26.627 "seek_hole": false, 00:21:26.627 "seek_data": false, 00:21:26.627 "copy": true, 00:21:26.627 "nvme_iov_md": false 00:21:26.627 }, 00:21:26.627 "memory_domains": [ 00:21:26.627 { 00:21:26.627 "dma_device_id": "system", 00:21:26.627 "dma_device_type": 1 00:21:26.627 }, 00:21:26.627 { 00:21:26.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.627 "dma_device_type": 2 00:21:26.627 } 00:21:26.627 ], 00:21:26.627 "driver_specific": { 00:21:26.627 "passthru": { 00:21:26.627 "name": "pt1", 00:21:26.627 "base_bdev_name": "malloc1" 
00:21:26.627 } 00:21:26.627 } 00:21:26.627 }' 00:21:26.627 06:37:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.627 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.627 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:26.627 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.627 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.627 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.627 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:26.884 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.140 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.140 "name": "pt2", 00:21:27.140 "aliases": [ 00:21:27.140 "00000000-0000-0000-0000-000000000002" 00:21:27.140 ], 00:21:27.140 "product_name": "passthru", 00:21:27.140 "block_size": 512, 00:21:27.140 "num_blocks": 65536, 00:21:27.140 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:27.140 "assigned_rate_limits": { 00:21:27.140 "rw_ios_per_sec": 0, 00:21:27.140 "rw_mbytes_per_sec": 0, 00:21:27.140 "r_mbytes_per_sec": 0, 00:21:27.140 "w_mbytes_per_sec": 0 00:21:27.140 }, 00:21:27.140 "claimed": true, 00:21:27.140 "claim_type": "exclusive_write", 00:21:27.140 "zoned": false, 00:21:27.140 "supported_io_types": { 00:21:27.140 "read": true, 00:21:27.140 "write": true, 00:21:27.140 "unmap": true, 00:21:27.140 "flush": true, 00:21:27.140 "reset": true, 00:21:27.140 "nvme_admin": false, 00:21:27.140 "nvme_io": false, 00:21:27.140 "nvme_io_md": false, 00:21:27.140 "write_zeroes": true, 00:21:27.140 "zcopy": true, 00:21:27.140 "get_zone_info": false, 00:21:27.140 "zone_management": false, 00:21:27.140 "zone_append": false, 00:21:27.140 "compare": false, 00:21:27.140 "compare_and_write": false, 00:21:27.140 "abort": true, 00:21:27.140 "seek_hole": false, 00:21:27.140 "seek_data": false, 00:21:27.140 "copy": true, 00:21:27.140 "nvme_iov_md": false 00:21:27.140 }, 00:21:27.140 "memory_domains": [ 00:21:27.140 { 00:21:27.140 "dma_device_id": "system", 00:21:27.140 "dma_device_type": 1 00:21:27.140 }, 00:21:27.140 { 00:21:27.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.140 "dma_device_type": 2 00:21:27.140 } 00:21:27.140 ], 00:21:27.140 "driver_specific": { 00:21:27.140 "passthru": { 00:21:27.140 "name": "pt2", 00:21:27.140 "base_bdev_name": "malloc2" 00:21:27.140 } 00:21:27.140 } 00:21:27.140 }' 00:21:27.140 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:21:27.140 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.140 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.140 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:27.397 06:37:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.654 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.654 "name": "pt3", 00:21:27.654 "aliases": [ 00:21:27.654 "00000000-0000-0000-0000-000000000003" 00:21:27.654 ], 00:21:27.654 "product_name": "passthru", 00:21:27.654 "block_size": 512, 00:21:27.654 "num_blocks": 65536, 00:21:27.654 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:27.654 "assigned_rate_limits": { 00:21:27.654 "rw_ios_per_sec": 0, 00:21:27.654 "rw_mbytes_per_sec": 0, 00:21:27.654 "r_mbytes_per_sec": 0, 00:21:27.654 "w_mbytes_per_sec": 0 00:21:27.654 }, 00:21:27.654 "claimed": true, 00:21:27.654 "claim_type": "exclusive_write", 00:21:27.654 "zoned": false, 00:21:27.654 "supported_io_types": { 00:21:27.654 "read": true, 00:21:27.654 "write": true, 00:21:27.654 "unmap": true, 00:21:27.654 "flush": true, 00:21:27.654 "reset": true, 00:21:27.654 "nvme_admin": false, 00:21:27.654 "nvme_io": false, 00:21:27.654 "nvme_io_md": false, 00:21:27.654 "write_zeroes": true, 00:21:27.654 "zcopy": true, 00:21:27.654 "get_zone_info": false, 00:21:27.654 "zone_management": false, 00:21:27.654 "zone_append": false, 00:21:27.654 "compare": false, 00:21:27.654 "compare_and_write": false, 00:21:27.654 "abort": true, 00:21:27.654 "seek_hole": false, 00:21:27.654 "seek_data": false, 00:21:27.654 "copy": true, 00:21:27.654 "nvme_iov_md": false 00:21:27.654 }, 00:21:27.654 "memory_domains": [ 00:21:27.654 { 00:21:27.654 "dma_device_id": "system", 00:21:27.654 "dma_device_type": 1 00:21:27.654 }, 00:21:27.654 { 00:21:27.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.654 "dma_device_type": 2 00:21:27.654 } 00:21:27.654 ], 00:21:27.654 "driver_specific": { 00:21:27.654 "passthru": { 00:21:27.654 "name": "pt3", 00:21:27.654 "base_bdev_name": "malloc3" 00:21:27.654 } 00:21:27.654 } 00:21:27.654 }' 00:21:27.654 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.654 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.910 06:37:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.910 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.167 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.167 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:28.167 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:28.167 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:28.167 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:28.167 "name": "pt4", 00:21:28.167 "aliases": [ 00:21:28.167 "00000000-0000-0000-0000-000000000004" 00:21:28.167 ], 00:21:28.167 "product_name": "passthru", 00:21:28.167 "block_size": 512, 00:21:28.167 "num_blocks": 65536, 00:21:28.167 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:28.167 "assigned_rate_limits": { 00:21:28.167 "rw_ios_per_sec": 0, 00:21:28.167 "rw_mbytes_per_sec": 0, 00:21:28.167 "r_mbytes_per_sec": 0, 00:21:28.167 "w_mbytes_per_sec": 0 00:21:28.167 }, 00:21:28.167 "claimed": true, 00:21:28.167 "claim_type": "exclusive_write", 00:21:28.167 "zoned": false, 00:21:28.167 "supported_io_types": { 00:21:28.167 "read": true, 00:21:28.167 "write": true, 00:21:28.167 "unmap": true, 00:21:28.167 "flush": true, 00:21:28.167 "reset": true, 00:21:28.167 "nvme_admin": false, 00:21:28.167 "nvme_io": false, 00:21:28.167 "nvme_io_md": false, 00:21:28.167 "write_zeroes": true, 00:21:28.167 "zcopy": true, 00:21:28.167 "get_zone_info": false, 00:21:28.167 "zone_management": false, 00:21:28.167 "zone_append": false, 00:21:28.167 "compare": false, 00:21:28.167 "compare_and_write": false, 00:21:28.167 "abort": true, 00:21:28.167 "seek_hole": false, 00:21:28.167 "seek_data": false, 00:21:28.167 "copy": true, 00:21:28.167 "nvme_iov_md": false 00:21:28.167 }, 00:21:28.167 "memory_domains": [ 00:21:28.167 { 00:21:28.167 "dma_device_id": "system", 00:21:28.167 "dma_device_type": 1 00:21:28.167 }, 00:21:28.167 { 00:21:28.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:28.167 "dma_device_type": 2 00:21:28.167 } 00:21:28.167 ], 00:21:28.167 "driver_specific": { 00:21:28.167 "passthru": { 00:21:28.167 "name": "pt4", 00:21:28.167 "base_bdev_name": "malloc4" 00:21:28.167 } 00:21:28.167 } 00:21:28.167 }' 00:21:28.167 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.424 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.681 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:28.681 06:37:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.681 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.681 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.681 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:21:28.681 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:28.938 [2024-07-25 06:37:42.273384] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 85a83b9f-9940-4b49-af25-65a337f0c470 '!=' 85a83b9f-9940-4b49-af25-65a337f0c470 ']' 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1179510 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1179510 ']' 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1179510 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1179510 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1179510' 00:21:28.939 killing process with pid 1179510 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1179510 00:21:28.939 [2024-07-25 06:37:42.352157] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:28.939 [2024-07-25 06:37:42.352220] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:28.939 [2024-07-25 06:37:42.352283] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:28.939 [2024-07-25 06:37:42.352294] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x142daf0 name raid_bdev1, state offline 00:21:28.939 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1179510 00:21:28.939 [2024-07-25 
06:37:42.384330] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:29.197 06:37:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:21:29.197 00:21:29.197 real 0m14.393s 00:21:29.197 user 0m26.394s 00:21:29.197 sys 0m2.701s 00:21:29.197 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:29.197 06:37:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.197 ************************************ 00:21:29.197 END TEST raid_superblock_test 00:21:29.197 ************************************ 00:21:29.197 06:37:42 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:21:29.197 06:37:42 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:29.197 06:37:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:29.197 06:37:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:29.197 ************************************ 00:21:29.197 START TEST raid_read_error_test 00:21:29.197 ************************************ 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 read 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:29.197 06:37:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.Nuquva1LIB 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1182216 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1182216 /var/tmp/spdk-raid.sock 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1182216 ']' 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:29.197 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:29.197 06:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.197 [2024-07-25 06:37:42.731152] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
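For reference, the bdevperf launch traced just above boils down to the following shell sketch (paths are shortened to the repository root, and the redirection of bdevperf's output into the mktemp log file is an assumption based on the later grep of that file):
  # start bdevperf as the long-running RPC target for the raid I/O error test
  bdevperf_log=$(mktemp -p /raidtest)
  ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
  raid_pid=$!
  # block until the RPC socket is listening before issuing any rpc.py calls
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock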
00:21:29.197 [2024-07-25 06:37:42.731218] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1182216 ] 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:29.455 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:29.455 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:29.455 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:29.455 [2024-07-25 06:37:42.870170] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:29.455 [2024-07-25 06:37:42.914024] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:29.455 [2024-07-25 06:37:42.966681] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:29.455 [2024-07-25 06:37:42.966713] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:30.384 06:37:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:30.384 06:37:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:30.384 06:37:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:30.384 06:37:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:30.384 BaseBdev1_malloc 00:21:30.384 06:37:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:30.641 true 00:21:30.641 06:37:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:30.897 [2024-07-25 06:37:44.248649] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:30.897 [2024-07-25 06:37:44.248689] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.897 [2024-07-25 06:37:44.248706] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c39a60 00:21:30.897 [2024-07-25 06:37:44.248717] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.897 [2024-07-25 06:37:44.250109] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.897 [2024-07-25 06:37:44.250136] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:30.897 BaseBdev1 00:21:30.897 06:37:44 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:30.897 06:37:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:31.154 BaseBdev2_malloc 00:21:31.154 06:37:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:31.154 true 00:21:31.412 06:37:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:31.412 [2024-07-25 06:37:44.930502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:31.412 [2024-07-25 06:37:44.930541] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.412 [2024-07-25 06:37:44.930559] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3edc0 00:21:31.412 [2024-07-25 06:37:44.930571] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.412 [2024-07-25 06:37:44.931879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.412 [2024-07-25 06:37:44.931906] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:31.412 BaseBdev2 00:21:31.412 06:37:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:31.412 06:37:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:31.669 BaseBdev3_malloc 00:21:31.669 06:37:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:31.927 true 00:21:31.927 06:37:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:32.185 [2024-07-25 06:37:45.620639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:32.185 [2024-07-25 06:37:45.620678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:32.185 [2024-07-25 06:37:45.620695] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3f420 00:21:32.185 [2024-07-25 06:37:45.620706] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:32.185 [2024-07-25 06:37:45.621978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:32.185 [2024-07-25 06:37:45.622003] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:32.185 BaseBdev3 00:21:32.185 06:37:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:32.185 06:37:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:32.442 BaseBdev4_malloc 00:21:32.443 06:37:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:32.700 true 00:21:32.700 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:32.958 [2024-07-25 06:37:46.302512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:32.958 [2024-07-25 06:37:46.302550] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:32.958 [2024-07-25 06:37:46.302568] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c429b0 00:21:32.958 [2024-07-25 06:37:46.302580] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:32.958 [2024-07-25 06:37:46.303893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:32.958 [2024-07-25 06:37:46.303920] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:32.958 BaseBdev4 00:21:32.958 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:33.215 [2024-07-25 06:37:46.535152] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:33.215 [2024-07-25 06:37:46.536249] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:33.215 [2024-07-25 06:37:46.536311] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:33.215 [2024-07-25 06:37:46.536366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:33.215 [2024-07-25 06:37:46.536575] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c42ec0 00:21:33.215 [2024-07-25 06:37:46.536585] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:33.215 [2024-07-25 06:37:46.536749] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a954b0 00:21:33.215 [2024-07-25 06:37:46.536881] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c42ec0 00:21:33.215 [2024-07-25 06:37:46.536891] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c42ec0 00:21:33.215 [2024-07-25 06:37:46.536979] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
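Each base device in this test is a small stack built over a malloc bdev: an error bdev is layered on top so failures can be injected later, and a passthru bdev gives it the BaseBdevN name the RAID volume consumes. A minimal sketch of the sequence traced above, with RPC as shorthand for the rpc.py invocation used throughout the trace:
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # 32 MB malloc bdev with a 512-byte block size, wrapped by an error bdev and a passthru bdev
  $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
  $RPC bdev_error_create BaseBdev1_malloc              # exposes EE_BaseBdev1_malloc
  $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # ...the same three calls are repeated for BaseBdev2, BaseBdev3 and BaseBdev4...
  # assemble the four passthru bdevs into a raid0 volume (64 KiB strip, -s writes a superblock)
  $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
A read failure is later injected with bdev_error_inject_error EE_BaseBdev1_malloc read failure before bdevperf.py perform_tests drives I/O, as the trace below shows.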
00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.215 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.473 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.473 "name": "raid_bdev1", 00:21:33.473 "uuid": "795a7d78-1148-4134-9aca-3361a0fecdcd", 00:21:33.473 "strip_size_kb": 64, 00:21:33.473 "state": "online", 00:21:33.473 "raid_level": "raid0", 00:21:33.473 "superblock": true, 00:21:33.473 "num_base_bdevs": 4, 00:21:33.473 "num_base_bdevs_discovered": 4, 00:21:33.473 "num_base_bdevs_operational": 4, 00:21:33.473 "base_bdevs_list": [ 00:21:33.473 { 00:21:33.473 "name": "BaseBdev1", 00:21:33.473 "uuid": "8af1d4b7-fade-58f2-af2e-024ea72da9be", 00:21:33.473 "is_configured": true, 00:21:33.473 "data_offset": 2048, 00:21:33.473 "data_size": 63488 00:21:33.473 }, 00:21:33.473 { 00:21:33.473 "name": "BaseBdev2", 00:21:33.473 "uuid": "2f03bf45-ada3-501d-aca5-a43d17b875f8", 00:21:33.473 "is_configured": true, 00:21:33.473 "data_offset": 2048, 00:21:33.473 "data_size": 63488 00:21:33.473 }, 00:21:33.473 { 00:21:33.473 "name": "BaseBdev3", 00:21:33.473 "uuid": "405d1094-b75d-53cb-8443-dbc382ad218f", 00:21:33.473 "is_configured": true, 00:21:33.473 "data_offset": 2048, 00:21:33.473 "data_size": 63488 00:21:33.473 }, 00:21:33.473 { 00:21:33.473 "name": "BaseBdev4", 00:21:33.473 "uuid": "80b44ecb-d951-5b3f-af4e-eb9c72411287", 00:21:33.473 "is_configured": true, 00:21:33.473 "data_offset": 2048, 00:21:33.473 "data_size": 63488 00:21:33.473 } 00:21:33.473 ] 00:21:33.473 }' 00:21:33.473 06:37:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.473 06:37:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:34.038 06:37:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:34.038 06:37:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:34.038 [2024-07-25 06:37:47.445784] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c46430 00:21:34.971 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.229 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.487 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.487 "name": "raid_bdev1", 00:21:35.487 "uuid": "795a7d78-1148-4134-9aca-3361a0fecdcd", 00:21:35.487 "strip_size_kb": 64, 00:21:35.487 "state": "online", 00:21:35.487 "raid_level": "raid0", 00:21:35.487 "superblock": true, 00:21:35.487 "num_base_bdevs": 4, 00:21:35.487 "num_base_bdevs_discovered": 4, 00:21:35.487 "num_base_bdevs_operational": 4, 00:21:35.487 "base_bdevs_list": [ 00:21:35.487 { 00:21:35.487 "name": "BaseBdev1", 00:21:35.487 "uuid": "8af1d4b7-fade-58f2-af2e-024ea72da9be", 00:21:35.487 "is_configured": true, 00:21:35.487 "data_offset": 2048, 00:21:35.487 "data_size": 63488 00:21:35.487 }, 00:21:35.487 { 00:21:35.487 "name": "BaseBdev2", 00:21:35.487 "uuid": "2f03bf45-ada3-501d-aca5-a43d17b875f8", 00:21:35.487 "is_configured": true, 00:21:35.487 "data_offset": 2048, 00:21:35.487 "data_size": 63488 00:21:35.487 }, 00:21:35.487 { 00:21:35.487 "name": "BaseBdev3", 00:21:35.487 "uuid": "405d1094-b75d-53cb-8443-dbc382ad218f", 00:21:35.487 "is_configured": true, 00:21:35.487 "data_offset": 2048, 00:21:35.487 "data_size": 63488 00:21:35.487 }, 00:21:35.487 { 00:21:35.487 "name": "BaseBdev4", 00:21:35.487 "uuid": "80b44ecb-d951-5b3f-af4e-eb9c72411287", 00:21:35.487 "is_configured": true, 00:21:35.487 "data_offset": 2048, 00:21:35.487 "data_size": 63488 00:21:35.487 } 00:21:35.487 ] 00:21:35.487 }' 00:21:35.487 06:37:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.487 06:37:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:36.053 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:36.053 [2024-07-25 06:37:49.591628] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:36.053 [2024-07-25 06:37:49.591660] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:36.053 [2024-07-25 06:37:49.594575] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:36.053 [2024-07-25 06:37:49.594614] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:36.053 [2024-07-25 06:37:49.594650] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:36.053 [2024-07-25 06:37:49.594660] bdev_raid.c: 378:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1c42ec0 name raid_bdev1, state offline 00:21:36.053 0 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1182216 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1182216 ']' 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1182216 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1182216 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1182216' 00:21:36.311 killing process with pid 1182216 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1182216 00:21:36.311 [2024-07-25 06:37:49.668033] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:36.311 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1182216 00:21:36.311 [2024-07-25 06:37:49.694549] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.Nuquva1LIB 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:21:36.570 00:21:36.570 real 0m7.236s 00:21:36.570 user 0m11.500s 00:21:36.570 sys 0m1.307s 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:36.570 06:37:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:36.570 ************************************ 00:21:36.570 END TEST raid_read_error_test 00:21:36.570 ************************************ 00:21:36.570 06:37:49 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:21:36.570 06:37:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:36.570 06:37:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:36.570 06:37:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:36.570 ************************************ 00:21:36.570 START TEST raid_write_error_test 00:21:36.570 ************************************ 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 write 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local 
raid_level=raid0 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.wIcTL87R8F 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1183629 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1183629 /var/tmp/spdk-raid.sock 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1183629 ']' 
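For orientation, the loop that follows builds a three-layer stack for each base device before the raid0 volume is assembled: a malloc disk, an error bdev layered on it, and a passthru bdev exposing the name the raid volume consumes, so that I/O failures can later be injected on EE_BaseBdev1_malloc. A minimal sketch of one iteration, using the RPC commands recorded in this trace (the rpc.py path is shortened here):

    rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc             # backing malloc disk
    $rpc bdev_error_create BaseBdev1_malloc                        # wraps it as EE_BaseBdev1_malloc
    $rpc bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1  # name later passed to bdev_raid_create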
00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:36.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:36.570 06:37:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:36.570 [2024-07-25 06:37:50.053171] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:21:36.571 [2024-07-25 06:37:50.053231] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1183629 ] 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.0 cannot 
be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:36.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:36.829 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:36.829 [2024-07-25 06:37:50.191044] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:36.829 [2024-07-25 06:37:50.235798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:36.829 [2024-07-25 06:37:50.296023] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:36.829 [2024-07-25 06:37:50.296060] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:37.394 06:37:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:37.394 06:37:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:37.394 06:37:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:37.394 06:37:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:37.651 BaseBdev1_malloc 00:21:37.651 06:37:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:37.910 true 00:21:37.910 06:37:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:38.203 [2024-07-25 06:37:51.506985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:38.203 [2024-07-25 06:37:51.507029] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.203 [2024-07-25 06:37:51.507047] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a23a60 00:21:38.203 [2024-07-25 06:37:51.507058] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.203 [2024-07-25 06:37:51.508527] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.204 [2024-07-25 06:37:51.508555] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:38.204 BaseBdev1 00:21:38.204 06:37:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:38.204 06:37:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:38.204 BaseBdev2_malloc 00:21:38.204 06:37:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:38.461 true 00:21:38.461 06:37:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:38.461 [2024-07-25 06:37:51.992366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:38.461 [2024-07-25 06:37:51.992403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.461 [2024-07-25 06:37:51.992421] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a28dc0 00:21:38.461 [2024-07-25 06:37:51.992433] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.461 [2024-07-25 06:37:51.993695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.461 [2024-07-25 06:37:51.993719] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:38.461 BaseBdev2 00:21:38.461 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:38.461 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:38.718 BaseBdev3_malloc 00:21:38.718 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:38.975 true 00:21:38.975 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:38.975 [2024-07-25 06:37:52.473701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:38.975 [2024-07-25 06:37:52.473736] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.975 [2024-07-25 06:37:52.473752] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a29420 00:21:38.975 [2024-07-25 06:37:52.473763] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.975 [2024-07-25 06:37:52.475013] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.975 [2024-07-25 06:37:52.475038] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:38.975 BaseBdev3 00:21:38.975 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:38.975 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:39.232 BaseBdev4_malloc 00:21:39.232 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:39.488 true 00:21:39.488 06:37:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:39.488 [2024-07-25 06:37:53.035325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:39.489 [2024-07-25 06:37:53.035361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.489 [2024-07-25 06:37:53.035379] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a2c9b0 00:21:39.489 [2024-07-25 06:37:53.035395] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.489 [2024-07-25 06:37:53.036659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.489 [2024-07-25 06:37:53.036685] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:39.489 BaseBdev4 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:39.746 [2024-07-25 06:37:53.263951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:39.746 [2024-07-25 06:37:53.264993] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:39.746 [2024-07-25 06:37:53.265053] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:39.746 [2024-07-25 06:37:53.265107] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:39.746 [2024-07-25 06:37:53.265322] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a2cec0 00:21:39.746 [2024-07-25 06:37:53.265332] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:39.746 [2024-07-25 06:37:53.265489] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x187f4b0 00:21:39.746 [2024-07-25 06:37:53.265618] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a2cec0 00:21:39.746 [2024-07-25 06:37:53.265627] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a2cec0 00:21:39.746 [2024-07-25 06:37:53.265713] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.746 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.004 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.004 "name": "raid_bdev1", 00:21:40.004 "uuid": "57a7c47d-1517-424e-bd0c-88269fecb584", 00:21:40.004 "strip_size_kb": 64, 00:21:40.004 "state": "online", 00:21:40.004 "raid_level": "raid0", 00:21:40.004 "superblock": true, 00:21:40.004 "num_base_bdevs": 4, 00:21:40.004 "num_base_bdevs_discovered": 4, 00:21:40.004 "num_base_bdevs_operational": 4, 00:21:40.004 "base_bdevs_list": [ 00:21:40.004 { 00:21:40.004 "name": "BaseBdev1", 00:21:40.004 "uuid": "07855ee4-22bb-5128-aacd-03186653d9f5", 00:21:40.004 "is_configured": true, 00:21:40.004 "data_offset": 2048, 00:21:40.004 "data_size": 63488 00:21:40.004 }, 00:21:40.004 { 00:21:40.004 "name": "BaseBdev2", 00:21:40.004 "uuid": "5f894966-b62f-5baf-a6fa-2845e1911023", 00:21:40.004 "is_configured": true, 00:21:40.004 "data_offset": 2048, 00:21:40.004 "data_size": 63488 00:21:40.004 }, 00:21:40.004 { 00:21:40.004 "name": "BaseBdev3", 00:21:40.004 "uuid": "2ad02221-66cb-5edf-b7af-6e72ae1ebcb7", 00:21:40.004 "is_configured": true, 00:21:40.004 "data_offset": 2048, 00:21:40.004 "data_size": 63488 00:21:40.004 }, 00:21:40.004 { 00:21:40.004 "name": "BaseBdev4", 00:21:40.004 "uuid": "b8981796-fdcf-54fa-93cd-3fee275a2d0a", 00:21:40.004 "is_configured": true, 00:21:40.004 "data_offset": 2048, 00:21:40.004 "data_size": 63488 00:21:40.004 } 00:21:40.004 ] 00:21:40.004 }' 00:21:40.004 06:37:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.004 06:37:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.568 06:37:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:40.568 06:37:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:40.825 [2024-07-25 06:37:54.294914] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a30430 00:21:41.756 06:37:55 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.321 "name": "raid_bdev1", 00:21:42.321 "uuid": "57a7c47d-1517-424e-bd0c-88269fecb584", 00:21:42.321 "strip_size_kb": 64, 00:21:42.321 "state": "online", 00:21:42.321 "raid_level": "raid0", 00:21:42.321 "superblock": true, 00:21:42.321 "num_base_bdevs": 4, 00:21:42.321 "num_base_bdevs_discovered": 4, 00:21:42.321 "num_base_bdevs_operational": 4, 00:21:42.321 "base_bdevs_list": [ 00:21:42.321 { 00:21:42.321 "name": "BaseBdev1", 00:21:42.321 "uuid": "07855ee4-22bb-5128-aacd-03186653d9f5", 00:21:42.321 "is_configured": true, 00:21:42.321 "data_offset": 2048, 00:21:42.321 "data_size": 63488 00:21:42.321 }, 00:21:42.321 { 00:21:42.321 "name": "BaseBdev2", 00:21:42.321 "uuid": "5f894966-b62f-5baf-a6fa-2845e1911023", 00:21:42.321 "is_configured": true, 00:21:42.321 "data_offset": 2048, 00:21:42.321 "data_size": 63488 00:21:42.321 }, 00:21:42.321 { 00:21:42.321 "name": "BaseBdev3", 00:21:42.321 "uuid": "2ad02221-66cb-5edf-b7af-6e72ae1ebcb7", 00:21:42.321 "is_configured": true, 00:21:42.321 "data_offset": 2048, 00:21:42.321 "data_size": 63488 00:21:42.321 }, 00:21:42.321 { 00:21:42.321 "name": "BaseBdev4", 00:21:42.321 "uuid": "b8981796-fdcf-54fa-93cd-3fee275a2d0a", 00:21:42.321 "is_configured": true, 00:21:42.321 "data_offset": 2048, 00:21:42.321 "data_size": 63488 00:21:42.321 } 00:21:42.321 ] 00:21:42.321 }' 00:21:42.321 06:37:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.321 06:37:55 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.886 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:43.143 [2024-07-25 06:37:56.567040] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:43.143 [2024-07-25 06:37:56.567082] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:43.143 [2024-07-25 06:37:56.570072] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:43.143 [2024-07-25 06:37:56.570112] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:43.143 [2024-07-25 06:37:56.570152] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:43.143 [2024-07-25 06:37:56.570162] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a2cec0 name raid_bdev1, state offline 00:21:43.143 0 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1183629 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1183629 ']' 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1183629 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1183629 00:21:43.143 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:43.144 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:43.144 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1183629' 00:21:43.144 killing process with pid 1183629 00:21:43.144 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1183629 00:21:43.144 [2024-07-25 06:37:56.643905] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:43.144 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1183629 00:21:43.144 [2024-07-25 06:37:56.670160] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:43.402 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.wIcTL87R8F 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.44 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.44 != \0\.\0\0 ]] 00:21:43.403 00:21:43.403 real 0m6.889s 00:21:43.403 user 0m10.962s 00:21:43.403 sys 0m1.267s 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:21:43.403 06:37:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.403 ************************************ 00:21:43.403 END TEST raid_write_error_test 00:21:43.403 ************************************ 00:21:43.403 06:37:56 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:21:43.403 06:37:56 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:21:43.403 06:37:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:43.403 06:37:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:43.403 06:37:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:43.403 ************************************ 00:21:43.403 START TEST raid_state_function_test 00:21:43.403 ************************************ 00:21:43.403 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 false 00:21:43.403 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:21:43.403 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:43.403 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:43.403 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 
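Condensed, the tail of the write-error test mirrors the read-error variant above: drive I/O through bdevperf, inject a write failure into the first base device's error bdev, delete the raid bdev, and require a non-zero per-second failure rate because raid0 carries no redundancy. A sketch of that sequence; the exact ordering and process management live in bdev_raid.sh, and the failure column is taken from the awk field used in the trace:

    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
    fail_per_s=$(grep -v Job /raidtest/tmp.wIcTL87R8F | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != "0.00" ]]   # raid0 cannot absorb the injected error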
00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1184799 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1184799' 00:21:43.661 Process raid pid: 1184799 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1184799 /var/tmp/spdk-raid.sock 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1184799 ']' 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:43.661 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:43.662 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:43.662 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:43.662 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:43.662 06:37:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.662 [2024-07-25 06:37:57.026594] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
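The state-function test starts a bare bdev_svc application on its own RPC socket and waits for it to listen before issuing any raid RPCs. A rough equivalent of that startup gate (waitforlisten's internals are assumed here; rpc_get_methods is only used as a cheap liveness probe):

    ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    until ./scripts/rpc.py -s /var/tmp/spdk-raid.sock -t 1 rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done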
00:21:43.662 [2024-07-25 06:37:57.026654] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.0 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.1 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.2 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.3 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.4 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.5 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.6 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:01.7 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.0 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.1 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.2 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.3 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.4 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.5 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.6 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3d:02.7 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.0 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.1 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.2 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.3 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.4 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.5 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.6 cannot be used 00:21:43.662 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:01.7 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.0 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.1 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.2 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.3 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.4 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.5 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.6 cannot be used 00:21:43.662 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:21:43.662 EAL: Requested device 0000:3f:02.7 cannot be used 00:21:43.662 [2024-07-25 06:37:57.164987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:43.662 [2024-07-25 06:37:57.207836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:43.920 [2024-07-25 06:37:57.260778] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.920 [2024-07-25 06:37:57.260821] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:44.484 06:37:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:44.484 06:37:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:21:44.484 06:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:44.741 [2024-07-25 06:37:58.080343] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:44.741 [2024-07-25 06:37:58.080382] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:44.741 [2024-07-25 06:37:58.080392] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:44.741 [2024-07-25 06:37:58.080402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:44.741 [2024-07-25 06:37:58.080410] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:44.741 [2024-07-25 06:37:58.080420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:44.741 [2024-07-25 06:37:58.080428] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:44.741 [2024-07-25 06:37:58.080438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.741 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.742 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.742 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.742 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.742 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.001 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.001 "name": "Existed_Raid", 00:21:45.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.001 "strip_size_kb": 64, 00:21:45.001 "state": "configuring", 00:21:45.001 "raid_level": "concat", 00:21:45.001 "superblock": false, 00:21:45.001 "num_base_bdevs": 4, 00:21:45.001 "num_base_bdevs_discovered": 0, 00:21:45.001 "num_base_bdevs_operational": 4, 00:21:45.001 "base_bdevs_list": [ 00:21:45.001 { 00:21:45.001 "name": "BaseBdev1", 00:21:45.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.001 "is_configured": false, 00:21:45.001 "data_offset": 0, 00:21:45.001 "data_size": 0 00:21:45.001 }, 00:21:45.001 { 00:21:45.001 "name": "BaseBdev2", 00:21:45.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.001 "is_configured": false, 00:21:45.001 "data_offset": 0, 00:21:45.001 "data_size": 0 00:21:45.001 }, 00:21:45.001 { 00:21:45.001 "name": "BaseBdev3", 00:21:45.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.001 "is_configured": false, 00:21:45.001 "data_offset": 0, 00:21:45.001 "data_size": 0 00:21:45.001 }, 00:21:45.001 { 00:21:45.001 "name": "BaseBdev4", 00:21:45.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.001 "is_configured": false, 00:21:45.001 "data_offset": 0, 00:21:45.001 "data_size": 0 00:21:45.001 } 00:21:45.001 ] 00:21:45.001 }' 00:21:45.001 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.001 06:37:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:45.565 06:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:45.823 [2024-07-25 06:37:59.126969] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:45.823 [2024-07-25 06:37:59.126998] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c0470 name Existed_Raid, state configuring 00:21:45.823 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:45.823 
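After each configuration step, verify_raid_bdev_state reads the volume back over the same RPC socket and compares fields of the JSON blocks recorded in this trace. The essence of that check, using the bdev_raid_get_bdevs/jq pipeline shown above:

    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'
    # expected: "configuring" while base bdevs are still missing,
    # "online" once all four BaseBdevN exist and have been claimed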
[2024-07-25 06:37:59.355582] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:45.823 [2024-07-25 06:37:59.355606] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:45.823 [2024-07-25 06:37:59.355615] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:45.823 [2024-07-25 06:37:59.355625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:45.823 [2024-07-25 06:37:59.355633] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:45.823 [2024-07-25 06:37:59.355643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:45.823 [2024-07-25 06:37:59.355651] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:45.823 [2024-07-25 06:37:59.355660] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:45.823 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:46.081 [2024-07-25 06:37:59.593610] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:46.081 BaseBdev1 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:46.081 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:46.338 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:46.596 [ 00:21:46.596 { 00:21:46.596 "name": "BaseBdev1", 00:21:46.596 "aliases": [ 00:21:46.596 "57b8321a-f21e-4f42-b4b6-0a55d174dff6" 00:21:46.596 ], 00:21:46.596 "product_name": "Malloc disk", 00:21:46.596 "block_size": 512, 00:21:46.596 "num_blocks": 65536, 00:21:46.596 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:46.596 "assigned_rate_limits": { 00:21:46.596 "rw_ios_per_sec": 0, 00:21:46.596 "rw_mbytes_per_sec": 0, 00:21:46.596 "r_mbytes_per_sec": 0, 00:21:46.596 "w_mbytes_per_sec": 0 00:21:46.596 }, 00:21:46.596 "claimed": true, 00:21:46.596 "claim_type": "exclusive_write", 00:21:46.596 "zoned": false, 00:21:46.596 "supported_io_types": { 00:21:46.596 "read": true, 00:21:46.596 "write": true, 00:21:46.596 "unmap": true, 00:21:46.596 "flush": true, 00:21:46.596 "reset": true, 00:21:46.596 "nvme_admin": false, 00:21:46.596 "nvme_io": false, 00:21:46.596 "nvme_io_md": false, 00:21:46.596 "write_zeroes": true, 00:21:46.596 "zcopy": true, 00:21:46.596 "get_zone_info": false, 00:21:46.596 "zone_management": false, 00:21:46.596 
"zone_append": false, 00:21:46.596 "compare": false, 00:21:46.596 "compare_and_write": false, 00:21:46.596 "abort": true, 00:21:46.596 "seek_hole": false, 00:21:46.596 "seek_data": false, 00:21:46.596 "copy": true, 00:21:46.596 "nvme_iov_md": false 00:21:46.596 }, 00:21:46.596 "memory_domains": [ 00:21:46.596 { 00:21:46.596 "dma_device_id": "system", 00:21:46.596 "dma_device_type": 1 00:21:46.596 }, 00:21:46.596 { 00:21:46.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.596 "dma_device_type": 2 00:21:46.596 } 00:21:46.596 ], 00:21:46.596 "driver_specific": {} 00:21:46.596 } 00:21:46.596 ] 00:21:46.596 06:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.597 06:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:46.854 06:38:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.854 "name": "Existed_Raid", 00:21:46.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.854 "strip_size_kb": 64, 00:21:46.854 "state": "configuring", 00:21:46.854 "raid_level": "concat", 00:21:46.854 "superblock": false, 00:21:46.854 "num_base_bdevs": 4, 00:21:46.854 "num_base_bdevs_discovered": 1, 00:21:46.854 "num_base_bdevs_operational": 4, 00:21:46.854 "base_bdevs_list": [ 00:21:46.854 { 00:21:46.854 "name": "BaseBdev1", 00:21:46.854 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:46.854 "is_configured": true, 00:21:46.854 "data_offset": 0, 00:21:46.854 "data_size": 65536 00:21:46.854 }, 00:21:46.854 { 00:21:46.854 "name": "BaseBdev2", 00:21:46.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.854 "is_configured": false, 00:21:46.854 "data_offset": 0, 00:21:46.854 "data_size": 0 00:21:46.854 }, 00:21:46.854 { 00:21:46.854 "name": "BaseBdev3", 00:21:46.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.854 "is_configured": false, 00:21:46.854 "data_offset": 0, 00:21:46.854 "data_size": 0 00:21:46.854 }, 00:21:46.854 { 00:21:46.854 "name": "BaseBdev4", 00:21:46.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.854 "is_configured": false, 00:21:46.854 "data_offset": 0, 
00:21:46.854 "data_size": 0 00:21:46.854 } 00:21:46.854 ] 00:21:46.854 }' 00:21:46.854 06:38:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.854 06:38:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:47.419 06:38:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:47.419 [2024-07-25 06:38:00.885006] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:47.419 [2024-07-25 06:38:00.885043] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26bfce0 name Existed_Raid, state configuring 00:21:47.419 06:38:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:47.676 [2024-07-25 06:38:01.061517] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:47.676 [2024-07-25 06:38:01.062897] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:47.676 [2024-07-25 06:38:01.062931] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:47.676 [2024-07-25 06:38:01.062940] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:47.676 [2024-07-25 06:38:01.062950] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:47.676 [2024-07-25 06:38:01.062959] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:47.676 [2024-07-25 06:38:01.062969] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.676 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.677 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.677 06:38:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.934 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.934 "name": "Existed_Raid", 00:21:47.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.934 "strip_size_kb": 64, 00:21:47.934 "state": "configuring", 00:21:47.934 "raid_level": "concat", 00:21:47.934 "superblock": false, 00:21:47.934 "num_base_bdevs": 4, 00:21:47.934 "num_base_bdevs_discovered": 1, 00:21:47.934 "num_base_bdevs_operational": 4, 00:21:47.934 "base_bdevs_list": [ 00:21:47.934 { 00:21:47.934 "name": "BaseBdev1", 00:21:47.934 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:47.934 "is_configured": true, 00:21:47.934 "data_offset": 0, 00:21:47.934 "data_size": 65536 00:21:47.934 }, 00:21:47.934 { 00:21:47.934 "name": "BaseBdev2", 00:21:47.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.934 "is_configured": false, 00:21:47.934 "data_offset": 0, 00:21:47.934 "data_size": 0 00:21:47.934 }, 00:21:47.934 { 00:21:47.934 "name": "BaseBdev3", 00:21:47.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.934 "is_configured": false, 00:21:47.934 "data_offset": 0, 00:21:47.934 "data_size": 0 00:21:47.934 }, 00:21:47.934 { 00:21:47.934 "name": "BaseBdev4", 00:21:47.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.934 "is_configured": false, 00:21:47.934 "data_offset": 0, 00:21:47.934 "data_size": 0 00:21:47.934 } 00:21:47.934 ] 00:21:47.934 }' 00:21:47.934 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.934 06:38:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.499 06:38:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:48.757 [2024-07-25 06:38:02.107374] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:48.757 BaseBdev2 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:48.757 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:49.014 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:49.014 [ 00:21:49.014 { 00:21:49.014 "name": "BaseBdev2", 00:21:49.014 "aliases": [ 00:21:49.014 "c51da5e7-5d09-43e7-b587-6b6304c1370a" 00:21:49.014 ], 00:21:49.014 "product_name": "Malloc disk", 00:21:49.014 "block_size": 512, 00:21:49.014 "num_blocks": 65536, 00:21:49.014 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:49.014 
"assigned_rate_limits": { 00:21:49.014 "rw_ios_per_sec": 0, 00:21:49.014 "rw_mbytes_per_sec": 0, 00:21:49.015 "r_mbytes_per_sec": 0, 00:21:49.015 "w_mbytes_per_sec": 0 00:21:49.015 }, 00:21:49.015 "claimed": true, 00:21:49.015 "claim_type": "exclusive_write", 00:21:49.015 "zoned": false, 00:21:49.015 "supported_io_types": { 00:21:49.015 "read": true, 00:21:49.015 "write": true, 00:21:49.015 "unmap": true, 00:21:49.015 "flush": true, 00:21:49.015 "reset": true, 00:21:49.015 "nvme_admin": false, 00:21:49.015 "nvme_io": false, 00:21:49.015 "nvme_io_md": false, 00:21:49.015 "write_zeroes": true, 00:21:49.015 "zcopy": true, 00:21:49.015 "get_zone_info": false, 00:21:49.015 "zone_management": false, 00:21:49.015 "zone_append": false, 00:21:49.015 "compare": false, 00:21:49.015 "compare_and_write": false, 00:21:49.015 "abort": true, 00:21:49.015 "seek_hole": false, 00:21:49.015 "seek_data": false, 00:21:49.015 "copy": true, 00:21:49.015 "nvme_iov_md": false 00:21:49.015 }, 00:21:49.015 "memory_domains": [ 00:21:49.015 { 00:21:49.015 "dma_device_id": "system", 00:21:49.015 "dma_device_type": 1 00:21:49.015 }, 00:21:49.015 { 00:21:49.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.015 "dma_device_type": 2 00:21:49.015 } 00:21:49.015 ], 00:21:49.015 "driver_specific": {} 00:21:49.015 } 00:21:49.015 ] 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.272 "name": "Existed_Raid", 00:21:49.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.272 "strip_size_kb": 64, 00:21:49.272 "state": "configuring", 00:21:49.272 "raid_level": "concat", 00:21:49.272 "superblock": false, 00:21:49.272 "num_base_bdevs": 4, 00:21:49.272 "num_base_bdevs_discovered": 2, 
00:21:49.272 "num_base_bdevs_operational": 4, 00:21:49.272 "base_bdevs_list": [ 00:21:49.272 { 00:21:49.272 "name": "BaseBdev1", 00:21:49.272 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:49.272 "is_configured": true, 00:21:49.272 "data_offset": 0, 00:21:49.272 "data_size": 65536 00:21:49.272 }, 00:21:49.272 { 00:21:49.272 "name": "BaseBdev2", 00:21:49.272 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:49.272 "is_configured": true, 00:21:49.272 "data_offset": 0, 00:21:49.272 "data_size": 65536 00:21:49.272 }, 00:21:49.272 { 00:21:49.272 "name": "BaseBdev3", 00:21:49.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.272 "is_configured": false, 00:21:49.272 "data_offset": 0, 00:21:49.272 "data_size": 0 00:21:49.272 }, 00:21:49.272 { 00:21:49.272 "name": "BaseBdev4", 00:21:49.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.272 "is_configured": false, 00:21:49.272 "data_offset": 0, 00:21:49.272 "data_size": 0 00:21:49.272 } 00:21:49.272 ] 00:21:49.272 }' 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.272 06:38:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.836 06:38:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:50.093 [2024-07-25 06:38:03.590512] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:50.093 BaseBdev3 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:50.093 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:50.349 06:38:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:50.606 [ 00:21:50.606 { 00:21:50.606 "name": "BaseBdev3", 00:21:50.606 "aliases": [ 00:21:50.606 "970578d8-9d97-44a8-94c5-099b56e890ee" 00:21:50.606 ], 00:21:50.606 "product_name": "Malloc disk", 00:21:50.606 "block_size": 512, 00:21:50.606 "num_blocks": 65536, 00:21:50.606 "uuid": "970578d8-9d97-44a8-94c5-099b56e890ee", 00:21:50.606 "assigned_rate_limits": { 00:21:50.606 "rw_ios_per_sec": 0, 00:21:50.606 "rw_mbytes_per_sec": 0, 00:21:50.606 "r_mbytes_per_sec": 0, 00:21:50.606 "w_mbytes_per_sec": 0 00:21:50.606 }, 00:21:50.606 "claimed": true, 00:21:50.606 "claim_type": "exclusive_write", 00:21:50.606 "zoned": false, 00:21:50.606 "supported_io_types": { 00:21:50.606 "read": true, 00:21:50.606 "write": true, 00:21:50.606 "unmap": true, 00:21:50.606 "flush": true, 00:21:50.606 "reset": true, 00:21:50.606 "nvme_admin": false, 00:21:50.606 "nvme_io": false, 00:21:50.606 
"nvme_io_md": false, 00:21:50.606 "write_zeroes": true, 00:21:50.606 "zcopy": true, 00:21:50.606 "get_zone_info": false, 00:21:50.607 "zone_management": false, 00:21:50.607 "zone_append": false, 00:21:50.607 "compare": false, 00:21:50.607 "compare_and_write": false, 00:21:50.607 "abort": true, 00:21:50.607 "seek_hole": false, 00:21:50.607 "seek_data": false, 00:21:50.607 "copy": true, 00:21:50.607 "nvme_iov_md": false 00:21:50.607 }, 00:21:50.607 "memory_domains": [ 00:21:50.607 { 00:21:50.607 "dma_device_id": "system", 00:21:50.607 "dma_device_type": 1 00:21:50.607 }, 00:21:50.607 { 00:21:50.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.607 "dma_device_type": 2 00:21:50.607 } 00:21:50.607 ], 00:21:50.607 "driver_specific": {} 00:21:50.607 } 00:21:50.607 ] 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.607 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.864 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.864 "name": "Existed_Raid", 00:21:50.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.864 "strip_size_kb": 64, 00:21:50.865 "state": "configuring", 00:21:50.865 "raid_level": "concat", 00:21:50.865 "superblock": false, 00:21:50.865 "num_base_bdevs": 4, 00:21:50.865 "num_base_bdevs_discovered": 3, 00:21:50.865 "num_base_bdevs_operational": 4, 00:21:50.865 "base_bdevs_list": [ 00:21:50.865 { 00:21:50.865 "name": "BaseBdev1", 00:21:50.865 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:50.865 "is_configured": true, 00:21:50.865 "data_offset": 0, 00:21:50.865 "data_size": 65536 00:21:50.865 }, 00:21:50.865 { 00:21:50.865 "name": "BaseBdev2", 00:21:50.865 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:50.865 "is_configured": true, 00:21:50.865 "data_offset": 0, 00:21:50.865 "data_size": 65536 00:21:50.865 }, 00:21:50.865 { 
00:21:50.865 "name": "BaseBdev3", 00:21:50.865 "uuid": "970578d8-9d97-44a8-94c5-099b56e890ee", 00:21:50.865 "is_configured": true, 00:21:50.865 "data_offset": 0, 00:21:50.865 "data_size": 65536 00:21:50.865 }, 00:21:50.865 { 00:21:50.865 "name": "BaseBdev4", 00:21:50.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.865 "is_configured": false, 00:21:50.865 "data_offset": 0, 00:21:50.865 "data_size": 0 00:21:50.865 } 00:21:50.865 ] 00:21:50.865 }' 00:21:50.865 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.865 06:38:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.430 06:38:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:51.688 [2024-07-25 06:38:05.077667] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:51.688 [2024-07-25 06:38:05.077698] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2873250 00:21:51.688 [2024-07-25 06:38:05.077706] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:21:51.688 [2024-07-25 06:38:05.077885] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2867cc0 00:21:51.688 [2024-07-25 06:38:05.078001] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2873250 00:21:51.688 [2024-07-25 06:38:05.078010] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2873250 00:21:51.688 [2024-07-25 06:38:05.078163] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.688 BaseBdev4 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:51.688 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:51.944 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:52.231 [ 00:21:52.231 { 00:21:52.231 "name": "BaseBdev4", 00:21:52.231 "aliases": [ 00:21:52.231 "3f57296b-322f-4767-8a21-1f09bb61ef11" 00:21:52.231 ], 00:21:52.231 "product_name": "Malloc disk", 00:21:52.231 "block_size": 512, 00:21:52.231 "num_blocks": 65536, 00:21:52.231 "uuid": "3f57296b-322f-4767-8a21-1f09bb61ef11", 00:21:52.231 "assigned_rate_limits": { 00:21:52.231 "rw_ios_per_sec": 0, 00:21:52.231 "rw_mbytes_per_sec": 0, 00:21:52.231 "r_mbytes_per_sec": 0, 00:21:52.231 "w_mbytes_per_sec": 0 00:21:52.231 }, 00:21:52.231 "claimed": true, 00:21:52.231 "claim_type": "exclusive_write", 00:21:52.231 "zoned": false, 00:21:52.231 "supported_io_types": { 
00:21:52.231 "read": true, 00:21:52.231 "write": true, 00:21:52.231 "unmap": true, 00:21:52.231 "flush": true, 00:21:52.231 "reset": true, 00:21:52.231 "nvme_admin": false, 00:21:52.231 "nvme_io": false, 00:21:52.231 "nvme_io_md": false, 00:21:52.231 "write_zeroes": true, 00:21:52.231 "zcopy": true, 00:21:52.231 "get_zone_info": false, 00:21:52.231 "zone_management": false, 00:21:52.231 "zone_append": false, 00:21:52.231 "compare": false, 00:21:52.231 "compare_and_write": false, 00:21:52.231 "abort": true, 00:21:52.231 "seek_hole": false, 00:21:52.231 "seek_data": false, 00:21:52.231 "copy": true, 00:21:52.231 "nvme_iov_md": false 00:21:52.231 }, 00:21:52.231 "memory_domains": [ 00:21:52.231 { 00:21:52.231 "dma_device_id": "system", 00:21:52.231 "dma_device_type": 1 00:21:52.231 }, 00:21:52.231 { 00:21:52.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.231 "dma_device_type": 2 00:21:52.231 } 00:21:52.231 ], 00:21:52.231 "driver_specific": {} 00:21:52.231 } 00:21:52.231 ] 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.231 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.232 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.232 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.232 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.516 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.516 "name": "Existed_Raid", 00:21:52.516 "uuid": "3d60b239-4881-476f-a373-ecd6df0b54f6", 00:21:52.516 "strip_size_kb": 64, 00:21:52.516 "state": "online", 00:21:52.516 "raid_level": "concat", 00:21:52.516 "superblock": false, 00:21:52.516 "num_base_bdevs": 4, 00:21:52.516 "num_base_bdevs_discovered": 4, 00:21:52.516 "num_base_bdevs_operational": 4, 00:21:52.516 "base_bdevs_list": [ 00:21:52.516 { 00:21:52.516 "name": "BaseBdev1", 00:21:52.516 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:52.516 "is_configured": true, 00:21:52.516 "data_offset": 0, 00:21:52.516 "data_size": 65536 00:21:52.516 }, 00:21:52.516 { 00:21:52.516 "name": 
"BaseBdev2", 00:21:52.516 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:52.516 "is_configured": true, 00:21:52.516 "data_offset": 0, 00:21:52.516 "data_size": 65536 00:21:52.516 }, 00:21:52.516 { 00:21:52.516 "name": "BaseBdev3", 00:21:52.516 "uuid": "970578d8-9d97-44a8-94c5-099b56e890ee", 00:21:52.516 "is_configured": true, 00:21:52.516 "data_offset": 0, 00:21:52.516 "data_size": 65536 00:21:52.516 }, 00:21:52.516 { 00:21:52.516 "name": "BaseBdev4", 00:21:52.517 "uuid": "3f57296b-322f-4767-8a21-1f09bb61ef11", 00:21:52.517 "is_configured": true, 00:21:52.517 "data_offset": 0, 00:21:52.517 "data_size": 65536 00:21:52.517 } 00:21:52.517 ] 00:21:52.517 }' 00:21:52.517 06:38:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.517 06:38:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:53.081 [2024-07-25 06:38:06.569924] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:53.081 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:53.081 "name": "Existed_Raid", 00:21:53.081 "aliases": [ 00:21:53.081 "3d60b239-4881-476f-a373-ecd6df0b54f6" 00:21:53.081 ], 00:21:53.081 "product_name": "Raid Volume", 00:21:53.081 "block_size": 512, 00:21:53.081 "num_blocks": 262144, 00:21:53.081 "uuid": "3d60b239-4881-476f-a373-ecd6df0b54f6", 00:21:53.081 "assigned_rate_limits": { 00:21:53.081 "rw_ios_per_sec": 0, 00:21:53.081 "rw_mbytes_per_sec": 0, 00:21:53.081 "r_mbytes_per_sec": 0, 00:21:53.081 "w_mbytes_per_sec": 0 00:21:53.081 }, 00:21:53.081 "claimed": false, 00:21:53.081 "zoned": false, 00:21:53.081 "supported_io_types": { 00:21:53.081 "read": true, 00:21:53.081 "write": true, 00:21:53.081 "unmap": true, 00:21:53.081 "flush": true, 00:21:53.081 "reset": true, 00:21:53.081 "nvme_admin": false, 00:21:53.081 "nvme_io": false, 00:21:53.081 "nvme_io_md": false, 00:21:53.081 "write_zeroes": true, 00:21:53.081 "zcopy": false, 00:21:53.081 "get_zone_info": false, 00:21:53.081 "zone_management": false, 00:21:53.081 "zone_append": false, 00:21:53.081 "compare": false, 00:21:53.081 "compare_and_write": false, 00:21:53.081 "abort": false, 00:21:53.081 "seek_hole": false, 00:21:53.081 "seek_data": false, 00:21:53.081 "copy": false, 00:21:53.081 "nvme_iov_md": false 00:21:53.081 }, 00:21:53.081 "memory_domains": [ 00:21:53.081 { 00:21:53.081 "dma_device_id": "system", 00:21:53.081 "dma_device_type": 1 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.081 
"dma_device_type": 2 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "system", 00:21:53.081 "dma_device_type": 1 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.081 "dma_device_type": 2 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "system", 00:21:53.081 "dma_device_type": 1 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.081 "dma_device_type": 2 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "system", 00:21:53.081 "dma_device_type": 1 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.081 "dma_device_type": 2 00:21:53.081 } 00:21:53.081 ], 00:21:53.081 "driver_specific": { 00:21:53.081 "raid": { 00:21:53.081 "uuid": "3d60b239-4881-476f-a373-ecd6df0b54f6", 00:21:53.081 "strip_size_kb": 64, 00:21:53.081 "state": "online", 00:21:53.081 "raid_level": "concat", 00:21:53.081 "superblock": false, 00:21:53.081 "num_base_bdevs": 4, 00:21:53.081 "num_base_bdevs_discovered": 4, 00:21:53.081 "num_base_bdevs_operational": 4, 00:21:53.081 "base_bdevs_list": [ 00:21:53.081 { 00:21:53.081 "name": "BaseBdev1", 00:21:53.081 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:53.081 "is_configured": true, 00:21:53.081 "data_offset": 0, 00:21:53.081 "data_size": 65536 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "name": "BaseBdev2", 00:21:53.081 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:53.081 "is_configured": true, 00:21:53.081 "data_offset": 0, 00:21:53.081 "data_size": 65536 00:21:53.081 }, 00:21:53.081 { 00:21:53.081 "name": "BaseBdev3", 00:21:53.081 "uuid": "970578d8-9d97-44a8-94c5-099b56e890ee", 00:21:53.082 "is_configured": true, 00:21:53.082 "data_offset": 0, 00:21:53.082 "data_size": 65536 00:21:53.082 }, 00:21:53.082 { 00:21:53.082 "name": "BaseBdev4", 00:21:53.082 "uuid": "3f57296b-322f-4767-8a21-1f09bb61ef11", 00:21:53.082 "is_configured": true, 00:21:53.082 "data_offset": 0, 00:21:53.082 "data_size": 65536 00:21:53.082 } 00:21:53.082 ] 00:21:53.082 } 00:21:53.082 } 00:21:53.082 }' 00:21:53.082 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:53.082 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:53.082 BaseBdev2 00:21:53.082 BaseBdev3 00:21:53.082 BaseBdev4' 00:21:53.082 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.082 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.339 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:53.339 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.339 "name": "BaseBdev1", 00:21:53.339 "aliases": [ 00:21:53.339 "57b8321a-f21e-4f42-b4b6-0a55d174dff6" 00:21:53.339 ], 00:21:53.339 "product_name": "Malloc disk", 00:21:53.339 "block_size": 512, 00:21:53.339 "num_blocks": 65536, 00:21:53.339 "uuid": "57b8321a-f21e-4f42-b4b6-0a55d174dff6", 00:21:53.339 "assigned_rate_limits": { 00:21:53.339 "rw_ios_per_sec": 0, 00:21:53.339 "rw_mbytes_per_sec": 0, 00:21:53.339 "r_mbytes_per_sec": 0, 00:21:53.339 "w_mbytes_per_sec": 0 00:21:53.339 }, 00:21:53.339 "claimed": true, 00:21:53.339 "claim_type": "exclusive_write", 
00:21:53.339 "zoned": false, 00:21:53.339 "supported_io_types": { 00:21:53.339 "read": true, 00:21:53.339 "write": true, 00:21:53.339 "unmap": true, 00:21:53.339 "flush": true, 00:21:53.339 "reset": true, 00:21:53.339 "nvme_admin": false, 00:21:53.339 "nvme_io": false, 00:21:53.339 "nvme_io_md": false, 00:21:53.339 "write_zeroes": true, 00:21:53.339 "zcopy": true, 00:21:53.339 "get_zone_info": false, 00:21:53.339 "zone_management": false, 00:21:53.339 "zone_append": false, 00:21:53.339 "compare": false, 00:21:53.339 "compare_and_write": false, 00:21:53.339 "abort": true, 00:21:53.339 "seek_hole": false, 00:21:53.339 "seek_data": false, 00:21:53.339 "copy": true, 00:21:53.340 "nvme_iov_md": false 00:21:53.340 }, 00:21:53.340 "memory_domains": [ 00:21:53.340 { 00:21:53.340 "dma_device_id": "system", 00:21:53.340 "dma_device_type": 1 00:21:53.340 }, 00:21:53.340 { 00:21:53.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.340 "dma_device_type": 2 00:21:53.340 } 00:21:53.340 ], 00:21:53.340 "driver_specific": {} 00:21:53.340 }' 00:21:53.340 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.597 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.597 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.597 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.597 06:38:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.597 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.597 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.597 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.597 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.597 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.854 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.854 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.854 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.855 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:53.855 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.112 "name": "BaseBdev2", 00:21:54.112 "aliases": [ 00:21:54.112 "c51da5e7-5d09-43e7-b587-6b6304c1370a" 00:21:54.112 ], 00:21:54.112 "product_name": "Malloc disk", 00:21:54.112 "block_size": 512, 00:21:54.112 "num_blocks": 65536, 00:21:54.112 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:54.112 "assigned_rate_limits": { 00:21:54.112 "rw_ios_per_sec": 0, 00:21:54.112 "rw_mbytes_per_sec": 0, 00:21:54.112 "r_mbytes_per_sec": 0, 00:21:54.112 "w_mbytes_per_sec": 0 00:21:54.112 }, 00:21:54.112 "claimed": true, 00:21:54.112 "claim_type": "exclusive_write", 00:21:54.112 "zoned": false, 00:21:54.112 "supported_io_types": { 00:21:54.112 "read": true, 00:21:54.112 "write": true, 00:21:54.112 "unmap": true, 00:21:54.112 "flush": true, 
00:21:54.112 "reset": true, 00:21:54.112 "nvme_admin": false, 00:21:54.112 "nvme_io": false, 00:21:54.112 "nvme_io_md": false, 00:21:54.112 "write_zeroes": true, 00:21:54.112 "zcopy": true, 00:21:54.112 "get_zone_info": false, 00:21:54.112 "zone_management": false, 00:21:54.112 "zone_append": false, 00:21:54.112 "compare": false, 00:21:54.112 "compare_and_write": false, 00:21:54.112 "abort": true, 00:21:54.112 "seek_hole": false, 00:21:54.112 "seek_data": false, 00:21:54.112 "copy": true, 00:21:54.112 "nvme_iov_md": false 00:21:54.112 }, 00:21:54.112 "memory_domains": [ 00:21:54.112 { 00:21:54.112 "dma_device_id": "system", 00:21:54.112 "dma_device_type": 1 00:21:54.112 }, 00:21:54.112 { 00:21:54.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.112 "dma_device_type": 2 00:21:54.112 } 00:21:54.112 ], 00:21:54.112 "driver_specific": {} 00:21:54.112 }' 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.112 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:54.370 06:38:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.627 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.627 "name": "BaseBdev3", 00:21:54.627 "aliases": [ 00:21:54.627 "970578d8-9d97-44a8-94c5-099b56e890ee" 00:21:54.627 ], 00:21:54.627 "product_name": "Malloc disk", 00:21:54.627 "block_size": 512, 00:21:54.627 "num_blocks": 65536, 00:21:54.627 "uuid": "970578d8-9d97-44a8-94c5-099b56e890ee", 00:21:54.627 "assigned_rate_limits": { 00:21:54.627 "rw_ios_per_sec": 0, 00:21:54.627 "rw_mbytes_per_sec": 0, 00:21:54.627 "r_mbytes_per_sec": 0, 00:21:54.627 "w_mbytes_per_sec": 0 00:21:54.627 }, 00:21:54.627 "claimed": true, 00:21:54.627 "claim_type": "exclusive_write", 00:21:54.627 "zoned": false, 00:21:54.627 "supported_io_types": { 00:21:54.627 "read": true, 00:21:54.627 "write": true, 00:21:54.627 "unmap": true, 00:21:54.627 "flush": true, 00:21:54.627 "reset": true, 00:21:54.627 "nvme_admin": false, 00:21:54.627 "nvme_io": false, 00:21:54.627 "nvme_io_md": false, 00:21:54.627 "write_zeroes": true, 00:21:54.627 
"zcopy": true, 00:21:54.627 "get_zone_info": false, 00:21:54.627 "zone_management": false, 00:21:54.627 "zone_append": false, 00:21:54.627 "compare": false, 00:21:54.627 "compare_and_write": false, 00:21:54.627 "abort": true, 00:21:54.627 "seek_hole": false, 00:21:54.627 "seek_data": false, 00:21:54.627 "copy": true, 00:21:54.627 "nvme_iov_md": false 00:21:54.627 }, 00:21:54.627 "memory_domains": [ 00:21:54.627 { 00:21:54.627 "dma_device_id": "system", 00:21:54.627 "dma_device_type": 1 00:21:54.627 }, 00:21:54.627 { 00:21:54.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.627 "dma_device_type": 2 00:21:54.627 } 00:21:54.627 ], 00:21:54.627 "driver_specific": {} 00:21:54.627 }' 00:21:54.627 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.627 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.627 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.627 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.627 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.628 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.628 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:54.885 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:55.143 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:55.143 "name": "BaseBdev4", 00:21:55.143 "aliases": [ 00:21:55.143 "3f57296b-322f-4767-8a21-1f09bb61ef11" 00:21:55.143 ], 00:21:55.143 "product_name": "Malloc disk", 00:21:55.143 "block_size": 512, 00:21:55.143 "num_blocks": 65536, 00:21:55.143 "uuid": "3f57296b-322f-4767-8a21-1f09bb61ef11", 00:21:55.143 "assigned_rate_limits": { 00:21:55.143 "rw_ios_per_sec": 0, 00:21:55.143 "rw_mbytes_per_sec": 0, 00:21:55.143 "r_mbytes_per_sec": 0, 00:21:55.143 "w_mbytes_per_sec": 0 00:21:55.143 }, 00:21:55.143 "claimed": true, 00:21:55.143 "claim_type": "exclusive_write", 00:21:55.143 "zoned": false, 00:21:55.143 "supported_io_types": { 00:21:55.143 "read": true, 00:21:55.143 "write": true, 00:21:55.143 "unmap": true, 00:21:55.143 "flush": true, 00:21:55.143 "reset": true, 00:21:55.143 "nvme_admin": false, 00:21:55.143 "nvme_io": false, 00:21:55.143 "nvme_io_md": false, 00:21:55.143 "write_zeroes": true, 00:21:55.143 "zcopy": true, 00:21:55.143 "get_zone_info": false, 00:21:55.143 "zone_management": false, 00:21:55.143 "zone_append": false, 00:21:55.143 "compare": false, 00:21:55.143 
"compare_and_write": false, 00:21:55.143 "abort": true, 00:21:55.143 "seek_hole": false, 00:21:55.143 "seek_data": false, 00:21:55.143 "copy": true, 00:21:55.143 "nvme_iov_md": false 00:21:55.143 }, 00:21:55.143 "memory_domains": [ 00:21:55.143 { 00:21:55.143 "dma_device_id": "system", 00:21:55.143 "dma_device_type": 1 00:21:55.143 }, 00:21:55.143 { 00:21:55.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.143 "dma_device_type": 2 00:21:55.143 } 00:21:55.143 ], 00:21:55.143 "driver_specific": {} 00:21:55.143 }' 00:21:55.143 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.143 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.143 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:55.143 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.143 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.401 06:38:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:55.658 [2024-07-25 06:38:09.056210] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:55.658 [2024-07-25 06:38:09.056236] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:55.658 [2024-07-25 06:38:09.056282] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:55.658 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.915 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.915 "name": "Existed_Raid", 00:21:55.915 "uuid": "3d60b239-4881-476f-a373-ecd6df0b54f6", 00:21:55.915 "strip_size_kb": 64, 00:21:55.915 "state": "offline", 00:21:55.915 "raid_level": "concat", 00:21:55.915 "superblock": false, 00:21:55.915 "num_base_bdevs": 4, 00:21:55.915 "num_base_bdevs_discovered": 3, 00:21:55.916 "num_base_bdevs_operational": 3, 00:21:55.916 "base_bdevs_list": [ 00:21:55.916 { 00:21:55.916 "name": null, 00:21:55.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.916 "is_configured": false, 00:21:55.916 "data_offset": 0, 00:21:55.916 "data_size": 65536 00:21:55.916 }, 00:21:55.916 { 00:21:55.916 "name": "BaseBdev2", 00:21:55.916 "uuid": "c51da5e7-5d09-43e7-b587-6b6304c1370a", 00:21:55.916 "is_configured": true, 00:21:55.916 "data_offset": 0, 00:21:55.916 "data_size": 65536 00:21:55.916 }, 00:21:55.916 { 00:21:55.916 "name": "BaseBdev3", 00:21:55.916 "uuid": "970578d8-9d97-44a8-94c5-099b56e890ee", 00:21:55.916 "is_configured": true, 00:21:55.916 "data_offset": 0, 00:21:55.916 "data_size": 65536 00:21:55.916 }, 00:21:55.916 { 00:21:55.916 "name": "BaseBdev4", 00:21:55.916 "uuid": "3f57296b-322f-4767-8a21-1f09bb61ef11", 00:21:55.916 "is_configured": true, 00:21:55.916 "data_offset": 0, 00:21:55.916 "data_size": 65536 00:21:55.916 } 00:21:55.916 ] 00:21:55.916 }' 00:21:55.916 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.916 06:38:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.480 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:56.480 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:56.481 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.481 06:38:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:56.738 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:56.738 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:56.738 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:56.995 [2024-07-25 06:38:10.344585] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:56.995 06:38:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:56.995 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:56.995 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.995 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:57.252 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:57.252 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:57.252 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:57.520 [2024-07-25 06:38:10.811854] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:57.520 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:57.520 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:57.520 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.520 06:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:57.784 [2024-07-25 06:38:11.290773] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:57.784 [2024-07-25 06:38:11.290815] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2873250 name Existed_Raid, state offline 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.784 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:58.041 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:58.041 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:58.041 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:58.041 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:58.041 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:58.041 06:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:58.297 BaseBdev2 00:21:58.297 06:38:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:58.297 06:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:58.297 06:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:58.297 06:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:58.297 06:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:58.297 06:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:58.297 06:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:58.554 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:58.811 [ 00:21:58.811 { 00:21:58.811 "name": "BaseBdev2", 00:21:58.811 "aliases": [ 00:21:58.811 "3689ff29-b312-4efc-b784-59f73986b0d4" 00:21:58.811 ], 00:21:58.811 "product_name": "Malloc disk", 00:21:58.811 "block_size": 512, 00:21:58.811 "num_blocks": 65536, 00:21:58.811 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:21:58.811 "assigned_rate_limits": { 00:21:58.811 "rw_ios_per_sec": 0, 00:21:58.811 "rw_mbytes_per_sec": 0, 00:21:58.811 "r_mbytes_per_sec": 0, 00:21:58.811 "w_mbytes_per_sec": 0 00:21:58.811 }, 00:21:58.811 "claimed": false, 00:21:58.811 "zoned": false, 00:21:58.811 "supported_io_types": { 00:21:58.811 "read": true, 00:21:58.811 "write": true, 00:21:58.811 "unmap": true, 00:21:58.811 "flush": true, 00:21:58.811 "reset": true, 00:21:58.811 "nvme_admin": false, 00:21:58.811 "nvme_io": false, 00:21:58.811 "nvme_io_md": false, 00:21:58.811 "write_zeroes": true, 00:21:58.811 "zcopy": true, 00:21:58.811 "get_zone_info": false, 00:21:58.811 "zone_management": false, 00:21:58.811 "zone_append": false, 00:21:58.811 "compare": false, 00:21:58.811 "compare_and_write": false, 00:21:58.811 "abort": true, 00:21:58.811 "seek_hole": false, 00:21:58.811 "seek_data": false, 00:21:58.811 "copy": true, 00:21:58.811 "nvme_iov_md": false 00:21:58.811 }, 00:21:58.811 "memory_domains": [ 00:21:58.811 { 00:21:58.811 "dma_device_id": "system", 00:21:58.811 "dma_device_type": 1 00:21:58.811 }, 00:21:58.811 { 00:21:58.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.812 "dma_device_type": 2 00:21:58.812 } 00:21:58.812 ], 00:21:58.812 "driver_specific": {} 00:21:58.812 } 00:21:58.812 ] 00:21:58.812 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:58.812 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:58.812 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:58.812 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:59.068 BaseBdev3 00:21:59.068 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:59.069 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:21:59.069 06:38:12 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:59.069 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:59.069 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:59.069 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:59.069 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:59.326 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:59.326 [ 00:21:59.326 { 00:21:59.326 "name": "BaseBdev3", 00:21:59.326 "aliases": [ 00:21:59.326 "08e55f1c-e04c-4d88-b1c6-24a1f74a044d" 00:21:59.326 ], 00:21:59.326 "product_name": "Malloc disk", 00:21:59.326 "block_size": 512, 00:21:59.326 "num_blocks": 65536, 00:21:59.326 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:21:59.326 "assigned_rate_limits": { 00:21:59.326 "rw_ios_per_sec": 0, 00:21:59.326 "rw_mbytes_per_sec": 0, 00:21:59.326 "r_mbytes_per_sec": 0, 00:21:59.326 "w_mbytes_per_sec": 0 00:21:59.326 }, 00:21:59.326 "claimed": false, 00:21:59.326 "zoned": false, 00:21:59.326 "supported_io_types": { 00:21:59.326 "read": true, 00:21:59.326 "write": true, 00:21:59.326 "unmap": true, 00:21:59.326 "flush": true, 00:21:59.326 "reset": true, 00:21:59.326 "nvme_admin": false, 00:21:59.326 "nvme_io": false, 00:21:59.326 "nvme_io_md": false, 00:21:59.326 "write_zeroes": true, 00:21:59.326 "zcopy": true, 00:21:59.326 "get_zone_info": false, 00:21:59.326 "zone_management": false, 00:21:59.326 "zone_append": false, 00:21:59.326 "compare": false, 00:21:59.326 "compare_and_write": false, 00:21:59.326 "abort": true, 00:21:59.326 "seek_hole": false, 00:21:59.326 "seek_data": false, 00:21:59.326 "copy": true, 00:21:59.326 "nvme_iov_md": false 00:21:59.326 }, 00:21:59.326 "memory_domains": [ 00:21:59.326 { 00:21:59.326 "dma_device_id": "system", 00:21:59.326 "dma_device_type": 1 00:21:59.326 }, 00:21:59.326 { 00:21:59.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.326 "dma_device_type": 2 00:21:59.326 } 00:21:59.326 ], 00:21:59.326 "driver_specific": {} 00:21:59.326 } 00:21:59.326 ] 00:21:59.584 06:38:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:59.584 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:59.584 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:59.584 06:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:59.584 BaseBdev4 00:21:59.584 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:59.584 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:21:59.584 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:59.584 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:59.584 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:59.584 06:38:13 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:59.584 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:59.841 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:00.099 [ 00:22:00.099 { 00:22:00.099 "name": "BaseBdev4", 00:22:00.099 "aliases": [ 00:22:00.099 "d356b5de-9fab-44ff-b39d-7ce8fa7088c6" 00:22:00.099 ], 00:22:00.099 "product_name": "Malloc disk", 00:22:00.099 "block_size": 512, 00:22:00.099 "num_blocks": 65536, 00:22:00.099 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:00.099 "assigned_rate_limits": { 00:22:00.099 "rw_ios_per_sec": 0, 00:22:00.099 "rw_mbytes_per_sec": 0, 00:22:00.099 "r_mbytes_per_sec": 0, 00:22:00.099 "w_mbytes_per_sec": 0 00:22:00.099 }, 00:22:00.099 "claimed": false, 00:22:00.099 "zoned": false, 00:22:00.099 "supported_io_types": { 00:22:00.099 "read": true, 00:22:00.099 "write": true, 00:22:00.099 "unmap": true, 00:22:00.099 "flush": true, 00:22:00.099 "reset": true, 00:22:00.099 "nvme_admin": false, 00:22:00.099 "nvme_io": false, 00:22:00.099 "nvme_io_md": false, 00:22:00.099 "write_zeroes": true, 00:22:00.099 "zcopy": true, 00:22:00.099 "get_zone_info": false, 00:22:00.099 "zone_management": false, 00:22:00.099 "zone_append": false, 00:22:00.099 "compare": false, 00:22:00.099 "compare_and_write": false, 00:22:00.099 "abort": true, 00:22:00.099 "seek_hole": false, 00:22:00.099 "seek_data": false, 00:22:00.099 "copy": true, 00:22:00.099 "nvme_iov_md": false 00:22:00.099 }, 00:22:00.099 "memory_domains": [ 00:22:00.099 { 00:22:00.099 "dma_device_id": "system", 00:22:00.099 "dma_device_type": 1 00:22:00.099 }, 00:22:00.099 { 00:22:00.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.099 "dma_device_type": 2 00:22:00.099 } 00:22:00.099 ], 00:22:00.099 "driver_specific": {} 00:22:00.099 } 00:22:00.099 ] 00:22:00.099 06:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:00.099 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:00.099 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:00.099 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:00.356 [2024-07-25 06:38:13.784061] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:00.356 [2024-07-25 06:38:13.784102] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:00.356 [2024-07-25 06:38:13.784120] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:00.356 [2024-07-25 06:38:13.785334] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:00.356 [2024-07-25 06:38:13.785373] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:00.356 06:38:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.356 06:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.613 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.613 "name": "Existed_Raid", 00:22:00.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.613 "strip_size_kb": 64, 00:22:00.613 "state": "configuring", 00:22:00.613 "raid_level": "concat", 00:22:00.613 "superblock": false, 00:22:00.613 "num_base_bdevs": 4, 00:22:00.613 "num_base_bdevs_discovered": 3, 00:22:00.613 "num_base_bdevs_operational": 4, 00:22:00.613 "base_bdevs_list": [ 00:22:00.613 { 00:22:00.613 "name": "BaseBdev1", 00:22:00.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.613 "is_configured": false, 00:22:00.613 "data_offset": 0, 00:22:00.613 "data_size": 0 00:22:00.613 }, 00:22:00.613 { 00:22:00.613 "name": "BaseBdev2", 00:22:00.613 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:00.613 "is_configured": true, 00:22:00.613 "data_offset": 0, 00:22:00.613 "data_size": 65536 00:22:00.613 }, 00:22:00.613 { 00:22:00.613 "name": "BaseBdev3", 00:22:00.613 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:00.613 "is_configured": true, 00:22:00.613 "data_offset": 0, 00:22:00.613 "data_size": 65536 00:22:00.613 }, 00:22:00.613 { 00:22:00.613 "name": "BaseBdev4", 00:22:00.613 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:00.613 "is_configured": true, 00:22:00.613 "data_offset": 0, 00:22:00.613 "data_size": 65536 00:22:00.613 } 00:22:00.613 ] 00:22:00.613 }' 00:22:00.613 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.613 06:38:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.176 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:01.433 [2024-07-25 06:38:14.786662] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
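For reference, the array-creation steps traced above reduce to the short RPC sequence below. This is a minimal sketch, with RPC used only as shorthand for the rpc.py invocation seen throughout this trace, and with BaseBdev1 deliberately left uncreated, which is why the array reports the "configuring" state with 3 of 4 base bdevs discovered:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# create three of the four 32 MiB, 512-byte-block malloc base bdevs (BaseBdev1 intentionally missing)
$RPC bdev_malloc_create 32 512 -b BaseBdev2
$RPC bdev_malloc_create 32 512 -b BaseBdev3
$RPC bdev_malloc_create 32 512 -b BaseBdev4
# create the concat array over all four names with a 64 KiB strip size
$RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
# the array should stay in "configuring" until the missing base bdev shows up
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'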
00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.433 06:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.691 06:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.691 "name": "Existed_Raid", 00:22:01.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.691 "strip_size_kb": 64, 00:22:01.691 "state": "configuring", 00:22:01.691 "raid_level": "concat", 00:22:01.691 "superblock": false, 00:22:01.691 "num_base_bdevs": 4, 00:22:01.691 "num_base_bdevs_discovered": 2, 00:22:01.691 "num_base_bdevs_operational": 4, 00:22:01.691 "base_bdevs_list": [ 00:22:01.691 { 00:22:01.691 "name": "BaseBdev1", 00:22:01.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.691 "is_configured": false, 00:22:01.691 "data_offset": 0, 00:22:01.691 "data_size": 0 00:22:01.691 }, 00:22:01.691 { 00:22:01.691 "name": null, 00:22:01.691 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:01.691 "is_configured": false, 00:22:01.691 "data_offset": 0, 00:22:01.691 "data_size": 65536 00:22:01.691 }, 00:22:01.691 { 00:22:01.691 "name": "BaseBdev3", 00:22:01.691 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:01.691 "is_configured": true, 00:22:01.691 "data_offset": 0, 00:22:01.691 "data_size": 65536 00:22:01.691 }, 00:22:01.691 { 00:22:01.691 "name": "BaseBdev4", 00:22:01.691 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:01.691 "is_configured": true, 00:22:01.691 "data_offset": 0, 00:22:01.691 "data_size": 65536 00:22:01.691 } 00:22:01.691 ] 00:22:01.691 }' 00:22:01.691 06:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.691 06:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:02.256 06:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.256 06:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:02.514 06:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:02.514 06:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:02.514 [2024-07-25 
06:38:16.049263] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:02.514 BaseBdev1 00:22:02.514 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:02.514 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:02.514 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:02.514 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:02.515 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:02.515 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:02.515 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:02.773 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:03.032 [ 00:22:03.032 { 00:22:03.032 "name": "BaseBdev1", 00:22:03.032 "aliases": [ 00:22:03.032 "e4bb5616-6534-431c-b1cb-df3993ed4a25" 00:22:03.032 ], 00:22:03.032 "product_name": "Malloc disk", 00:22:03.032 "block_size": 512, 00:22:03.032 "num_blocks": 65536, 00:22:03.032 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:03.032 "assigned_rate_limits": { 00:22:03.032 "rw_ios_per_sec": 0, 00:22:03.032 "rw_mbytes_per_sec": 0, 00:22:03.032 "r_mbytes_per_sec": 0, 00:22:03.032 "w_mbytes_per_sec": 0 00:22:03.032 }, 00:22:03.032 "claimed": true, 00:22:03.032 "claim_type": "exclusive_write", 00:22:03.032 "zoned": false, 00:22:03.032 "supported_io_types": { 00:22:03.032 "read": true, 00:22:03.032 "write": true, 00:22:03.032 "unmap": true, 00:22:03.032 "flush": true, 00:22:03.032 "reset": true, 00:22:03.032 "nvme_admin": false, 00:22:03.032 "nvme_io": false, 00:22:03.032 "nvme_io_md": false, 00:22:03.032 "write_zeroes": true, 00:22:03.032 "zcopy": true, 00:22:03.032 "get_zone_info": false, 00:22:03.032 "zone_management": false, 00:22:03.032 "zone_append": false, 00:22:03.032 "compare": false, 00:22:03.032 "compare_and_write": false, 00:22:03.032 "abort": true, 00:22:03.032 "seek_hole": false, 00:22:03.032 "seek_data": false, 00:22:03.032 "copy": true, 00:22:03.032 "nvme_iov_md": false 00:22:03.032 }, 00:22:03.032 "memory_domains": [ 00:22:03.032 { 00:22:03.032 "dma_device_id": "system", 00:22:03.032 "dma_device_type": 1 00:22:03.032 }, 00:22:03.032 { 00:22:03.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.032 "dma_device_type": 2 00:22:03.032 } 00:22:03.032 ], 00:22:03.032 "driver_specific": {} 00:22:03.032 } 00:22:03.032 ] 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.032 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:03.291 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.291 "name": "Existed_Raid", 00:22:03.291 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.291 "strip_size_kb": 64, 00:22:03.291 "state": "configuring", 00:22:03.291 "raid_level": "concat", 00:22:03.291 "superblock": false, 00:22:03.291 "num_base_bdevs": 4, 00:22:03.291 "num_base_bdevs_discovered": 3, 00:22:03.291 "num_base_bdevs_operational": 4, 00:22:03.291 "base_bdevs_list": [ 00:22:03.291 { 00:22:03.291 "name": "BaseBdev1", 00:22:03.291 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:03.291 "is_configured": true, 00:22:03.291 "data_offset": 0, 00:22:03.291 "data_size": 65536 00:22:03.291 }, 00:22:03.291 { 00:22:03.291 "name": null, 00:22:03.291 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:03.291 "is_configured": false, 00:22:03.291 "data_offset": 0, 00:22:03.291 "data_size": 65536 00:22:03.291 }, 00:22:03.291 { 00:22:03.291 "name": "BaseBdev3", 00:22:03.291 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:03.291 "is_configured": true, 00:22:03.291 "data_offset": 0, 00:22:03.291 "data_size": 65536 00:22:03.291 }, 00:22:03.291 { 00:22:03.291 "name": "BaseBdev4", 00:22:03.291 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:03.291 "is_configured": true, 00:22:03.291 "data_offset": 0, 00:22:03.291 "data_size": 65536 00:22:03.291 } 00:22:03.291 ] 00:22:03.291 }' 00:22:03.291 06:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.291 06:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:03.858 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.858 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:04.116 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:04.116 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:04.375 [2024-07-25 06:38:17.765829] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.375 06:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:04.634 06:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.634 "name": "Existed_Raid", 00:22:04.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.634 "strip_size_kb": 64, 00:22:04.634 "state": "configuring", 00:22:04.634 "raid_level": "concat", 00:22:04.634 "superblock": false, 00:22:04.634 "num_base_bdevs": 4, 00:22:04.634 "num_base_bdevs_discovered": 2, 00:22:04.634 "num_base_bdevs_operational": 4, 00:22:04.634 "base_bdevs_list": [ 00:22:04.634 { 00:22:04.634 "name": "BaseBdev1", 00:22:04.634 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:04.634 "is_configured": true, 00:22:04.634 "data_offset": 0, 00:22:04.634 "data_size": 65536 00:22:04.634 }, 00:22:04.634 { 00:22:04.634 "name": null, 00:22:04.634 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:04.634 "is_configured": false, 00:22:04.634 "data_offset": 0, 00:22:04.634 "data_size": 65536 00:22:04.634 }, 00:22:04.634 { 00:22:04.634 "name": null, 00:22:04.634 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:04.634 "is_configured": false, 00:22:04.634 "data_offset": 0, 00:22:04.634 "data_size": 65536 00:22:04.634 }, 00:22:04.634 { 00:22:04.634 "name": "BaseBdev4", 00:22:04.634 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:04.634 "is_configured": true, 00:22:04.634 "data_offset": 0, 00:22:04.634 "data_size": 65536 00:22:04.634 } 00:22:04.634 ] 00:22:04.634 }' 00:22:04.634 06:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.634 06:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.200 06:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.200 06:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:05.459 06:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:05.459 06:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:05.459 [2024-07-25 06:38:19.001095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:05.717 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.718 "name": "Existed_Raid", 00:22:05.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.718 "strip_size_kb": 64, 00:22:05.718 "state": "configuring", 00:22:05.718 "raid_level": "concat", 00:22:05.718 "superblock": false, 00:22:05.718 "num_base_bdevs": 4, 00:22:05.718 "num_base_bdevs_discovered": 3, 00:22:05.718 "num_base_bdevs_operational": 4, 00:22:05.718 "base_bdevs_list": [ 00:22:05.718 { 00:22:05.718 "name": "BaseBdev1", 00:22:05.718 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:05.718 "is_configured": true, 00:22:05.718 "data_offset": 0, 00:22:05.718 "data_size": 65536 00:22:05.718 }, 00:22:05.718 { 00:22:05.718 "name": null, 00:22:05.718 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:05.718 "is_configured": false, 00:22:05.718 "data_offset": 0, 00:22:05.718 "data_size": 65536 00:22:05.718 }, 00:22:05.718 { 00:22:05.718 "name": "BaseBdev3", 00:22:05.718 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:05.718 "is_configured": true, 00:22:05.718 "data_offset": 0, 00:22:05.718 "data_size": 65536 00:22:05.718 }, 00:22:05.718 { 00:22:05.718 "name": "BaseBdev4", 00:22:05.718 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:05.718 "is_configured": true, 00:22:05.718 "data_offset": 0, 00:22:05.718 "data_size": 65536 00:22:05.718 } 00:22:05.718 ] 00:22:05.718 }' 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.718 06:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.292 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:06.292 06:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.584 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:06.584 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:06.842 [2024-07-25 06:38:20.272484] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.842 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:07.101 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.101 "name": "Existed_Raid", 00:22:07.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.101 "strip_size_kb": 64, 00:22:07.101 "state": "configuring", 00:22:07.101 "raid_level": "concat", 00:22:07.101 "superblock": false, 00:22:07.101 "num_base_bdevs": 4, 00:22:07.101 "num_base_bdevs_discovered": 2, 00:22:07.101 "num_base_bdevs_operational": 4, 00:22:07.101 "base_bdevs_list": [ 00:22:07.101 { 00:22:07.101 "name": null, 00:22:07.101 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:07.101 "is_configured": false, 00:22:07.101 "data_offset": 0, 00:22:07.101 "data_size": 65536 00:22:07.101 }, 00:22:07.101 { 00:22:07.101 "name": null, 00:22:07.101 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:07.101 "is_configured": false, 00:22:07.101 "data_offset": 0, 00:22:07.101 "data_size": 65536 00:22:07.101 }, 00:22:07.101 { 00:22:07.101 "name": "BaseBdev3", 00:22:07.101 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:07.101 "is_configured": true, 00:22:07.101 "data_offset": 0, 00:22:07.101 "data_size": 65536 00:22:07.101 }, 00:22:07.101 { 00:22:07.101 "name": "BaseBdev4", 00:22:07.101 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:07.101 "is_configured": true, 00:22:07.101 "data_offset": 0, 00:22:07.101 "data_size": 65536 00:22:07.101 } 00:22:07.101 ] 00:22:07.101 }' 00:22:07.101 06:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
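The detach and re-attach round trip exercised above boils down to two RPCs plus a jq check on the slot's is_configured flag. A minimal sketch reusing the socket, script path, and jq filters from this run, where index [2] is BaseBdev3's slot in base_bdevs_list:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# detach BaseBdev3; the array stays in "configuring" and the slot keeps its uuid but drops to unconfigured
$RPC bdev_raid_remove_base_bdev BaseBdev3
$RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect false
# re-attach the same bdev to the named array; the slot is claimed and configured again
$RPC bdev_raid_add_base_bdev Existed_Raid BaseBdev3
$RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect true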
00:22:07.101 06:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.668 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.668 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:07.926 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:07.926 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:07.926 [2024-07-25 06:38:21.469881] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.185 "name": "Existed_Raid", 00:22:08.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.185 "strip_size_kb": 64, 00:22:08.185 "state": "configuring", 00:22:08.185 "raid_level": "concat", 00:22:08.185 "superblock": false, 00:22:08.185 "num_base_bdevs": 4, 00:22:08.185 "num_base_bdevs_discovered": 3, 00:22:08.185 "num_base_bdevs_operational": 4, 00:22:08.185 "base_bdevs_list": [ 00:22:08.185 { 00:22:08.185 "name": null, 00:22:08.185 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:08.185 "is_configured": false, 00:22:08.185 "data_offset": 0, 00:22:08.185 "data_size": 65536 00:22:08.185 }, 00:22:08.185 { 00:22:08.185 "name": "BaseBdev2", 00:22:08.185 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:08.185 "is_configured": true, 00:22:08.185 "data_offset": 0, 00:22:08.185 "data_size": 65536 00:22:08.185 }, 00:22:08.185 { 00:22:08.185 "name": "BaseBdev3", 00:22:08.185 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:08.185 "is_configured": true, 00:22:08.185 "data_offset": 0, 00:22:08.185 "data_size": 65536 
00:22:08.185 }, 00:22:08.185 { 00:22:08.185 "name": "BaseBdev4", 00:22:08.185 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:08.185 "is_configured": true, 00:22:08.185 "data_offset": 0, 00:22:08.185 "data_size": 65536 00:22:08.185 } 00:22:08.185 ] 00:22:08.185 }' 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.185 06:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.751 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:08.751 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.008 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:09.008 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.008 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:09.266 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e4bb5616-6534-431c-b1cb-df3993ed4a25 00:22:09.525 [2024-07-25 06:38:22.940851] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:09.525 [2024-07-25 06:38:22.940885] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c1710 00:22:09.525 [2024-07-25 06:38:22.940893] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:22:09.525 [2024-07-25 06:38:22.941071] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2869620 00:22:09.525 [2024-07-25 06:38:22.941186] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c1710 00:22:09.525 [2024-07-25 06:38:22.941195] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26c1710 00:22:09.525 [2024-07-25 06:38:22.941341] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.525 NewBaseBdev 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:09.525 06:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:09.783 06:38:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:10.040 [ 00:22:10.040 { 00:22:10.040 
"name": "NewBaseBdev", 00:22:10.040 "aliases": [ 00:22:10.040 "e4bb5616-6534-431c-b1cb-df3993ed4a25" 00:22:10.040 ], 00:22:10.040 "product_name": "Malloc disk", 00:22:10.040 "block_size": 512, 00:22:10.040 "num_blocks": 65536, 00:22:10.040 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:10.040 "assigned_rate_limits": { 00:22:10.040 "rw_ios_per_sec": 0, 00:22:10.040 "rw_mbytes_per_sec": 0, 00:22:10.040 "r_mbytes_per_sec": 0, 00:22:10.040 "w_mbytes_per_sec": 0 00:22:10.040 }, 00:22:10.040 "claimed": true, 00:22:10.040 "claim_type": "exclusive_write", 00:22:10.040 "zoned": false, 00:22:10.040 "supported_io_types": { 00:22:10.040 "read": true, 00:22:10.040 "write": true, 00:22:10.040 "unmap": true, 00:22:10.040 "flush": true, 00:22:10.040 "reset": true, 00:22:10.040 "nvme_admin": false, 00:22:10.040 "nvme_io": false, 00:22:10.040 "nvme_io_md": false, 00:22:10.040 "write_zeroes": true, 00:22:10.040 "zcopy": true, 00:22:10.040 "get_zone_info": false, 00:22:10.040 "zone_management": false, 00:22:10.040 "zone_append": false, 00:22:10.040 "compare": false, 00:22:10.040 "compare_and_write": false, 00:22:10.040 "abort": true, 00:22:10.040 "seek_hole": false, 00:22:10.040 "seek_data": false, 00:22:10.040 "copy": true, 00:22:10.040 "nvme_iov_md": false 00:22:10.040 }, 00:22:10.040 "memory_domains": [ 00:22:10.040 { 00:22:10.040 "dma_device_id": "system", 00:22:10.040 "dma_device_type": 1 00:22:10.040 }, 00:22:10.040 { 00:22:10.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.040 "dma_device_type": 2 00:22:10.040 } 00:22:10.040 ], 00:22:10.040 "driver_specific": {} 00:22:10.040 } 00:22:10.040 ] 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:10.040 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.298 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.298 "name": "Existed_Raid", 00:22:10.298 "uuid": "3539a4af-5f2d-455e-a4d6-94cf72183ea2", 00:22:10.298 "strip_size_kb": 64, 00:22:10.298 "state": "online", 00:22:10.298 "raid_level": "concat", 00:22:10.298 "superblock": false, 
00:22:10.298 "num_base_bdevs": 4, 00:22:10.298 "num_base_bdevs_discovered": 4, 00:22:10.298 "num_base_bdevs_operational": 4, 00:22:10.298 "base_bdevs_list": [ 00:22:10.298 { 00:22:10.298 "name": "NewBaseBdev", 00:22:10.298 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:10.298 "is_configured": true, 00:22:10.298 "data_offset": 0, 00:22:10.298 "data_size": 65536 00:22:10.298 }, 00:22:10.298 { 00:22:10.298 "name": "BaseBdev2", 00:22:10.298 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:10.298 "is_configured": true, 00:22:10.298 "data_offset": 0, 00:22:10.298 "data_size": 65536 00:22:10.298 }, 00:22:10.298 { 00:22:10.298 "name": "BaseBdev3", 00:22:10.298 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:10.298 "is_configured": true, 00:22:10.298 "data_offset": 0, 00:22:10.298 "data_size": 65536 00:22:10.298 }, 00:22:10.298 { 00:22:10.298 "name": "BaseBdev4", 00:22:10.298 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:10.298 "is_configured": true, 00:22:10.298 "data_offset": 0, 00:22:10.298 "data_size": 65536 00:22:10.298 } 00:22:10.298 ] 00:22:10.298 }' 00:22:10.298 06:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.298 06:38:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:10.863 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:11.122 [2024-07-25 06:38:24.429433] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:11.122 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:11.122 "name": "Existed_Raid", 00:22:11.122 "aliases": [ 00:22:11.122 "3539a4af-5f2d-455e-a4d6-94cf72183ea2" 00:22:11.122 ], 00:22:11.122 "product_name": "Raid Volume", 00:22:11.122 "block_size": 512, 00:22:11.122 "num_blocks": 262144, 00:22:11.122 "uuid": "3539a4af-5f2d-455e-a4d6-94cf72183ea2", 00:22:11.122 "assigned_rate_limits": { 00:22:11.122 "rw_ios_per_sec": 0, 00:22:11.122 "rw_mbytes_per_sec": 0, 00:22:11.122 "r_mbytes_per_sec": 0, 00:22:11.122 "w_mbytes_per_sec": 0 00:22:11.122 }, 00:22:11.122 "claimed": false, 00:22:11.122 "zoned": false, 00:22:11.122 "supported_io_types": { 00:22:11.122 "read": true, 00:22:11.122 "write": true, 00:22:11.122 "unmap": true, 00:22:11.122 "flush": true, 00:22:11.122 "reset": true, 00:22:11.122 "nvme_admin": false, 00:22:11.122 "nvme_io": false, 00:22:11.122 "nvme_io_md": false, 00:22:11.122 "write_zeroes": true, 00:22:11.122 "zcopy": false, 00:22:11.122 "get_zone_info": false, 00:22:11.122 "zone_management": false, 00:22:11.122 "zone_append": false, 00:22:11.122 "compare": false, 00:22:11.122 
"compare_and_write": false, 00:22:11.122 "abort": false, 00:22:11.122 "seek_hole": false, 00:22:11.122 "seek_data": false, 00:22:11.122 "copy": false, 00:22:11.122 "nvme_iov_md": false 00:22:11.122 }, 00:22:11.122 "memory_domains": [ 00:22:11.122 { 00:22:11.122 "dma_device_id": "system", 00:22:11.122 "dma_device_type": 1 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.122 "dma_device_type": 2 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "system", 00:22:11.122 "dma_device_type": 1 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.122 "dma_device_type": 2 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "system", 00:22:11.122 "dma_device_type": 1 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.122 "dma_device_type": 2 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "system", 00:22:11.122 "dma_device_type": 1 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.122 "dma_device_type": 2 00:22:11.122 } 00:22:11.122 ], 00:22:11.122 "driver_specific": { 00:22:11.122 "raid": { 00:22:11.122 "uuid": "3539a4af-5f2d-455e-a4d6-94cf72183ea2", 00:22:11.122 "strip_size_kb": 64, 00:22:11.122 "state": "online", 00:22:11.122 "raid_level": "concat", 00:22:11.122 "superblock": false, 00:22:11.122 "num_base_bdevs": 4, 00:22:11.122 "num_base_bdevs_discovered": 4, 00:22:11.122 "num_base_bdevs_operational": 4, 00:22:11.122 "base_bdevs_list": [ 00:22:11.122 { 00:22:11.122 "name": "NewBaseBdev", 00:22:11.122 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:11.122 "is_configured": true, 00:22:11.122 "data_offset": 0, 00:22:11.122 "data_size": 65536 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "name": "BaseBdev2", 00:22:11.122 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:11.122 "is_configured": true, 00:22:11.122 "data_offset": 0, 00:22:11.122 "data_size": 65536 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "name": "BaseBdev3", 00:22:11.122 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:11.122 "is_configured": true, 00:22:11.122 "data_offset": 0, 00:22:11.122 "data_size": 65536 00:22:11.122 }, 00:22:11.122 { 00:22:11.122 "name": "BaseBdev4", 00:22:11.122 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:11.122 "is_configured": true, 00:22:11.122 "data_offset": 0, 00:22:11.122 "data_size": 65536 00:22:11.122 } 00:22:11.122 ] 00:22:11.122 } 00:22:11.122 } 00:22:11.122 }' 00:22:11.122 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:11.122 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:11.122 BaseBdev2 00:22:11.122 BaseBdev3 00:22:11.122 BaseBdev4' 00:22:11.122 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.122 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:11.122 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.380 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.380 "name": "NewBaseBdev", 00:22:11.380 "aliases": [ 00:22:11.380 "e4bb5616-6534-431c-b1cb-df3993ed4a25" 00:22:11.380 ], 00:22:11.381 
"product_name": "Malloc disk", 00:22:11.381 "block_size": 512, 00:22:11.381 "num_blocks": 65536, 00:22:11.381 "uuid": "e4bb5616-6534-431c-b1cb-df3993ed4a25", 00:22:11.381 "assigned_rate_limits": { 00:22:11.381 "rw_ios_per_sec": 0, 00:22:11.381 "rw_mbytes_per_sec": 0, 00:22:11.381 "r_mbytes_per_sec": 0, 00:22:11.381 "w_mbytes_per_sec": 0 00:22:11.381 }, 00:22:11.381 "claimed": true, 00:22:11.381 "claim_type": "exclusive_write", 00:22:11.381 "zoned": false, 00:22:11.381 "supported_io_types": { 00:22:11.381 "read": true, 00:22:11.381 "write": true, 00:22:11.381 "unmap": true, 00:22:11.381 "flush": true, 00:22:11.381 "reset": true, 00:22:11.381 "nvme_admin": false, 00:22:11.381 "nvme_io": false, 00:22:11.381 "nvme_io_md": false, 00:22:11.381 "write_zeroes": true, 00:22:11.381 "zcopy": true, 00:22:11.381 "get_zone_info": false, 00:22:11.381 "zone_management": false, 00:22:11.381 "zone_append": false, 00:22:11.381 "compare": false, 00:22:11.381 "compare_and_write": false, 00:22:11.381 "abort": true, 00:22:11.381 "seek_hole": false, 00:22:11.381 "seek_data": false, 00:22:11.381 "copy": true, 00:22:11.381 "nvme_iov_md": false 00:22:11.381 }, 00:22:11.381 "memory_domains": [ 00:22:11.381 { 00:22:11.381 "dma_device_id": "system", 00:22:11.381 "dma_device_type": 1 00:22:11.381 }, 00:22:11.381 { 00:22:11.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.381 "dma_device_type": 2 00:22:11.381 } 00:22:11.381 ], 00:22:11.381 "driver_specific": {} 00:22:11.381 }' 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.381 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.639 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:11.639 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.639 06:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.639 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:11.639 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.639 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.639 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.897 "name": "BaseBdev2", 00:22:11.897 "aliases": [ 00:22:11.897 "3689ff29-b312-4efc-b784-59f73986b0d4" 00:22:11.897 ], 00:22:11.897 "product_name": "Malloc disk", 00:22:11.897 "block_size": 512, 00:22:11.897 "num_blocks": 65536, 00:22:11.897 "uuid": "3689ff29-b312-4efc-b784-59f73986b0d4", 00:22:11.897 
"assigned_rate_limits": { 00:22:11.897 "rw_ios_per_sec": 0, 00:22:11.897 "rw_mbytes_per_sec": 0, 00:22:11.897 "r_mbytes_per_sec": 0, 00:22:11.897 "w_mbytes_per_sec": 0 00:22:11.897 }, 00:22:11.897 "claimed": true, 00:22:11.897 "claim_type": "exclusive_write", 00:22:11.897 "zoned": false, 00:22:11.897 "supported_io_types": { 00:22:11.897 "read": true, 00:22:11.897 "write": true, 00:22:11.897 "unmap": true, 00:22:11.897 "flush": true, 00:22:11.897 "reset": true, 00:22:11.897 "nvme_admin": false, 00:22:11.897 "nvme_io": false, 00:22:11.897 "nvme_io_md": false, 00:22:11.897 "write_zeroes": true, 00:22:11.897 "zcopy": true, 00:22:11.897 "get_zone_info": false, 00:22:11.897 "zone_management": false, 00:22:11.897 "zone_append": false, 00:22:11.897 "compare": false, 00:22:11.897 "compare_and_write": false, 00:22:11.897 "abort": true, 00:22:11.897 "seek_hole": false, 00:22:11.897 "seek_data": false, 00:22:11.897 "copy": true, 00:22:11.897 "nvme_iov_md": false 00:22:11.897 }, 00:22:11.897 "memory_domains": [ 00:22:11.897 { 00:22:11.897 "dma_device_id": "system", 00:22:11.897 "dma_device_type": 1 00:22:11.897 }, 00:22:11.897 { 00:22:11.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.897 "dma_device_type": 2 00:22:11.897 } 00:22:11.897 ], 00:22:11.897 "driver_specific": {} 00:22:11.897 }' 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.897 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:12.155 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.413 "name": "BaseBdev3", 00:22:12.413 "aliases": [ 00:22:12.413 "08e55f1c-e04c-4d88-b1c6-24a1f74a044d" 00:22:12.413 ], 00:22:12.413 "product_name": "Malloc disk", 00:22:12.413 "block_size": 512, 00:22:12.413 "num_blocks": 65536, 00:22:12.413 "uuid": "08e55f1c-e04c-4d88-b1c6-24a1f74a044d", 00:22:12.413 "assigned_rate_limits": { 00:22:12.413 "rw_ios_per_sec": 0, 00:22:12.413 "rw_mbytes_per_sec": 0, 00:22:12.413 "r_mbytes_per_sec": 0, 00:22:12.413 "w_mbytes_per_sec": 0 00:22:12.413 
}, 00:22:12.413 "claimed": true, 00:22:12.413 "claim_type": "exclusive_write", 00:22:12.413 "zoned": false, 00:22:12.413 "supported_io_types": { 00:22:12.413 "read": true, 00:22:12.413 "write": true, 00:22:12.413 "unmap": true, 00:22:12.413 "flush": true, 00:22:12.413 "reset": true, 00:22:12.413 "nvme_admin": false, 00:22:12.413 "nvme_io": false, 00:22:12.413 "nvme_io_md": false, 00:22:12.413 "write_zeroes": true, 00:22:12.413 "zcopy": true, 00:22:12.413 "get_zone_info": false, 00:22:12.413 "zone_management": false, 00:22:12.413 "zone_append": false, 00:22:12.413 "compare": false, 00:22:12.413 "compare_and_write": false, 00:22:12.413 "abort": true, 00:22:12.413 "seek_hole": false, 00:22:12.413 "seek_data": false, 00:22:12.413 "copy": true, 00:22:12.413 "nvme_iov_md": false 00:22:12.413 }, 00:22:12.413 "memory_domains": [ 00:22:12.413 { 00:22:12.413 "dma_device_id": "system", 00:22:12.413 "dma_device_type": 1 00:22:12.413 }, 00:22:12.413 { 00:22:12.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.413 "dma_device_type": 2 00:22:12.413 } 00:22:12.413 ], 00:22:12.413 "driver_specific": {} 00:22:12.413 }' 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.413 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.671 06:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:12.671 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.929 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.929 "name": "BaseBdev4", 00:22:12.929 "aliases": [ 00:22:12.929 "d356b5de-9fab-44ff-b39d-7ce8fa7088c6" 00:22:12.929 ], 00:22:12.929 "product_name": "Malloc disk", 00:22:12.929 "block_size": 512, 00:22:12.929 "num_blocks": 65536, 00:22:12.929 "uuid": "d356b5de-9fab-44ff-b39d-7ce8fa7088c6", 00:22:12.929 "assigned_rate_limits": { 00:22:12.929 "rw_ios_per_sec": 0, 00:22:12.929 "rw_mbytes_per_sec": 0, 00:22:12.929 "r_mbytes_per_sec": 0, 00:22:12.929 "w_mbytes_per_sec": 0 00:22:12.929 }, 00:22:12.929 "claimed": true, 00:22:12.929 "claim_type": "exclusive_write", 00:22:12.929 "zoned": false, 00:22:12.929 "supported_io_types": { 00:22:12.929 "read": true, 
00:22:12.929 "write": true, 00:22:12.929 "unmap": true, 00:22:12.929 "flush": true, 00:22:12.929 "reset": true, 00:22:12.929 "nvme_admin": false, 00:22:12.929 "nvme_io": false, 00:22:12.929 "nvme_io_md": false, 00:22:12.929 "write_zeroes": true, 00:22:12.929 "zcopy": true, 00:22:12.929 "get_zone_info": false, 00:22:12.929 "zone_management": false, 00:22:12.929 "zone_append": false, 00:22:12.929 "compare": false, 00:22:12.929 "compare_and_write": false, 00:22:12.929 "abort": true, 00:22:12.929 "seek_hole": false, 00:22:12.929 "seek_data": false, 00:22:12.929 "copy": true, 00:22:12.929 "nvme_iov_md": false 00:22:12.929 }, 00:22:12.929 "memory_domains": [ 00:22:12.929 { 00:22:12.929 "dma_device_id": "system", 00:22:12.929 "dma_device_type": 1 00:22:12.929 }, 00:22:12.929 { 00:22:12.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.929 "dma_device_type": 2 00:22:12.929 } 00:22:12.929 ], 00:22:12.929 "driver_specific": {} 00:22:12.929 }' 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.930 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.187 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.187 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:13.187 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.187 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.187 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.187 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:13.445 [2024-07-25 06:38:26.831463] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:13.445 [2024-07-25 06:38:26.831488] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:13.445 [2024-07-25 06:38:26.831532] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:13.445 [2024-07-25 06:38:26.831586] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:13.445 [2024-07-25 06:38:26.831597] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c1710 name Existed_Raid, state offline 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1184799 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1184799 ']' 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1184799 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:22:13.445 06:38:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1184799 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1184799' 00:22:13.445 killing process with pid 1184799 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1184799 00:22:13.445 [2024-07-25 06:38:26.921561] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:13.445 06:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1184799 00:22:13.445 [2024-07-25 06:38:26.953322] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:13.704 00:22:13.704 real 0m30.179s 00:22:13.704 user 0m55.341s 00:22:13.704 sys 0m5.429s 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:13.704 ************************************ 00:22:13.704 END TEST raid_state_function_test 00:22:13.704 ************************************ 00:22:13.704 06:38:27 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:22:13.704 06:38:27 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:22:13.704 06:38:27 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:13.704 06:38:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:13.704 ************************************ 00:22:13.704 START TEST raid_state_function_test_sb 00:22:13.704 ************************************ 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 true 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1190489 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1190489' 00:22:13.704 Process raid pid: 1190489 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1190489 /var/tmp/spdk-raid.sock 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1190489 ']' 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:13.704 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
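The trace above is the fixture bring-up for the superblock variant of the test: a standalone bdev_svc application is launched with its JSON-RPC server on /var/tmp/spdk-raid.sock, instance id 0 and bdev_raid debug logging, and the test then waits for that socket before issuing RPCs. A minimal manual reproduction of the same bring-up and of the first RPC the test sends (creating the concat raid with a superblock before any base bdev exists, which leaves it in the "configuring" state) could look like the sketch below; $SPDK_DIR stands in for the checkout path seen in the log, and rpc_get_methods is used here only as a liveness probe in place of the waitforlisten helper:

    # Start the bdev service with an isolated RPC socket and raid debug logs
    $SPDK_DIR/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    # Block until the RPC server answers (the test itself uses waitforlisten for this)
    until $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done
    # Register the raid set up front: 64 KiB strips (-z 64), superblock (-s), concat level
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid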
00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:13.704 06:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:13.962 [2024-07-25 06:38:27.292880] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:22:13.962 [2024-07-25 06:38:27.292943] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:13.962 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.962 EAL: Requested device 0000:3d:01.0 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.1 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.2 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.3 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.4 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.5 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.6 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:01.7 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.0 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.1 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.2 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.3 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.4 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.5 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.6 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3d:02.7 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.0 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.1 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.2 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.3 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 
0000:3f:01.4 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.5 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.6 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:01.7 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.0 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.1 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.2 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.3 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.4 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.5 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.6 cannot be used 00:22:13.963 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:13.963 EAL: Requested device 0000:3f:02.7 cannot be used 00:22:13.963 [2024-07-25 06:38:27.428593] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:13.963 [2024-07-25 06:38:27.473478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:14.221 [2024-07-25 06:38:27.527525] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:14.221 [2024-07-25 06:38:27.527551] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:14.786 06:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:14.786 06:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:22:14.786 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:15.044 [2024-07-25 06:38:28.407421] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:15.044 [2024-07-25 06:38:28.407456] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:15.044 [2024-07-25 06:38:28.407465] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:15.044 [2024-07-25 06:38:28.407476] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:15.044 [2024-07-25 06:38:28.407485] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:15.045 [2024-07-25 06:38:28.407494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:15.045 [2024-07-25 06:38:28.407502] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:15.045 [2024-07-25 06:38:28.407512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't 
exist now 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.045 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:15.303 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.303 "name": "Existed_Raid", 00:22:15.303 "uuid": "fe86df00-b214-4d2c-8813-98485c29fcfb", 00:22:15.303 "strip_size_kb": 64, 00:22:15.303 "state": "configuring", 00:22:15.303 "raid_level": "concat", 00:22:15.303 "superblock": true, 00:22:15.303 "num_base_bdevs": 4, 00:22:15.303 "num_base_bdevs_discovered": 0, 00:22:15.303 "num_base_bdevs_operational": 4, 00:22:15.303 "base_bdevs_list": [ 00:22:15.303 { 00:22:15.303 "name": "BaseBdev1", 00:22:15.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.303 "is_configured": false, 00:22:15.303 "data_offset": 0, 00:22:15.303 "data_size": 0 00:22:15.303 }, 00:22:15.303 { 00:22:15.303 "name": "BaseBdev2", 00:22:15.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.303 "is_configured": false, 00:22:15.303 "data_offset": 0, 00:22:15.303 "data_size": 0 00:22:15.303 }, 00:22:15.303 { 00:22:15.303 "name": "BaseBdev3", 00:22:15.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.303 "is_configured": false, 00:22:15.303 "data_offset": 0, 00:22:15.303 "data_size": 0 00:22:15.303 }, 00:22:15.303 { 00:22:15.304 "name": "BaseBdev4", 00:22:15.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.304 "is_configured": false, 00:22:15.304 "data_offset": 0, 00:22:15.304 "data_size": 0 00:22:15.304 } 00:22:15.304 ] 00:22:15.304 }' 00:22:15.304 06:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.304 06:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.870 06:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:15.870 [2024-07-25 06:38:29.413986] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:15.870 [2024-07-25 06:38:29.414013] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b33470 name Existed_Raid, state configuring 00:22:16.128 06:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:16.128 [2024-07-25 06:38:29.642611] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:16.128 [2024-07-25 06:38:29.642638] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:16.128 [2024-07-25 06:38:29.642647] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:16.128 [2024-07-25 06:38:29.642658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:16.128 [2024-07-25 06:38:29.642666] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:16.128 [2024-07-25 06:38:29.642676] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:16.128 [2024-07-25 06:38:29.642684] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:16.128 [2024-07-25 06:38:29.642694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:16.128 06:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:16.386 [2024-07-25 06:38:29.880676] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:16.386 BaseBdev1 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:16.386 06:38:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:16.645 06:38:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:16.903 [ 00:22:16.903 { 00:22:16.903 "name": "BaseBdev1", 00:22:16.903 "aliases": [ 00:22:16.903 "c7ae05ac-9223-49c2-853e-c11537cf82c9" 00:22:16.903 ], 00:22:16.903 "product_name": "Malloc disk", 00:22:16.903 "block_size": 512, 00:22:16.903 "num_blocks": 65536, 00:22:16.903 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:16.903 "assigned_rate_limits": { 00:22:16.903 "rw_ios_per_sec": 0, 00:22:16.903 "rw_mbytes_per_sec": 0, 00:22:16.903 "r_mbytes_per_sec": 0, 00:22:16.903 "w_mbytes_per_sec": 0 00:22:16.903 }, 00:22:16.903 "claimed": true, 00:22:16.903 "claim_type": "exclusive_write", 00:22:16.903 "zoned": false, 00:22:16.903 
"supported_io_types": { 00:22:16.903 "read": true, 00:22:16.903 "write": true, 00:22:16.903 "unmap": true, 00:22:16.903 "flush": true, 00:22:16.903 "reset": true, 00:22:16.903 "nvme_admin": false, 00:22:16.903 "nvme_io": false, 00:22:16.903 "nvme_io_md": false, 00:22:16.903 "write_zeroes": true, 00:22:16.903 "zcopy": true, 00:22:16.903 "get_zone_info": false, 00:22:16.903 "zone_management": false, 00:22:16.903 "zone_append": false, 00:22:16.903 "compare": false, 00:22:16.903 "compare_and_write": false, 00:22:16.903 "abort": true, 00:22:16.903 "seek_hole": false, 00:22:16.903 "seek_data": false, 00:22:16.903 "copy": true, 00:22:16.903 "nvme_iov_md": false 00:22:16.903 }, 00:22:16.903 "memory_domains": [ 00:22:16.903 { 00:22:16.903 "dma_device_id": "system", 00:22:16.903 "dma_device_type": 1 00:22:16.903 }, 00:22:16.903 { 00:22:16.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.903 "dma_device_type": 2 00:22:16.903 } 00:22:16.903 ], 00:22:16.903 "driver_specific": {} 00:22:16.903 } 00:22:16.903 ] 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.903 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:17.162 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.162 "name": "Existed_Raid", 00:22:17.162 "uuid": "b3dbf9a2-6831-4d26-9bd1-e9cfcea06e47", 00:22:17.162 "strip_size_kb": 64, 00:22:17.162 "state": "configuring", 00:22:17.162 "raid_level": "concat", 00:22:17.162 "superblock": true, 00:22:17.162 "num_base_bdevs": 4, 00:22:17.162 "num_base_bdevs_discovered": 1, 00:22:17.162 "num_base_bdevs_operational": 4, 00:22:17.162 "base_bdevs_list": [ 00:22:17.162 { 00:22:17.162 "name": "BaseBdev1", 00:22:17.162 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:17.162 "is_configured": true, 00:22:17.162 "data_offset": 2048, 00:22:17.162 "data_size": 63488 00:22:17.162 }, 00:22:17.162 { 00:22:17.162 "name": "BaseBdev2", 00:22:17.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.162 "is_configured": false, 00:22:17.162 
"data_offset": 0, 00:22:17.162 "data_size": 0 00:22:17.162 }, 00:22:17.162 { 00:22:17.162 "name": "BaseBdev3", 00:22:17.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.162 "is_configured": false, 00:22:17.162 "data_offset": 0, 00:22:17.162 "data_size": 0 00:22:17.162 }, 00:22:17.162 { 00:22:17.162 "name": "BaseBdev4", 00:22:17.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.162 "is_configured": false, 00:22:17.162 "data_offset": 0, 00:22:17.162 "data_size": 0 00:22:17.162 } 00:22:17.162 ] 00:22:17.162 }' 00:22:17.162 06:38:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.162 06:38:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:17.728 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:17.987 [2024-07-25 06:38:31.300424] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:17.987 [2024-07-25 06:38:31.300461] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b32ce0 name Existed_Raid, state configuring 00:22:17.987 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:17.987 [2024-07-25 06:38:31.525057] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:17.987 [2024-07-25 06:38:31.526442] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:17.987 [2024-07-25 06:38:31.526475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:17.987 [2024-07-25 06:38:31.526485] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:17.987 [2024-07-25 06:38:31.526495] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:17.987 [2024-07-25 06:38:31.526503] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:17.987 [2024-07-25 06:38:31.526514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:17.987 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:17.987 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.245 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.246 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.246 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.246 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:18.246 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.246 "name": "Existed_Raid", 00:22:18.246 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:18.246 "strip_size_kb": 64, 00:22:18.246 "state": "configuring", 00:22:18.246 "raid_level": "concat", 00:22:18.246 "superblock": true, 00:22:18.246 "num_base_bdevs": 4, 00:22:18.246 "num_base_bdevs_discovered": 1, 00:22:18.246 "num_base_bdevs_operational": 4, 00:22:18.246 "base_bdevs_list": [ 00:22:18.246 { 00:22:18.246 "name": "BaseBdev1", 00:22:18.246 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:18.246 "is_configured": true, 00:22:18.246 "data_offset": 2048, 00:22:18.246 "data_size": 63488 00:22:18.246 }, 00:22:18.246 { 00:22:18.246 "name": "BaseBdev2", 00:22:18.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.246 "is_configured": false, 00:22:18.246 "data_offset": 0, 00:22:18.246 "data_size": 0 00:22:18.246 }, 00:22:18.246 { 00:22:18.246 "name": "BaseBdev3", 00:22:18.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.246 "is_configured": false, 00:22:18.246 "data_offset": 0, 00:22:18.246 "data_size": 0 00:22:18.246 }, 00:22:18.246 { 00:22:18.246 "name": "BaseBdev4", 00:22:18.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.246 "is_configured": false, 00:22:18.246 "data_offset": 0, 00:22:18.246 "data_size": 0 00:22:18.246 } 00:22:18.246 ] 00:22:18.246 }' 00:22:18.246 06:38:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.246 06:38:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:18.813 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:19.071 [2024-07-25 06:38:32.486675] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:19.071 BaseBdev2 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:19.071 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:19.329 06:38:32 
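Each base bdev is a 32 MiB Malloc disk with a 512-byte block size, which is exactly the 65536 num_blocks shown in the bdev dumps. The create-and-wait sequence the trace is running for BaseBdev2 reduces to the three RPCs below (sketch only, same socket as above):

    # Create the malloc backing device: 32 MiB total size, 512-byte blocks
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
    # Let examine callbacks run, then confirm the bdev is registered (2000 ms timeout)
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000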
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:19.587 [ 00:22:19.587 { 00:22:19.587 "name": "BaseBdev2", 00:22:19.587 "aliases": [ 00:22:19.587 "b2604905-974c-470c-827a-8167b47824aa" 00:22:19.587 ], 00:22:19.587 "product_name": "Malloc disk", 00:22:19.587 "block_size": 512, 00:22:19.587 "num_blocks": 65536, 00:22:19.587 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:19.587 "assigned_rate_limits": { 00:22:19.587 "rw_ios_per_sec": 0, 00:22:19.587 "rw_mbytes_per_sec": 0, 00:22:19.587 "r_mbytes_per_sec": 0, 00:22:19.587 "w_mbytes_per_sec": 0 00:22:19.587 }, 00:22:19.587 "claimed": true, 00:22:19.587 "claim_type": "exclusive_write", 00:22:19.587 "zoned": false, 00:22:19.587 "supported_io_types": { 00:22:19.587 "read": true, 00:22:19.587 "write": true, 00:22:19.587 "unmap": true, 00:22:19.587 "flush": true, 00:22:19.587 "reset": true, 00:22:19.587 "nvme_admin": false, 00:22:19.587 "nvme_io": false, 00:22:19.587 "nvme_io_md": false, 00:22:19.587 "write_zeroes": true, 00:22:19.587 "zcopy": true, 00:22:19.587 "get_zone_info": false, 00:22:19.587 "zone_management": false, 00:22:19.587 "zone_append": false, 00:22:19.587 "compare": false, 00:22:19.587 "compare_and_write": false, 00:22:19.587 "abort": true, 00:22:19.587 "seek_hole": false, 00:22:19.587 "seek_data": false, 00:22:19.587 "copy": true, 00:22:19.587 "nvme_iov_md": false 00:22:19.587 }, 00:22:19.587 "memory_domains": [ 00:22:19.587 { 00:22:19.588 "dma_device_id": "system", 00:22:19.588 "dma_device_type": 1 00:22:19.588 }, 00:22:19.588 { 00:22:19.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:19.588 "dma_device_type": 2 00:22:19.588 } 00:22:19.588 ], 00:22:19.588 "driver_specific": {} 00:22:19.588 } 00:22:19.588 ] 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.588 06:38:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.588 06:38:32 
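The same pattern repeats for the remaining members: after BaseBdev2 the raid reports 2 of 4 base bdevs discovered and stays in "configuring"; only when BaseBdev4 is claimed does it switch to "online" (the later trace shows the raid registering with blockcnt 253952, i.e. four members contributing 63488 data blocks each after the 2048-block superblock offset). A compact rendering of the loop the test is unrolling is sketched below; expect_state is an illustrative helper, not part of bdev_raid.sh:

    expect_state() {  # illustrative helper: assert raid state and discovered count
        local want=$1 discovered=$2
        $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
            | jq -e --arg s "$want" --argjson d "$discovered" \
                '.[] | select(.name == "Existed_Raid")
                 | (.state == $s and .num_base_bdevs_discovered == $d)' >/dev/null
    }
    for i in 3 4; do
        $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev$i
    done
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    expect_state online 4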
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.846 06:38:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.846 "name": "Existed_Raid", 00:22:19.846 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:19.846 "strip_size_kb": 64, 00:22:19.846 "state": "configuring", 00:22:19.846 "raid_level": "concat", 00:22:19.846 "superblock": true, 00:22:19.846 "num_base_bdevs": 4, 00:22:19.846 "num_base_bdevs_discovered": 2, 00:22:19.846 "num_base_bdevs_operational": 4, 00:22:19.846 "base_bdevs_list": [ 00:22:19.846 { 00:22:19.846 "name": "BaseBdev1", 00:22:19.846 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:19.846 "is_configured": true, 00:22:19.846 "data_offset": 2048, 00:22:19.846 "data_size": 63488 00:22:19.846 }, 00:22:19.846 { 00:22:19.846 "name": "BaseBdev2", 00:22:19.846 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:19.846 "is_configured": true, 00:22:19.846 "data_offset": 2048, 00:22:19.846 "data_size": 63488 00:22:19.846 }, 00:22:19.846 { 00:22:19.846 "name": "BaseBdev3", 00:22:19.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.846 "is_configured": false, 00:22:19.846 "data_offset": 0, 00:22:19.846 "data_size": 0 00:22:19.846 }, 00:22:19.846 { 00:22:19.846 "name": "BaseBdev4", 00:22:19.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.846 "is_configured": false, 00:22:19.846 "data_offset": 0, 00:22:19.846 "data_size": 0 00:22:19.846 } 00:22:19.846 ] 00:22:19.846 }' 00:22:19.846 06:38:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.846 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:20.436 [2024-07-25 06:38:33.941931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:20.436 BaseBdev3 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:20.436 06:38:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:20.722 06:38:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:20.980 [ 00:22:20.980 { 00:22:20.980 "name": "BaseBdev3", 00:22:20.980 "aliases": [ 00:22:20.980 "aea24b47-2b5c-4c38-b524-dad2033447ec" 00:22:20.980 ], 00:22:20.980 "product_name": "Malloc disk", 00:22:20.980 "block_size": 512, 
00:22:20.980 "num_blocks": 65536, 00:22:20.980 "uuid": "aea24b47-2b5c-4c38-b524-dad2033447ec", 00:22:20.980 "assigned_rate_limits": { 00:22:20.980 "rw_ios_per_sec": 0, 00:22:20.980 "rw_mbytes_per_sec": 0, 00:22:20.980 "r_mbytes_per_sec": 0, 00:22:20.980 "w_mbytes_per_sec": 0 00:22:20.980 }, 00:22:20.980 "claimed": true, 00:22:20.980 "claim_type": "exclusive_write", 00:22:20.980 "zoned": false, 00:22:20.980 "supported_io_types": { 00:22:20.980 "read": true, 00:22:20.980 "write": true, 00:22:20.980 "unmap": true, 00:22:20.980 "flush": true, 00:22:20.980 "reset": true, 00:22:20.980 "nvme_admin": false, 00:22:20.980 "nvme_io": false, 00:22:20.980 "nvme_io_md": false, 00:22:20.980 "write_zeroes": true, 00:22:20.980 "zcopy": true, 00:22:20.980 "get_zone_info": false, 00:22:20.980 "zone_management": false, 00:22:20.980 "zone_append": false, 00:22:20.980 "compare": false, 00:22:20.980 "compare_and_write": false, 00:22:20.980 "abort": true, 00:22:20.980 "seek_hole": false, 00:22:20.980 "seek_data": false, 00:22:20.980 "copy": true, 00:22:20.980 "nvme_iov_md": false 00:22:20.980 }, 00:22:20.980 "memory_domains": [ 00:22:20.980 { 00:22:20.980 "dma_device_id": "system", 00:22:20.980 "dma_device_type": 1 00:22:20.980 }, 00:22:20.980 { 00:22:20.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.980 "dma_device_type": 2 00:22:20.980 } 00:22:20.980 ], 00:22:20.980 "driver_specific": {} 00:22:20.980 } 00:22:20.980 ] 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.980 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.238 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.238 "name": "Existed_Raid", 00:22:21.238 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:21.238 "strip_size_kb": 64, 00:22:21.238 "state": 
"configuring", 00:22:21.238 "raid_level": "concat", 00:22:21.238 "superblock": true, 00:22:21.238 "num_base_bdevs": 4, 00:22:21.238 "num_base_bdevs_discovered": 3, 00:22:21.238 "num_base_bdevs_operational": 4, 00:22:21.238 "base_bdevs_list": [ 00:22:21.238 { 00:22:21.238 "name": "BaseBdev1", 00:22:21.238 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:21.238 "is_configured": true, 00:22:21.238 "data_offset": 2048, 00:22:21.238 "data_size": 63488 00:22:21.238 }, 00:22:21.238 { 00:22:21.238 "name": "BaseBdev2", 00:22:21.238 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:21.238 "is_configured": true, 00:22:21.238 "data_offset": 2048, 00:22:21.238 "data_size": 63488 00:22:21.238 }, 00:22:21.238 { 00:22:21.238 "name": "BaseBdev3", 00:22:21.238 "uuid": "aea24b47-2b5c-4c38-b524-dad2033447ec", 00:22:21.238 "is_configured": true, 00:22:21.238 "data_offset": 2048, 00:22:21.238 "data_size": 63488 00:22:21.238 }, 00:22:21.238 { 00:22:21.238 "name": "BaseBdev4", 00:22:21.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.238 "is_configured": false, 00:22:21.238 "data_offset": 0, 00:22:21.238 "data_size": 0 00:22:21.238 } 00:22:21.238 ] 00:22:21.238 }' 00:22:21.238 06:38:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.238 06:38:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:21.804 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:21.804 [2024-07-25 06:38:35.348947] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:21.804 [2024-07-25 06:38:35.349107] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce6250 00:22:21.804 [2024-07-25 06:38:35.349119] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:21.804 [2024-07-25 06:38:35.349305] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b32030 00:22:21.804 [2024-07-25 06:38:35.349430] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce6250 00:22:21.804 [2024-07-25 06:38:35.349439] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ce6250 00:22:21.804 [2024-07-25 06:38:35.349526] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.804 BaseBdev4 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:22.062 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:22.321 [ 00:22:22.321 { 00:22:22.321 "name": "BaseBdev4", 00:22:22.321 "aliases": [ 00:22:22.321 "51ce7b0f-9b64-47e3-b1f8-c748d744d423" 00:22:22.321 ], 00:22:22.321 "product_name": "Malloc disk", 00:22:22.321 "block_size": 512, 00:22:22.321 "num_blocks": 65536, 00:22:22.321 "uuid": "51ce7b0f-9b64-47e3-b1f8-c748d744d423", 00:22:22.321 "assigned_rate_limits": { 00:22:22.321 "rw_ios_per_sec": 0, 00:22:22.321 "rw_mbytes_per_sec": 0, 00:22:22.321 "r_mbytes_per_sec": 0, 00:22:22.321 "w_mbytes_per_sec": 0 00:22:22.321 }, 00:22:22.321 "claimed": true, 00:22:22.321 "claim_type": "exclusive_write", 00:22:22.321 "zoned": false, 00:22:22.321 "supported_io_types": { 00:22:22.321 "read": true, 00:22:22.321 "write": true, 00:22:22.321 "unmap": true, 00:22:22.321 "flush": true, 00:22:22.321 "reset": true, 00:22:22.321 "nvme_admin": false, 00:22:22.321 "nvme_io": false, 00:22:22.321 "nvme_io_md": false, 00:22:22.321 "write_zeroes": true, 00:22:22.321 "zcopy": true, 00:22:22.321 "get_zone_info": false, 00:22:22.321 "zone_management": false, 00:22:22.321 "zone_append": false, 00:22:22.321 "compare": false, 00:22:22.321 "compare_and_write": false, 00:22:22.321 "abort": true, 00:22:22.321 "seek_hole": false, 00:22:22.321 "seek_data": false, 00:22:22.321 "copy": true, 00:22:22.321 "nvme_iov_md": false 00:22:22.321 }, 00:22:22.321 "memory_domains": [ 00:22:22.321 { 00:22:22.321 "dma_device_id": "system", 00:22:22.321 "dma_device_type": 1 00:22:22.321 }, 00:22:22.321 { 00:22:22.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.321 "dma_device_type": 2 00:22:22.321 } 00:22:22.321 ], 00:22:22.321 "driver_specific": {} 00:22:22.321 } 00:22:22.321 ] 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.321 06:38:35 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:22.579 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.579 "name": "Existed_Raid", 00:22:22.579 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:22.579 "strip_size_kb": 64, 00:22:22.579 "state": "online", 00:22:22.579 "raid_level": "concat", 00:22:22.579 "superblock": true, 00:22:22.579 "num_base_bdevs": 4, 00:22:22.579 "num_base_bdevs_discovered": 4, 00:22:22.579 "num_base_bdevs_operational": 4, 00:22:22.579 "base_bdevs_list": [ 00:22:22.579 { 00:22:22.579 "name": "BaseBdev1", 00:22:22.579 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:22.579 "is_configured": true, 00:22:22.579 "data_offset": 2048, 00:22:22.579 "data_size": 63488 00:22:22.579 }, 00:22:22.579 { 00:22:22.579 "name": "BaseBdev2", 00:22:22.579 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:22.579 "is_configured": true, 00:22:22.579 "data_offset": 2048, 00:22:22.579 "data_size": 63488 00:22:22.579 }, 00:22:22.579 { 00:22:22.579 "name": "BaseBdev3", 00:22:22.579 "uuid": "aea24b47-2b5c-4c38-b524-dad2033447ec", 00:22:22.579 "is_configured": true, 00:22:22.579 "data_offset": 2048, 00:22:22.579 "data_size": 63488 00:22:22.579 }, 00:22:22.579 { 00:22:22.579 "name": "BaseBdev4", 00:22:22.579 "uuid": "51ce7b0f-9b64-47e3-b1f8-c748d744d423", 00:22:22.579 "is_configured": true, 00:22:22.579 "data_offset": 2048, 00:22:22.579 "data_size": 63488 00:22:22.579 } 00:22:22.579 ] 00:22:22.579 }' 00:22:22.579 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.580 06:38:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:23.146 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:23.404 [2024-07-25 06:38:36.821145] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.404 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:23.404 "name": "Existed_Raid", 00:22:23.404 "aliases": [ 00:22:23.404 "902767f6-772d-4758-b1ab-0e0959edbd9c" 00:22:23.404 ], 00:22:23.404 "product_name": "Raid Volume", 00:22:23.404 "block_size": 512, 00:22:23.404 "num_blocks": 253952, 00:22:23.404 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:23.404 "assigned_rate_limits": { 00:22:23.404 "rw_ios_per_sec": 0, 00:22:23.404 "rw_mbytes_per_sec": 0, 00:22:23.404 "r_mbytes_per_sec": 0, 00:22:23.404 "w_mbytes_per_sec": 0 00:22:23.404 }, 00:22:23.404 "claimed": false, 00:22:23.404 "zoned": false, 00:22:23.404 "supported_io_types": { 
00:22:23.404 "read": true, 00:22:23.404 "write": true, 00:22:23.404 "unmap": true, 00:22:23.404 "flush": true, 00:22:23.404 "reset": true, 00:22:23.404 "nvme_admin": false, 00:22:23.404 "nvme_io": false, 00:22:23.404 "nvme_io_md": false, 00:22:23.404 "write_zeroes": true, 00:22:23.404 "zcopy": false, 00:22:23.404 "get_zone_info": false, 00:22:23.404 "zone_management": false, 00:22:23.404 "zone_append": false, 00:22:23.404 "compare": false, 00:22:23.404 "compare_and_write": false, 00:22:23.404 "abort": false, 00:22:23.404 "seek_hole": false, 00:22:23.404 "seek_data": false, 00:22:23.404 "copy": false, 00:22:23.404 "nvme_iov_md": false 00:22:23.404 }, 00:22:23.404 "memory_domains": [ 00:22:23.404 { 00:22:23.404 "dma_device_id": "system", 00:22:23.404 "dma_device_type": 1 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.404 "dma_device_type": 2 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "system", 00:22:23.404 "dma_device_type": 1 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.404 "dma_device_type": 2 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "system", 00:22:23.404 "dma_device_type": 1 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.404 "dma_device_type": 2 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "system", 00:22:23.404 "dma_device_type": 1 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.404 "dma_device_type": 2 00:22:23.404 } 00:22:23.404 ], 00:22:23.404 "driver_specific": { 00:22:23.404 "raid": { 00:22:23.404 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:23.404 "strip_size_kb": 64, 00:22:23.404 "state": "online", 00:22:23.404 "raid_level": "concat", 00:22:23.404 "superblock": true, 00:22:23.404 "num_base_bdevs": 4, 00:22:23.404 "num_base_bdevs_discovered": 4, 00:22:23.404 "num_base_bdevs_operational": 4, 00:22:23.404 "base_bdevs_list": [ 00:22:23.404 { 00:22:23.404 "name": "BaseBdev1", 00:22:23.404 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:23.404 "is_configured": true, 00:22:23.404 "data_offset": 2048, 00:22:23.404 "data_size": 63488 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "name": "BaseBdev2", 00:22:23.404 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:23.404 "is_configured": true, 00:22:23.404 "data_offset": 2048, 00:22:23.404 "data_size": 63488 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "name": "BaseBdev3", 00:22:23.404 "uuid": "aea24b47-2b5c-4c38-b524-dad2033447ec", 00:22:23.404 "is_configured": true, 00:22:23.404 "data_offset": 2048, 00:22:23.404 "data_size": 63488 00:22:23.404 }, 00:22:23.404 { 00:22:23.404 "name": "BaseBdev4", 00:22:23.404 "uuid": "51ce7b0f-9b64-47e3-b1f8-c748d744d423", 00:22:23.404 "is_configured": true, 00:22:23.404 "data_offset": 2048, 00:22:23.404 "data_size": 63488 00:22:23.405 } 00:22:23.405 ] 00:22:23.405 } 00:22:23.405 } 00:22:23.405 }' 00:22:23.405 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:23.405 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:23.405 BaseBdev2 00:22:23.405 BaseBdev3 00:22:23.405 BaseBdev4' 00:22:23.405 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:23.405 06:38:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:23.405 06:38:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:23.663 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:23.663 "name": "BaseBdev1", 00:22:23.663 "aliases": [ 00:22:23.663 "c7ae05ac-9223-49c2-853e-c11537cf82c9" 00:22:23.663 ], 00:22:23.663 "product_name": "Malloc disk", 00:22:23.663 "block_size": 512, 00:22:23.663 "num_blocks": 65536, 00:22:23.663 "uuid": "c7ae05ac-9223-49c2-853e-c11537cf82c9", 00:22:23.663 "assigned_rate_limits": { 00:22:23.663 "rw_ios_per_sec": 0, 00:22:23.663 "rw_mbytes_per_sec": 0, 00:22:23.663 "r_mbytes_per_sec": 0, 00:22:23.663 "w_mbytes_per_sec": 0 00:22:23.663 }, 00:22:23.663 "claimed": true, 00:22:23.663 "claim_type": "exclusive_write", 00:22:23.663 "zoned": false, 00:22:23.663 "supported_io_types": { 00:22:23.663 "read": true, 00:22:23.663 "write": true, 00:22:23.663 "unmap": true, 00:22:23.663 "flush": true, 00:22:23.663 "reset": true, 00:22:23.663 "nvme_admin": false, 00:22:23.663 "nvme_io": false, 00:22:23.663 "nvme_io_md": false, 00:22:23.663 "write_zeroes": true, 00:22:23.663 "zcopy": true, 00:22:23.663 "get_zone_info": false, 00:22:23.663 "zone_management": false, 00:22:23.663 "zone_append": false, 00:22:23.663 "compare": false, 00:22:23.663 "compare_and_write": false, 00:22:23.663 "abort": true, 00:22:23.663 "seek_hole": false, 00:22:23.663 "seek_data": false, 00:22:23.663 "copy": true, 00:22:23.663 "nvme_iov_md": false 00:22:23.663 }, 00:22:23.663 "memory_domains": [ 00:22:23.663 { 00:22:23.663 "dma_device_id": "system", 00:22:23.663 "dma_device_type": 1 00:22:23.663 }, 00:22:23.663 { 00:22:23.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.663 "dma_device_type": 2 00:22:23.663 } 00:22:23.663 ], 00:22:23.663 "driver_specific": {} 00:22:23.663 }' 00:22:23.663 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:23.663 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:23.663 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:23.663 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:23.663 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b BaseBdev2 00:22:23.920 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.485 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.485 "name": "BaseBdev2", 00:22:24.485 "aliases": [ 00:22:24.485 "b2604905-974c-470c-827a-8167b47824aa" 00:22:24.485 ], 00:22:24.485 "product_name": "Malloc disk", 00:22:24.485 "block_size": 512, 00:22:24.485 "num_blocks": 65536, 00:22:24.485 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:24.485 "assigned_rate_limits": { 00:22:24.485 "rw_ios_per_sec": 0, 00:22:24.485 "rw_mbytes_per_sec": 0, 00:22:24.485 "r_mbytes_per_sec": 0, 00:22:24.485 "w_mbytes_per_sec": 0 00:22:24.485 }, 00:22:24.485 "claimed": true, 00:22:24.485 "claim_type": "exclusive_write", 00:22:24.485 "zoned": false, 00:22:24.485 "supported_io_types": { 00:22:24.485 "read": true, 00:22:24.485 "write": true, 00:22:24.485 "unmap": true, 00:22:24.485 "flush": true, 00:22:24.485 "reset": true, 00:22:24.485 "nvme_admin": false, 00:22:24.485 "nvme_io": false, 00:22:24.485 "nvme_io_md": false, 00:22:24.485 "write_zeroes": true, 00:22:24.485 "zcopy": true, 00:22:24.485 "get_zone_info": false, 00:22:24.485 "zone_management": false, 00:22:24.485 "zone_append": false, 00:22:24.485 "compare": false, 00:22:24.485 "compare_and_write": false, 00:22:24.485 "abort": true, 00:22:24.485 "seek_hole": false, 00:22:24.485 "seek_data": false, 00:22:24.485 "copy": true, 00:22:24.485 "nvme_iov_md": false 00:22:24.485 }, 00:22:24.485 "memory_domains": [ 00:22:24.485 { 00:22:24.485 "dma_device_id": "system", 00:22:24.485 "dma_device_type": 1 00:22:24.485 }, 00:22:24.485 { 00:22:24.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.485 "dma_device_type": 2 00:22:24.485 } 00:22:24.485 ], 00:22:24.485 "driver_specific": {} 00:22:24.485 }' 00:22:24.485 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.485 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.485 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.485 06:38:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.485 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:24.743 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.001 06:38:38 
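For reference, each per-member check traced above reduces to one bdev_get_bdevs call over the test RPC socket plus a few jq field comparisons. A minimal standalone sketch of the BaseBdev1 pass (shell variable names are illustrative; the socket path, rpc.py path, bdev name, jq filters and expected values are the ones shown in the trace):

    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # Fetch the single bdev object and verify the fields the test asserts on.
    info=$("$rpc" -s "$sock" bdev_get_bdevs -b BaseBdev1 | jq '.[]')
    [[ $(jq .block_size <<< "$info") == 512 ]]      # block size is 512 in this run
    [[ $(jq .md_size <<< "$info") == null ]]        # no separate metadata
    [[ $(jq .md_interleave <<< "$info") == null ]]
    [[ $(jq .dif_type <<< "$info") == null ]]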
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.001 "name": "BaseBdev3", 00:22:25.001 "aliases": [ 00:22:25.001 "aea24b47-2b5c-4c38-b524-dad2033447ec" 00:22:25.001 ], 00:22:25.001 "product_name": "Malloc disk", 00:22:25.001 "block_size": 512, 00:22:25.001 "num_blocks": 65536, 00:22:25.001 "uuid": "aea24b47-2b5c-4c38-b524-dad2033447ec", 00:22:25.001 "assigned_rate_limits": { 00:22:25.001 "rw_ios_per_sec": 0, 00:22:25.001 "rw_mbytes_per_sec": 0, 00:22:25.001 "r_mbytes_per_sec": 0, 00:22:25.001 "w_mbytes_per_sec": 0 00:22:25.001 }, 00:22:25.001 "claimed": true, 00:22:25.001 "claim_type": "exclusive_write", 00:22:25.001 "zoned": false, 00:22:25.001 "supported_io_types": { 00:22:25.001 "read": true, 00:22:25.001 "write": true, 00:22:25.001 "unmap": true, 00:22:25.001 "flush": true, 00:22:25.001 "reset": true, 00:22:25.001 "nvme_admin": false, 00:22:25.001 "nvme_io": false, 00:22:25.001 "nvme_io_md": false, 00:22:25.001 "write_zeroes": true, 00:22:25.001 "zcopy": true, 00:22:25.001 "get_zone_info": false, 00:22:25.001 "zone_management": false, 00:22:25.001 "zone_append": false, 00:22:25.001 "compare": false, 00:22:25.001 "compare_and_write": false, 00:22:25.001 "abort": true, 00:22:25.001 "seek_hole": false, 00:22:25.001 "seek_data": false, 00:22:25.001 "copy": true, 00:22:25.001 "nvme_iov_md": false 00:22:25.001 }, 00:22:25.001 "memory_domains": [ 00:22:25.001 { 00:22:25.001 "dma_device_id": "system", 00:22:25.001 "dma_device_type": 1 00:22:25.001 }, 00:22:25.001 { 00:22:25.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.001 "dma_device_type": 2 00:22:25.001 } 00:22:25.001 ], 00:22:25.001 "driver_specific": {} 00:22:25.001 }' 00:22:25.001 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.001 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.001 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.001 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.259 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.260 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:25.260 06:38:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.826 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.826 "name": "BaseBdev4", 00:22:25.826 
"aliases": [ 00:22:25.826 "51ce7b0f-9b64-47e3-b1f8-c748d744d423" 00:22:25.826 ], 00:22:25.826 "product_name": "Malloc disk", 00:22:25.826 "block_size": 512, 00:22:25.826 "num_blocks": 65536, 00:22:25.826 "uuid": "51ce7b0f-9b64-47e3-b1f8-c748d744d423", 00:22:25.826 "assigned_rate_limits": { 00:22:25.826 "rw_ios_per_sec": 0, 00:22:25.826 "rw_mbytes_per_sec": 0, 00:22:25.826 "r_mbytes_per_sec": 0, 00:22:25.826 "w_mbytes_per_sec": 0 00:22:25.826 }, 00:22:25.826 "claimed": true, 00:22:25.826 "claim_type": "exclusive_write", 00:22:25.826 "zoned": false, 00:22:25.826 "supported_io_types": { 00:22:25.826 "read": true, 00:22:25.826 "write": true, 00:22:25.826 "unmap": true, 00:22:25.826 "flush": true, 00:22:25.826 "reset": true, 00:22:25.826 "nvme_admin": false, 00:22:25.826 "nvme_io": false, 00:22:25.826 "nvme_io_md": false, 00:22:25.826 "write_zeroes": true, 00:22:25.826 "zcopy": true, 00:22:25.826 "get_zone_info": false, 00:22:25.826 "zone_management": false, 00:22:25.826 "zone_append": false, 00:22:25.826 "compare": false, 00:22:25.826 "compare_and_write": false, 00:22:25.826 "abort": true, 00:22:25.826 "seek_hole": false, 00:22:25.826 "seek_data": false, 00:22:25.826 "copy": true, 00:22:25.826 "nvme_iov_md": false 00:22:25.826 }, 00:22:25.826 "memory_domains": [ 00:22:25.826 { 00:22:25.826 "dma_device_id": "system", 00:22:25.826 "dma_device_type": 1 00:22:25.826 }, 00:22:25.826 { 00:22:25.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.826 "dma_device_type": 2 00:22:25.826 } 00:22:25.826 ], 00:22:25.826 "driver_specific": {} 00:22:25.826 }' 00:22:25.826 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.826 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.827 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.827 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.827 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.085 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:26.344 [2024-07-25 06:38:39.796714] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:26.344 [2024-07-25 06:38:39.796739] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:26.344 [2024-07-25 06:38:39.796783] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:26.344 06:38:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:26.344 06:38:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.603 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.603 "name": "Existed_Raid", 00:22:26.603 "uuid": "902767f6-772d-4758-b1ab-0e0959edbd9c", 00:22:26.603 "strip_size_kb": 64, 00:22:26.603 "state": "offline", 00:22:26.603 "raid_level": "concat", 00:22:26.603 "superblock": true, 00:22:26.603 "num_base_bdevs": 4, 00:22:26.603 "num_base_bdevs_discovered": 3, 00:22:26.603 "num_base_bdevs_operational": 3, 00:22:26.603 "base_bdevs_list": [ 00:22:26.603 { 00:22:26.603 "name": null, 00:22:26.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.603 "is_configured": false, 00:22:26.603 "data_offset": 2048, 00:22:26.603 "data_size": 63488 00:22:26.603 }, 00:22:26.603 { 00:22:26.603 "name": "BaseBdev2", 00:22:26.603 "uuid": "b2604905-974c-470c-827a-8167b47824aa", 00:22:26.603 "is_configured": true, 00:22:26.603 "data_offset": 2048, 00:22:26.603 "data_size": 63488 00:22:26.603 }, 00:22:26.603 { 00:22:26.603 "name": "BaseBdev3", 00:22:26.603 "uuid": "aea24b47-2b5c-4c38-b524-dad2033447ec", 00:22:26.603 "is_configured": true, 00:22:26.603 "data_offset": 2048, 00:22:26.603 "data_size": 63488 00:22:26.603 }, 00:22:26.603 { 00:22:26.603 "name": "BaseBdev4", 00:22:26.603 "uuid": "51ce7b0f-9b64-47e3-b1f8-c748d744d423", 00:22:26.603 "is_configured": true, 00:22:26.603 "data_offset": 2048, 00:22:26.603 "data_size": 63488 00:22:26.603 } 00:22:26.603 ] 00:22:26.603 }' 00:22:26.603 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.603 06:38:40 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:22:27.168 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:27.168 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:27.169 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.169 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:27.425 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:27.425 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:27.425 06:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:27.683 [2024-07-25 06:38:41.073227] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:27.683 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:27.683 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:27.683 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.683 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:27.940 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:27.940 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:27.940 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:28.198 [2024-07-25 06:38:41.520590] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:28.198 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:28.198 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:28.198 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.198 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:28.457 [2024-07-25 06:38:41.967800] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:28.457 [2024-07-25 06:38:41.967837] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce6250 name Existed_Raid, state offline 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ 
)) 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.457 06:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:28.715 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:28.715 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:28.715 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:28.715 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:28.715 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:28.715 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:28.974 BaseBdev2 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:28.974 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:29.232 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:29.491 [ 00:22:29.491 { 00:22:29.491 "name": "BaseBdev2", 00:22:29.491 "aliases": [ 00:22:29.491 "7ae18713-a1b6-4de4-910e-ed700a36f661" 00:22:29.491 ], 00:22:29.491 "product_name": "Malloc disk", 00:22:29.491 "block_size": 512, 00:22:29.491 "num_blocks": 65536, 00:22:29.491 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:29.491 "assigned_rate_limits": { 00:22:29.491 "rw_ios_per_sec": 0, 00:22:29.491 "rw_mbytes_per_sec": 0, 00:22:29.491 "r_mbytes_per_sec": 0, 00:22:29.491 "w_mbytes_per_sec": 0 00:22:29.491 }, 00:22:29.491 "claimed": false, 00:22:29.491 "zoned": false, 00:22:29.491 "supported_io_types": { 00:22:29.491 "read": true, 00:22:29.491 "write": true, 00:22:29.491 "unmap": true, 00:22:29.491 "flush": true, 00:22:29.491 "reset": true, 00:22:29.491 "nvme_admin": false, 00:22:29.491 "nvme_io": false, 00:22:29.491 "nvme_io_md": false, 00:22:29.491 "write_zeroes": true, 00:22:29.491 "zcopy": true, 00:22:29.491 "get_zone_info": false, 00:22:29.491 "zone_management": false, 00:22:29.491 "zone_append": false, 00:22:29.491 "compare": false, 00:22:29.491 "compare_and_write": false, 00:22:29.491 "abort": true, 00:22:29.491 "seek_hole": false, 00:22:29.491 "seek_data": false, 00:22:29.491 
"copy": true, 00:22:29.491 "nvme_iov_md": false 00:22:29.491 }, 00:22:29.491 "memory_domains": [ 00:22:29.491 { 00:22:29.491 "dma_device_id": "system", 00:22:29.491 "dma_device_type": 1 00:22:29.491 }, 00:22:29.491 { 00:22:29.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:29.491 "dma_device_type": 2 00:22:29.491 } 00:22:29.491 ], 00:22:29.491 "driver_specific": {} 00:22:29.491 } 00:22:29.491 ] 00:22:29.491 06:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:29.491 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:29.491 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:29.491 06:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:29.491 BaseBdev3 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:29.750 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:30.010 [ 00:22:30.010 { 00:22:30.010 "name": "BaseBdev3", 00:22:30.010 "aliases": [ 00:22:30.010 "44b9ece7-4f02-4cad-a0a6-13f769d521e3" 00:22:30.010 ], 00:22:30.010 "product_name": "Malloc disk", 00:22:30.010 "block_size": 512, 00:22:30.010 "num_blocks": 65536, 00:22:30.010 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:30.010 "assigned_rate_limits": { 00:22:30.010 "rw_ios_per_sec": 0, 00:22:30.010 "rw_mbytes_per_sec": 0, 00:22:30.010 "r_mbytes_per_sec": 0, 00:22:30.010 "w_mbytes_per_sec": 0 00:22:30.010 }, 00:22:30.010 "claimed": false, 00:22:30.010 "zoned": false, 00:22:30.010 "supported_io_types": { 00:22:30.010 "read": true, 00:22:30.010 "write": true, 00:22:30.010 "unmap": true, 00:22:30.010 "flush": true, 00:22:30.010 "reset": true, 00:22:30.010 "nvme_admin": false, 00:22:30.010 "nvme_io": false, 00:22:30.010 "nvme_io_md": false, 00:22:30.010 "write_zeroes": true, 00:22:30.010 "zcopy": true, 00:22:30.010 "get_zone_info": false, 00:22:30.010 "zone_management": false, 00:22:30.010 "zone_append": false, 00:22:30.010 "compare": false, 00:22:30.010 "compare_and_write": false, 00:22:30.011 "abort": true, 00:22:30.011 "seek_hole": false, 00:22:30.011 "seek_data": false, 00:22:30.011 "copy": true, 00:22:30.011 "nvme_iov_md": false 00:22:30.011 }, 00:22:30.011 "memory_domains": [ 00:22:30.011 { 00:22:30.011 "dma_device_id": "system", 00:22:30.011 "dma_device_type": 1 00:22:30.011 }, 00:22:30.011 { 00:22:30.011 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:22:30.011 "dma_device_type": 2 00:22:30.011 } 00:22:30.011 ], 00:22:30.011 "driver_specific": {} 00:22:30.011 } 00:22:30.011 ] 00:22:30.011 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:30.011 06:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:30.011 06:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:30.011 06:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:30.270 BaseBdev4 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:30.270 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:30.528 06:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:30.786 [ 00:22:30.786 { 00:22:30.786 "name": "BaseBdev4", 00:22:30.786 "aliases": [ 00:22:30.786 "0dca4b94-cdf6-41d7-87aa-ededc455e2db" 00:22:30.786 ], 00:22:30.786 "product_name": "Malloc disk", 00:22:30.786 "block_size": 512, 00:22:30.786 "num_blocks": 65536, 00:22:30.786 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:30.786 "assigned_rate_limits": { 00:22:30.786 "rw_ios_per_sec": 0, 00:22:30.786 "rw_mbytes_per_sec": 0, 00:22:30.786 "r_mbytes_per_sec": 0, 00:22:30.786 "w_mbytes_per_sec": 0 00:22:30.786 }, 00:22:30.786 "claimed": false, 00:22:30.786 "zoned": false, 00:22:30.786 "supported_io_types": { 00:22:30.786 "read": true, 00:22:30.786 "write": true, 00:22:30.786 "unmap": true, 00:22:30.786 "flush": true, 00:22:30.786 "reset": true, 00:22:30.786 "nvme_admin": false, 00:22:30.786 "nvme_io": false, 00:22:30.786 "nvme_io_md": false, 00:22:30.786 "write_zeroes": true, 00:22:30.786 "zcopy": true, 00:22:30.786 "get_zone_info": false, 00:22:30.786 "zone_management": false, 00:22:30.786 "zone_append": false, 00:22:30.786 "compare": false, 00:22:30.786 "compare_and_write": false, 00:22:30.786 "abort": true, 00:22:30.786 "seek_hole": false, 00:22:30.786 "seek_data": false, 00:22:30.786 "copy": true, 00:22:30.786 "nvme_iov_md": false 00:22:30.786 }, 00:22:30.786 "memory_domains": [ 00:22:30.786 { 00:22:30.786 "dma_device_id": "system", 00:22:30.786 "dma_device_type": 1 00:22:30.786 }, 00:22:30.786 { 00:22:30.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.786 "dma_device_type": 2 00:22:30.786 } 00:22:30.786 ], 00:22:30.786 "driver_specific": {} 00:22:30.786 } 00:22:30.786 ] 00:22:30.786 06:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # 
return 0 00:22:30.786 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:30.786 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:30.786 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:31.045 [2024-07-25 06:38:44.385004] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:31.045 [2024-07-25 06:38:44.385042] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:31.045 [2024-07-25 06:38:44.385061] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:31.045 [2024-07-25 06:38:44.386270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:31.045 [2024-07-25 06:38:44.386310] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.045 "name": "Existed_Raid", 00:22:31.045 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:31.045 "strip_size_kb": 64, 00:22:31.045 "state": "configuring", 00:22:31.045 "raid_level": "concat", 00:22:31.045 "superblock": true, 00:22:31.045 "num_base_bdevs": 4, 00:22:31.045 "num_base_bdevs_discovered": 3, 00:22:31.045 "num_base_bdevs_operational": 4, 00:22:31.045 "base_bdevs_list": [ 00:22:31.045 { 00:22:31.045 "name": "BaseBdev1", 00:22:31.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:31.045 "is_configured": false, 00:22:31.045 "data_offset": 0, 00:22:31.045 "data_size": 0 00:22:31.045 }, 00:22:31.045 { 00:22:31.045 "name": "BaseBdev2", 00:22:31.045 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:31.045 "is_configured": true, 00:22:31.045 
"data_offset": 2048, 00:22:31.045 "data_size": 63488 00:22:31.045 }, 00:22:31.045 { 00:22:31.045 "name": "BaseBdev3", 00:22:31.045 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:31.045 "is_configured": true, 00:22:31.045 "data_offset": 2048, 00:22:31.045 "data_size": 63488 00:22:31.045 }, 00:22:31.045 { 00:22:31.045 "name": "BaseBdev4", 00:22:31.045 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:31.045 "is_configured": true, 00:22:31.045 "data_offset": 2048, 00:22:31.045 "data_size": 63488 00:22:31.045 } 00:22:31.045 ] 00:22:31.045 }' 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.045 06:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:31.980 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:32.239 [2024-07-25 06:38:45.604203] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.239 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:32.497 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.497 "name": "Existed_Raid", 00:22:32.497 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:32.497 "strip_size_kb": 64, 00:22:32.497 "state": "configuring", 00:22:32.497 "raid_level": "concat", 00:22:32.497 "superblock": true, 00:22:32.497 "num_base_bdevs": 4, 00:22:32.497 "num_base_bdevs_discovered": 2, 00:22:32.497 "num_base_bdevs_operational": 4, 00:22:32.497 "base_bdevs_list": [ 00:22:32.497 { 00:22:32.497 "name": "BaseBdev1", 00:22:32.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.497 "is_configured": false, 00:22:32.497 "data_offset": 0, 00:22:32.497 "data_size": 0 00:22:32.497 }, 00:22:32.497 { 00:22:32.497 "name": null, 00:22:32.497 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:32.497 "is_configured": false, 00:22:32.497 "data_offset": 2048, 00:22:32.497 "data_size": 63488 
00:22:32.497 }, 00:22:32.497 { 00:22:32.497 "name": "BaseBdev3", 00:22:32.497 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:32.497 "is_configured": true, 00:22:32.497 "data_offset": 2048, 00:22:32.497 "data_size": 63488 00:22:32.497 }, 00:22:32.497 { 00:22:32.497 "name": "BaseBdev4", 00:22:32.498 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:32.498 "is_configured": true, 00:22:32.498 "data_offset": 2048, 00:22:32.498 "data_size": 63488 00:22:32.498 } 00:22:32.498 ] 00:22:32.498 }' 00:22:32.498 06:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.498 06:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.094 06:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.094 06:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:33.094 06:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:33.094 06:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:33.353 [2024-07-25 06:38:46.826685] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:33.353 BaseBdev1 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:33.353 06:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:33.611 06:38:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:33.870 [ 00:22:33.870 { 00:22:33.870 "name": "BaseBdev1", 00:22:33.870 "aliases": [ 00:22:33.870 "fa5a2480-8ac6-4649-b81b-010ca0017329" 00:22:33.870 ], 00:22:33.870 "product_name": "Malloc disk", 00:22:33.870 "block_size": 512, 00:22:33.870 "num_blocks": 65536, 00:22:33.870 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:33.870 "assigned_rate_limits": { 00:22:33.870 "rw_ios_per_sec": 0, 00:22:33.870 "rw_mbytes_per_sec": 0, 00:22:33.870 "r_mbytes_per_sec": 0, 00:22:33.870 "w_mbytes_per_sec": 0 00:22:33.870 }, 00:22:33.870 "claimed": true, 00:22:33.870 "claim_type": "exclusive_write", 00:22:33.870 "zoned": false, 00:22:33.870 "supported_io_types": { 00:22:33.870 "read": true, 00:22:33.870 "write": true, 00:22:33.870 "unmap": true, 00:22:33.870 "flush": true, 00:22:33.870 "reset": true, 00:22:33.870 "nvme_admin": false, 00:22:33.870 "nvme_io": false, 00:22:33.870 "nvme_io_md": 
false, 00:22:33.870 "write_zeroes": true, 00:22:33.870 "zcopy": true, 00:22:33.870 "get_zone_info": false, 00:22:33.870 "zone_management": false, 00:22:33.870 "zone_append": false, 00:22:33.870 "compare": false, 00:22:33.870 "compare_and_write": false, 00:22:33.870 "abort": true, 00:22:33.870 "seek_hole": false, 00:22:33.870 "seek_data": false, 00:22:33.870 "copy": true, 00:22:33.870 "nvme_iov_md": false 00:22:33.870 }, 00:22:33.870 "memory_domains": [ 00:22:33.870 { 00:22:33.870 "dma_device_id": "system", 00:22:33.870 "dma_device_type": 1 00:22:33.870 }, 00:22:33.870 { 00:22:33.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.870 "dma_device_type": 2 00:22:33.870 } 00:22:33.870 ], 00:22:33.870 "driver_specific": {} 00:22:33.870 } 00:22:33.870 ] 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.871 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.129 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.129 "name": "Existed_Raid", 00:22:34.129 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:34.129 "strip_size_kb": 64, 00:22:34.129 "state": "configuring", 00:22:34.129 "raid_level": "concat", 00:22:34.129 "superblock": true, 00:22:34.129 "num_base_bdevs": 4, 00:22:34.129 "num_base_bdevs_discovered": 3, 00:22:34.129 "num_base_bdevs_operational": 4, 00:22:34.129 "base_bdevs_list": [ 00:22:34.129 { 00:22:34.129 "name": "BaseBdev1", 00:22:34.129 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:34.129 "is_configured": true, 00:22:34.129 "data_offset": 2048, 00:22:34.129 "data_size": 63488 00:22:34.129 }, 00:22:34.129 { 00:22:34.129 "name": null, 00:22:34.129 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:34.129 "is_configured": false, 00:22:34.129 "data_offset": 2048, 00:22:34.129 "data_size": 63488 00:22:34.129 }, 00:22:34.129 { 00:22:34.129 "name": "BaseBdev3", 00:22:34.129 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:34.129 "is_configured": true, 00:22:34.129 "data_offset": 2048, 00:22:34.129 
"data_size": 63488 00:22:34.129 }, 00:22:34.129 { 00:22:34.129 "name": "BaseBdev4", 00:22:34.129 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:34.129 "is_configured": true, 00:22:34.130 "data_offset": 2048, 00:22:34.130 "data_size": 63488 00:22:34.130 } 00:22:34.130 ] 00:22:34.130 }' 00:22:34.130 06:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.130 06:38:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.697 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.697 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:34.955 [2024-07-25 06:38:48.471056] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.955 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.214 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.214 "name": "Existed_Raid", 00:22:35.214 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:35.214 "strip_size_kb": 64, 00:22:35.214 "state": "configuring", 00:22:35.214 "raid_level": "concat", 00:22:35.214 "superblock": true, 00:22:35.214 "num_base_bdevs": 4, 00:22:35.214 "num_base_bdevs_discovered": 2, 00:22:35.214 "num_base_bdevs_operational": 4, 00:22:35.214 "base_bdevs_list": [ 00:22:35.214 { 00:22:35.214 "name": "BaseBdev1", 00:22:35.214 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:35.214 "is_configured": true, 00:22:35.214 "data_offset": 2048, 00:22:35.214 "data_size": 63488 00:22:35.214 }, 00:22:35.214 { 
00:22:35.214 "name": null, 00:22:35.214 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:35.214 "is_configured": false, 00:22:35.214 "data_offset": 2048, 00:22:35.214 "data_size": 63488 00:22:35.214 }, 00:22:35.214 { 00:22:35.214 "name": null, 00:22:35.214 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:35.214 "is_configured": false, 00:22:35.214 "data_offset": 2048, 00:22:35.214 "data_size": 63488 00:22:35.214 }, 00:22:35.214 { 00:22:35.214 "name": "BaseBdev4", 00:22:35.214 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:35.214 "is_configured": true, 00:22:35.214 "data_offset": 2048, 00:22:35.214 "data_size": 63488 00:22:35.214 } 00:22:35.214 ] 00:22:35.214 }' 00:22:35.214 06:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.214 06:38:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:35.781 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.781 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:36.039 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:36.039 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:36.298 [2024-07-25 06:38:49.698297] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.298 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:36.557 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.557 "name": "Existed_Raid", 00:22:36.557 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:36.557 "strip_size_kb": 64, 00:22:36.557 "state": "configuring", 00:22:36.557 "raid_level": "concat", 
00:22:36.557 "superblock": true, 00:22:36.557 "num_base_bdevs": 4, 00:22:36.557 "num_base_bdevs_discovered": 3, 00:22:36.557 "num_base_bdevs_operational": 4, 00:22:36.557 "base_bdevs_list": [ 00:22:36.557 { 00:22:36.557 "name": "BaseBdev1", 00:22:36.557 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:36.557 "is_configured": true, 00:22:36.557 "data_offset": 2048, 00:22:36.557 "data_size": 63488 00:22:36.557 }, 00:22:36.557 { 00:22:36.557 "name": null, 00:22:36.557 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:36.557 "is_configured": false, 00:22:36.557 "data_offset": 2048, 00:22:36.557 "data_size": 63488 00:22:36.557 }, 00:22:36.557 { 00:22:36.557 "name": "BaseBdev3", 00:22:36.557 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:36.557 "is_configured": true, 00:22:36.557 "data_offset": 2048, 00:22:36.557 "data_size": 63488 00:22:36.557 }, 00:22:36.557 { 00:22:36.557 "name": "BaseBdev4", 00:22:36.557 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:36.557 "is_configured": true, 00:22:36.557 "data_offset": 2048, 00:22:36.557 "data_size": 63488 00:22:36.557 } 00:22:36.557 ] 00:22:36.557 }' 00:22:36.557 06:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.557 06:38:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:37.123 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:37.123 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.123 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:37.123 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:37.382 [2024-07-25 06:38:50.765114] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.382 06:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.382 06:38:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.641 06:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.641 "name": "Existed_Raid", 00:22:37.641 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:37.641 "strip_size_kb": 64, 00:22:37.641 "state": "configuring", 00:22:37.641 "raid_level": "concat", 00:22:37.641 "superblock": true, 00:22:37.641 "num_base_bdevs": 4, 00:22:37.641 "num_base_bdevs_discovered": 2, 00:22:37.641 "num_base_bdevs_operational": 4, 00:22:37.641 "base_bdevs_list": [ 00:22:37.641 { 00:22:37.641 "name": null, 00:22:37.641 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:37.641 "is_configured": false, 00:22:37.641 "data_offset": 2048, 00:22:37.641 "data_size": 63488 00:22:37.641 }, 00:22:37.641 { 00:22:37.641 "name": null, 00:22:37.641 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:37.641 "is_configured": false, 00:22:37.641 "data_offset": 2048, 00:22:37.641 "data_size": 63488 00:22:37.641 }, 00:22:37.641 { 00:22:37.641 "name": "BaseBdev3", 00:22:37.641 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:37.641 "is_configured": true, 00:22:37.641 "data_offset": 2048, 00:22:37.641 "data_size": 63488 00:22:37.641 }, 00:22:37.641 { 00:22:37.641 "name": "BaseBdev4", 00:22:37.641 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:37.641 "is_configured": true, 00:22:37.641 "data_offset": 2048, 00:22:37.641 "data_size": 63488 00:22:37.641 } 00:22:37.641 ] 00:22:37.641 }' 00:22:37.641 06:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.641 06:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.208 06:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.208 06:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:38.466 06:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:38.466 06:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:38.724 [2024-07-25 06:38:52.034461] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.725 06:38:52 
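The verify_raid_bdev_state calls that recur through this run all follow the pattern visible in the trace: fetch the raid bdev record once with bdev_raid_get_bdevs and compare a handful of jq-extracted fields against the expected values. The helper's exact body lives in bdev_raid.sh and is not reproduced in this log, so the following is only a rough standalone equivalent of the "configuring concat 64 4" expectation above (variable names are ours):

    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    raid=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(jq -r .state <<< "$raid") == configuring ]]
    [[ $(jq -r .raid_level <<< "$raid") == concat ]]
    [[ $(jq -r .strip_size_kb <<< "$raid") == 64 ]]
    [[ $(jq -r .num_base_bdevs_operational <<< "$raid") == 4 ]]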
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.725 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:38.983 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.983 "name": "Existed_Raid", 00:22:38.983 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:38.983 "strip_size_kb": 64, 00:22:38.983 "state": "configuring", 00:22:38.983 "raid_level": "concat", 00:22:38.983 "superblock": true, 00:22:38.983 "num_base_bdevs": 4, 00:22:38.983 "num_base_bdevs_discovered": 3, 00:22:38.983 "num_base_bdevs_operational": 4, 00:22:38.983 "base_bdevs_list": [ 00:22:38.983 { 00:22:38.983 "name": null, 00:22:38.983 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:38.983 "is_configured": false, 00:22:38.983 "data_offset": 2048, 00:22:38.983 "data_size": 63488 00:22:38.983 }, 00:22:38.983 { 00:22:38.983 "name": "BaseBdev2", 00:22:38.983 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:38.983 "is_configured": true, 00:22:38.983 "data_offset": 2048, 00:22:38.983 "data_size": 63488 00:22:38.983 }, 00:22:38.983 { 00:22:38.983 "name": "BaseBdev3", 00:22:38.983 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:38.983 "is_configured": true, 00:22:38.983 "data_offset": 2048, 00:22:38.983 "data_size": 63488 00:22:38.983 }, 00:22:38.983 { 00:22:38.983 "name": "BaseBdev4", 00:22:38.983 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:38.983 "is_configured": true, 00:22:38.983 "data_offset": 2048, 00:22:38.983 "data_size": 63488 00:22:38.983 } 00:22:38.983 ] 00:22:38.983 }' 00:22:38.983 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.983 06:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:39.549 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.549 06:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:39.549 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:39.549 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.549 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:39.808 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fa5a2480-8ac6-4649-b81b-010ca0017329 00:22:40.065 [2024-07-25 06:38:53.529616] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:40.065 [2024-07-25 06:38:53.529764] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b34710 00:22:40.065 [2024-07-25 06:38:53.529775] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:40.065 [2024-07-25 06:38:53.529940] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b2c000 00:22:40.065 [2024-07-25 06:38:53.530044] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b34710 00:22:40.065 [2024-07-25 06:38:53.530053] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b34710 00:22:40.065 [2024-07-25 06:38:53.530134] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.065 NewBaseBdev 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:40.065 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:40.322 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:40.580 [ 00:22:40.580 { 00:22:40.580 "name": "NewBaseBdev", 00:22:40.580 "aliases": [ 00:22:40.580 "fa5a2480-8ac6-4649-b81b-010ca0017329" 00:22:40.580 ], 00:22:40.581 "product_name": "Malloc disk", 00:22:40.581 "block_size": 512, 00:22:40.581 "num_blocks": 65536, 00:22:40.581 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:40.581 "assigned_rate_limits": { 00:22:40.581 "rw_ios_per_sec": 0, 00:22:40.581 "rw_mbytes_per_sec": 0, 00:22:40.581 "r_mbytes_per_sec": 0, 00:22:40.581 "w_mbytes_per_sec": 0 00:22:40.581 }, 00:22:40.581 "claimed": true, 00:22:40.581 "claim_type": "exclusive_write", 00:22:40.581 "zoned": false, 00:22:40.581 "supported_io_types": { 00:22:40.581 "read": true, 00:22:40.581 "write": true, 00:22:40.581 "unmap": true, 00:22:40.581 "flush": true, 00:22:40.581 "reset": true, 00:22:40.581 "nvme_admin": false, 00:22:40.581 "nvme_io": false, 00:22:40.581 "nvme_io_md": false, 00:22:40.581 "write_zeroes": true, 00:22:40.581 "zcopy": true, 00:22:40.581 "get_zone_info": false, 00:22:40.581 "zone_management": false, 00:22:40.581 "zone_append": false, 00:22:40.581 "compare": false, 00:22:40.581 "compare_and_write": false, 00:22:40.581 "abort": true, 00:22:40.581 "seek_hole": false, 00:22:40.581 "seek_data": false, 00:22:40.581 "copy": true, 00:22:40.581 "nvme_iov_md": false 00:22:40.581 }, 00:22:40.581 "memory_domains": [ 00:22:40.581 { 00:22:40.581 "dma_device_id": "system", 00:22:40.581 "dma_device_type": 1 00:22:40.581 }, 00:22:40.581 { 00:22:40.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.581 "dma_device_type": 2 00:22:40.581 } 00:22:40.581 ], 00:22:40.581 "driver_specific": {} 00:22:40.581 } 00:22:40.581 ] 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:40.581 06:38:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.581 06:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.581 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.581 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:40.840 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.840 "name": "Existed_Raid", 00:22:40.840 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:40.840 "strip_size_kb": 64, 00:22:40.840 "state": "online", 00:22:40.840 "raid_level": "concat", 00:22:40.840 "superblock": true, 00:22:40.840 "num_base_bdevs": 4, 00:22:40.840 "num_base_bdevs_discovered": 4, 00:22:40.840 "num_base_bdevs_operational": 4, 00:22:40.840 "base_bdevs_list": [ 00:22:40.840 { 00:22:40.840 "name": "NewBaseBdev", 00:22:40.840 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:40.840 "is_configured": true, 00:22:40.840 "data_offset": 2048, 00:22:40.840 "data_size": 63488 00:22:40.840 }, 00:22:40.840 { 00:22:40.840 "name": "BaseBdev2", 00:22:40.840 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:40.840 "is_configured": true, 00:22:40.840 "data_offset": 2048, 00:22:40.840 "data_size": 63488 00:22:40.840 }, 00:22:40.840 { 00:22:40.840 "name": "BaseBdev3", 00:22:40.840 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:40.840 "is_configured": true, 00:22:40.840 "data_offset": 2048, 00:22:40.840 "data_size": 63488 00:22:40.840 }, 00:22:40.840 { 00:22:40.840 "name": "BaseBdev4", 00:22:40.840 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:40.840 "is_configured": true, 00:22:40.840 "data_offset": 2048, 00:22:40.840 "data_size": 63488 00:22:40.840 } 00:22:40.840 ] 00:22:40.840 }' 00:22:40.840 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.840 06:38:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:41.406 06:38:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:41.406 06:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:41.665 [2024-07-25 06:38:55.033890] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:41.665 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:41.665 "name": "Existed_Raid", 00:22:41.665 "aliases": [ 00:22:41.665 "01f4951c-df14-452d-957a-139878fd87e9" 00:22:41.665 ], 00:22:41.665 "product_name": "Raid Volume", 00:22:41.665 "block_size": 512, 00:22:41.665 "num_blocks": 253952, 00:22:41.665 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:41.665 "assigned_rate_limits": { 00:22:41.665 "rw_ios_per_sec": 0, 00:22:41.665 "rw_mbytes_per_sec": 0, 00:22:41.665 "r_mbytes_per_sec": 0, 00:22:41.665 "w_mbytes_per_sec": 0 00:22:41.665 }, 00:22:41.665 "claimed": false, 00:22:41.665 "zoned": false, 00:22:41.665 "supported_io_types": { 00:22:41.665 "read": true, 00:22:41.665 "write": true, 00:22:41.665 "unmap": true, 00:22:41.665 "flush": true, 00:22:41.665 "reset": true, 00:22:41.665 "nvme_admin": false, 00:22:41.665 "nvme_io": false, 00:22:41.665 "nvme_io_md": false, 00:22:41.665 "write_zeroes": true, 00:22:41.665 "zcopy": false, 00:22:41.665 "get_zone_info": false, 00:22:41.665 "zone_management": false, 00:22:41.665 "zone_append": false, 00:22:41.665 "compare": false, 00:22:41.665 "compare_and_write": false, 00:22:41.665 "abort": false, 00:22:41.665 "seek_hole": false, 00:22:41.665 "seek_data": false, 00:22:41.665 "copy": false, 00:22:41.665 "nvme_iov_md": false 00:22:41.665 }, 00:22:41.665 "memory_domains": [ 00:22:41.665 { 00:22:41.665 "dma_device_id": "system", 00:22:41.665 "dma_device_type": 1 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.665 "dma_device_type": 2 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "system", 00:22:41.665 "dma_device_type": 1 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.665 "dma_device_type": 2 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "system", 00:22:41.665 "dma_device_type": 1 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.665 "dma_device_type": 2 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "system", 00:22:41.665 "dma_device_type": 1 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.665 "dma_device_type": 2 00:22:41.665 } 00:22:41.665 ], 00:22:41.665 "driver_specific": { 00:22:41.665 "raid": { 00:22:41.665 "uuid": "01f4951c-df14-452d-957a-139878fd87e9", 00:22:41.665 "strip_size_kb": 64, 00:22:41.665 "state": "online", 00:22:41.665 "raid_level": "concat", 00:22:41.665 "superblock": true, 00:22:41.665 "num_base_bdevs": 4, 00:22:41.665 "num_base_bdevs_discovered": 4, 00:22:41.665 "num_base_bdevs_operational": 4, 00:22:41.665 "base_bdevs_list": [ 00:22:41.665 { 00:22:41.665 "name": "NewBaseBdev", 00:22:41.665 "uuid": 
"fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:41.665 "is_configured": true, 00:22:41.665 "data_offset": 2048, 00:22:41.665 "data_size": 63488 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "name": "BaseBdev2", 00:22:41.665 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:41.665 "is_configured": true, 00:22:41.665 "data_offset": 2048, 00:22:41.665 "data_size": 63488 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "name": "BaseBdev3", 00:22:41.665 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:41.665 "is_configured": true, 00:22:41.665 "data_offset": 2048, 00:22:41.665 "data_size": 63488 00:22:41.665 }, 00:22:41.665 { 00:22:41.665 "name": "BaseBdev4", 00:22:41.665 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:41.665 "is_configured": true, 00:22:41.665 "data_offset": 2048, 00:22:41.665 "data_size": 63488 00:22:41.665 } 00:22:41.665 ] 00:22:41.665 } 00:22:41.665 } 00:22:41.665 }' 00:22:41.665 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:41.665 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:41.665 BaseBdev2 00:22:41.665 BaseBdev3 00:22:41.665 BaseBdev4' 00:22:41.665 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.665 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:41.665 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.923 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.923 "name": "NewBaseBdev", 00:22:41.923 "aliases": [ 00:22:41.923 "fa5a2480-8ac6-4649-b81b-010ca0017329" 00:22:41.923 ], 00:22:41.923 "product_name": "Malloc disk", 00:22:41.923 "block_size": 512, 00:22:41.923 "num_blocks": 65536, 00:22:41.923 "uuid": "fa5a2480-8ac6-4649-b81b-010ca0017329", 00:22:41.923 "assigned_rate_limits": { 00:22:41.923 "rw_ios_per_sec": 0, 00:22:41.923 "rw_mbytes_per_sec": 0, 00:22:41.923 "r_mbytes_per_sec": 0, 00:22:41.923 "w_mbytes_per_sec": 0 00:22:41.923 }, 00:22:41.923 "claimed": true, 00:22:41.924 "claim_type": "exclusive_write", 00:22:41.924 "zoned": false, 00:22:41.924 "supported_io_types": { 00:22:41.924 "read": true, 00:22:41.924 "write": true, 00:22:41.924 "unmap": true, 00:22:41.924 "flush": true, 00:22:41.924 "reset": true, 00:22:41.924 "nvme_admin": false, 00:22:41.924 "nvme_io": false, 00:22:41.924 "nvme_io_md": false, 00:22:41.924 "write_zeroes": true, 00:22:41.924 "zcopy": true, 00:22:41.924 "get_zone_info": false, 00:22:41.924 "zone_management": false, 00:22:41.924 "zone_append": false, 00:22:41.924 "compare": false, 00:22:41.924 "compare_and_write": false, 00:22:41.924 "abort": true, 00:22:41.924 "seek_hole": false, 00:22:41.924 "seek_data": false, 00:22:41.924 "copy": true, 00:22:41.924 "nvme_iov_md": false 00:22:41.924 }, 00:22:41.924 "memory_domains": [ 00:22:41.924 { 00:22:41.924 "dma_device_id": "system", 00:22:41.924 "dma_device_type": 1 00:22:41.924 }, 00:22:41.924 { 00:22:41.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.924 "dma_device_type": 2 00:22:41.924 } 00:22:41.924 ], 00:22:41.924 "driver_specific": {} 00:22:41.924 }' 00:22:41.924 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.924 06:38:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.924 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:41.924 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.924 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:42.182 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:42.440 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:42.440 "name": "BaseBdev2", 00:22:42.440 "aliases": [ 00:22:42.440 "7ae18713-a1b6-4de4-910e-ed700a36f661" 00:22:42.440 ], 00:22:42.440 "product_name": "Malloc disk", 00:22:42.440 "block_size": 512, 00:22:42.440 "num_blocks": 65536, 00:22:42.440 "uuid": "7ae18713-a1b6-4de4-910e-ed700a36f661", 00:22:42.440 "assigned_rate_limits": { 00:22:42.440 "rw_ios_per_sec": 0, 00:22:42.440 "rw_mbytes_per_sec": 0, 00:22:42.441 "r_mbytes_per_sec": 0, 00:22:42.441 "w_mbytes_per_sec": 0 00:22:42.441 }, 00:22:42.441 "claimed": true, 00:22:42.441 "claim_type": "exclusive_write", 00:22:42.441 "zoned": false, 00:22:42.441 "supported_io_types": { 00:22:42.441 "read": true, 00:22:42.441 "write": true, 00:22:42.441 "unmap": true, 00:22:42.441 "flush": true, 00:22:42.441 "reset": true, 00:22:42.441 "nvme_admin": false, 00:22:42.441 "nvme_io": false, 00:22:42.441 "nvme_io_md": false, 00:22:42.441 "write_zeroes": true, 00:22:42.441 "zcopy": true, 00:22:42.441 "get_zone_info": false, 00:22:42.441 "zone_management": false, 00:22:42.441 "zone_append": false, 00:22:42.441 "compare": false, 00:22:42.441 "compare_and_write": false, 00:22:42.441 "abort": true, 00:22:42.441 "seek_hole": false, 00:22:42.441 "seek_data": false, 00:22:42.441 "copy": true, 00:22:42.441 "nvme_iov_md": false 00:22:42.441 }, 00:22:42.441 "memory_domains": [ 00:22:42.441 { 00:22:42.441 "dma_device_id": "system", 00:22:42.441 "dma_device_type": 1 00:22:42.441 }, 00:22:42.441 { 00:22:42.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.441 "dma_device_type": 2 00:22:42.441 } 00:22:42.441 ], 00:22:42.441 "driver_specific": {} 00:22:42.441 }' 00:22:42.441 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.441 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.441 06:38:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:42.441 06:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:42.699 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:42.957 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:42.957 "name": "BaseBdev3", 00:22:42.957 "aliases": [ 00:22:42.957 "44b9ece7-4f02-4cad-a0a6-13f769d521e3" 00:22:42.957 ], 00:22:42.957 "product_name": "Malloc disk", 00:22:42.957 "block_size": 512, 00:22:42.957 "num_blocks": 65536, 00:22:42.957 "uuid": "44b9ece7-4f02-4cad-a0a6-13f769d521e3", 00:22:42.957 "assigned_rate_limits": { 00:22:42.957 "rw_ios_per_sec": 0, 00:22:42.957 "rw_mbytes_per_sec": 0, 00:22:42.957 "r_mbytes_per_sec": 0, 00:22:42.957 "w_mbytes_per_sec": 0 00:22:42.957 }, 00:22:42.957 "claimed": true, 00:22:42.957 "claim_type": "exclusive_write", 00:22:42.958 "zoned": false, 00:22:42.958 "supported_io_types": { 00:22:42.958 "read": true, 00:22:42.958 "write": true, 00:22:42.958 "unmap": true, 00:22:42.958 "flush": true, 00:22:42.958 "reset": true, 00:22:42.958 "nvme_admin": false, 00:22:42.958 "nvme_io": false, 00:22:42.958 "nvme_io_md": false, 00:22:42.958 "write_zeroes": true, 00:22:42.958 "zcopy": true, 00:22:42.958 "get_zone_info": false, 00:22:42.958 "zone_management": false, 00:22:42.958 "zone_append": false, 00:22:42.958 "compare": false, 00:22:42.958 "compare_and_write": false, 00:22:42.958 "abort": true, 00:22:42.958 "seek_hole": false, 00:22:42.958 "seek_data": false, 00:22:42.958 "copy": true, 00:22:42.958 "nvme_iov_md": false 00:22:42.958 }, 00:22:42.958 "memory_domains": [ 00:22:42.958 { 00:22:42.958 "dma_device_id": "system", 00:22:42.958 "dma_device_type": 1 00:22:42.958 }, 00:22:42.958 { 00:22:42.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.958 "dma_device_type": 2 00:22:42.958 } 00:22:42.958 ], 00:22:42.958 "driver_specific": {} 00:22:42.958 }' 00:22:42.958 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:43.216 06:38:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:43.216 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.475 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.475 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:43.475 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:43.475 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:43.475 06:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:43.733 "name": "BaseBdev4", 00:22:43.733 "aliases": [ 00:22:43.733 "0dca4b94-cdf6-41d7-87aa-ededc455e2db" 00:22:43.733 ], 00:22:43.733 "product_name": "Malloc disk", 00:22:43.733 "block_size": 512, 00:22:43.733 "num_blocks": 65536, 00:22:43.733 "uuid": "0dca4b94-cdf6-41d7-87aa-ededc455e2db", 00:22:43.733 "assigned_rate_limits": { 00:22:43.733 "rw_ios_per_sec": 0, 00:22:43.733 "rw_mbytes_per_sec": 0, 00:22:43.733 "r_mbytes_per_sec": 0, 00:22:43.733 "w_mbytes_per_sec": 0 00:22:43.733 }, 00:22:43.733 "claimed": true, 00:22:43.733 "claim_type": "exclusive_write", 00:22:43.733 "zoned": false, 00:22:43.733 "supported_io_types": { 00:22:43.733 "read": true, 00:22:43.733 "write": true, 00:22:43.733 "unmap": true, 00:22:43.733 "flush": true, 00:22:43.733 "reset": true, 00:22:43.733 "nvme_admin": false, 00:22:43.733 "nvme_io": false, 00:22:43.733 "nvme_io_md": false, 00:22:43.733 "write_zeroes": true, 00:22:43.733 "zcopy": true, 00:22:43.733 "get_zone_info": false, 00:22:43.733 "zone_management": false, 00:22:43.733 "zone_append": false, 00:22:43.733 "compare": false, 00:22:43.733 "compare_and_write": false, 00:22:43.733 "abort": true, 00:22:43.733 "seek_hole": false, 00:22:43.733 "seek_data": false, 00:22:43.733 "copy": true, 00:22:43.733 "nvme_iov_md": false 00:22:43.733 }, 00:22:43.733 "memory_domains": [ 00:22:43.733 { 00:22:43.733 "dma_device_id": "system", 00:22:43.733 "dma_device_type": 1 00:22:43.733 }, 00:22:43.733 { 00:22:43.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.733 "dma_device_type": 2 00:22:43.733 } 00:22:43.733 ], 00:22:43.733 "driver_specific": {} 00:22:43.733 }' 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.733 06:38:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.733 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.992 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:43.992 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.992 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.992 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:43.992 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:44.250 [2024-07-25 06:38:57.596474] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:44.250 [2024-07-25 06:38:57.596501] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:44.250 [2024-07-25 06:38:57.596556] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:44.250 [2024-07-25 06:38:57.596613] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:44.250 [2024-07-25 06:38:57.596623] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b34710 name Existed_Raid, state offline 00:22:44.250 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1190489 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1190489 ']' 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1190489 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1190489 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1190489' 00:22:44.251 killing process with pid 1190489 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1190489 00:22:44.251 [2024-07-25 06:38:57.671550] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:44.251 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1190489 00:22:44.251 [2024-07-25 06:38:57.703622] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:44.510 06:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:44.510 00:22:44.510 real 0m30.656s 00:22:44.510 user 0m56.236s 00:22:44.510 sys 0m5.515s 00:22:44.510 06:38:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:22:44.510 06:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:44.510 ************************************ 00:22:44.510 END TEST raid_state_function_test_sb 00:22:44.510 ************************************ 00:22:44.510 06:38:57 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:22:44.510 06:38:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:22:44.510 06:38:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:44.510 06:38:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:44.510 ************************************ 00:22:44.510 START TEST raid_superblock_test 00:22:44.510 ************************************ 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 4 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1196360 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1196360 /var/tmp/spdk-raid.sock 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1196360 ']' 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:44.510 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:44.510 06:38:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:44.510 [2024-07-25 06:38:58.033400] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:22:44.510 [2024-07-25 06:38:58.033459] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1196360 ] 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.0 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.1 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.2 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.3 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.4 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.5 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.6 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:01.7 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.0 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.1 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.2 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.3 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.4 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.5 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.6 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3d:02.7 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.0 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.1 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.2 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: 
Requested device 0000:3f:01.3 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.4 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.5 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.6 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:01.7 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.0 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.1 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.2 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.3 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.4 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.5 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.6 cannot be used 00:22:44.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:22:44.769 EAL: Requested device 0000:3f:02.7 cannot be used 00:22:44.769 [2024-07-25 06:38:58.171592] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:44.769 [2024-07-25 06:38:58.216333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:44.769 [2024-07-25 06:38:58.274932] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:44.769 [2024-07-25 06:38:58.274967] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:45.705 06:38:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:45.964 malloc1 00:22:45.964 06:38:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:46.255 [2024-07-25 06:38:59.646315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:46.255 [2024-07-25 06:38:59.646358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.255 [2024-07-25 06:38:59.646378] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2484d70 00:22:46.255 [2024-07-25 06:38:59.646390] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.255 [2024-07-25 06:38:59.647906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.255 [2024-07-25 06:38:59.647932] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:46.255 pt1 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:46.255 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:46.513 malloc2 00:22:46.513 06:38:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:46.771 [2024-07-25 06:39:00.095900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:46.771 [2024-07-25 06:39:00.095947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.771 [2024-07-25 06:39:00.095964] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d3790 00:22:46.771 [2024-07-25 06:39:00.095976] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.771 [2024-07-25 06:39:00.097411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.771 [2024-07-25 06:39:00.097438] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:46.771 pt2 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:46.771 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:47.029 malloc3 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:47.029 [2024-07-25 06:39:00.561337] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:47.029 [2024-07-25 06:39:00.561379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.029 [2024-07-25 06:39:00.561396] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24788c0 00:22:47.029 [2024-07-25 06:39:00.561408] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.029 [2024-07-25 06:39:00.562731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.029 [2024-07-25 06:39:00.562757] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:47.029 pt3 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:47.029 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:47.287 malloc4 00:22:47.287 06:39:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:47.545 [2024-07-25 06:39:01.022915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:47.545 [2024-07-25 06:39:01.022960] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.545 [2024-07-25 06:39:01.022976] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x247b300 00:22:47.545 [2024-07-25 06:39:01.022988] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.545 [2024-07-25 06:39:01.024334] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.545 [2024-07-25 06:39:01.024360] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:47.545 pt4 00:22:47.545 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:22:47.545 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:22:47.545 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:47.803 [2024-07-25 06:39:01.247539] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:47.803 [2024-07-25 06:39:01.248705] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:47.803 [2024-07-25 06:39:01.248756] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:47.804 [2024-07-25 06:39:01.248796] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:47.804 [2024-07-25 06:39:01.248958] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22cb770 00:22:47.804 [2024-07-25 06:39:01.248968] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:47.804 [2024-07-25 06:39:01.249153] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24779f0 00:22:47.804 [2024-07-25 06:39:01.249290] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22cb770 00:22:47.804 [2024-07-25 06:39:01.249299] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22cb770 00:22:47.804 [2024-07-25 06:39:01.249385] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.804 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.062 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.062 "name": "raid_bdev1", 00:22:48.062 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:48.062 "strip_size_kb": 
64, 00:22:48.062 "state": "online", 00:22:48.062 "raid_level": "concat", 00:22:48.062 "superblock": true, 00:22:48.062 "num_base_bdevs": 4, 00:22:48.062 "num_base_bdevs_discovered": 4, 00:22:48.062 "num_base_bdevs_operational": 4, 00:22:48.062 "base_bdevs_list": [ 00:22:48.062 { 00:22:48.062 "name": "pt1", 00:22:48.062 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:48.062 "is_configured": true, 00:22:48.062 "data_offset": 2048, 00:22:48.062 "data_size": 63488 00:22:48.062 }, 00:22:48.062 { 00:22:48.062 "name": "pt2", 00:22:48.062 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:48.062 "is_configured": true, 00:22:48.062 "data_offset": 2048, 00:22:48.062 "data_size": 63488 00:22:48.062 }, 00:22:48.062 { 00:22:48.062 "name": "pt3", 00:22:48.062 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:48.062 "is_configured": true, 00:22:48.062 "data_offset": 2048, 00:22:48.062 "data_size": 63488 00:22:48.062 }, 00:22:48.062 { 00:22:48.062 "name": "pt4", 00:22:48.062 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:48.062 "is_configured": true, 00:22:48.062 "data_offset": 2048, 00:22:48.062 "data_size": 63488 00:22:48.062 } 00:22:48.062 ] 00:22:48.062 }' 00:22:48.062 06:39:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.062 06:39:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:48.628 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:48.886 [2024-07-25 06:39:02.274496] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:48.886 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:48.886 "name": "raid_bdev1", 00:22:48.886 "aliases": [ 00:22:48.886 "6e7d88ba-b214-4424-b249-aae761a61abd" 00:22:48.886 ], 00:22:48.886 "product_name": "Raid Volume", 00:22:48.886 "block_size": 512, 00:22:48.886 "num_blocks": 253952, 00:22:48.886 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:48.886 "assigned_rate_limits": { 00:22:48.886 "rw_ios_per_sec": 0, 00:22:48.886 "rw_mbytes_per_sec": 0, 00:22:48.886 "r_mbytes_per_sec": 0, 00:22:48.886 "w_mbytes_per_sec": 0 00:22:48.886 }, 00:22:48.886 "claimed": false, 00:22:48.886 "zoned": false, 00:22:48.886 "supported_io_types": { 00:22:48.886 "read": true, 00:22:48.886 "write": true, 00:22:48.886 "unmap": true, 00:22:48.886 "flush": true, 00:22:48.886 "reset": true, 00:22:48.886 "nvme_admin": false, 00:22:48.886 "nvme_io": false, 00:22:48.886 "nvme_io_md": false, 00:22:48.886 "write_zeroes": true, 00:22:48.886 "zcopy": false, 00:22:48.886 "get_zone_info": false, 00:22:48.886 "zone_management": false, 00:22:48.886 "zone_append": false, 
00:22:48.886 "compare": false, 00:22:48.886 "compare_and_write": false, 00:22:48.886 "abort": false, 00:22:48.886 "seek_hole": false, 00:22:48.886 "seek_data": false, 00:22:48.886 "copy": false, 00:22:48.886 "nvme_iov_md": false 00:22:48.886 }, 00:22:48.886 "memory_domains": [ 00:22:48.886 { 00:22:48.886 "dma_device_id": "system", 00:22:48.886 "dma_device_type": 1 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.886 "dma_device_type": 2 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "system", 00:22:48.886 "dma_device_type": 1 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.886 "dma_device_type": 2 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "system", 00:22:48.886 "dma_device_type": 1 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.886 "dma_device_type": 2 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "system", 00:22:48.886 "dma_device_type": 1 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.886 "dma_device_type": 2 00:22:48.886 } 00:22:48.886 ], 00:22:48.886 "driver_specific": { 00:22:48.886 "raid": { 00:22:48.886 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:48.886 "strip_size_kb": 64, 00:22:48.886 "state": "online", 00:22:48.886 "raid_level": "concat", 00:22:48.886 "superblock": true, 00:22:48.886 "num_base_bdevs": 4, 00:22:48.886 "num_base_bdevs_discovered": 4, 00:22:48.886 "num_base_bdevs_operational": 4, 00:22:48.886 "base_bdevs_list": [ 00:22:48.886 { 00:22:48.886 "name": "pt1", 00:22:48.886 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:48.886 "is_configured": true, 00:22:48.886 "data_offset": 2048, 00:22:48.886 "data_size": 63488 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "name": "pt2", 00:22:48.886 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:48.886 "is_configured": true, 00:22:48.886 "data_offset": 2048, 00:22:48.886 "data_size": 63488 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "name": "pt3", 00:22:48.886 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:48.886 "is_configured": true, 00:22:48.886 "data_offset": 2048, 00:22:48.886 "data_size": 63488 00:22:48.886 }, 00:22:48.886 { 00:22:48.886 "name": "pt4", 00:22:48.886 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:48.886 "is_configured": true, 00:22:48.886 "data_offset": 2048, 00:22:48.886 "data_size": 63488 00:22:48.886 } 00:22:48.886 ] 00:22:48.886 } 00:22:48.886 } 00:22:48.886 }' 00:22:48.886 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:48.886 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:48.886 pt2 00:22:48.886 pt3 00:22:48.886 pt4' 00:22:48.886 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:48.886 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:48.886 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:49.144 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:49.144 "name": "pt1", 00:22:49.144 "aliases": [ 00:22:49.144 "00000000-0000-0000-0000-000000000001" 00:22:49.144 ], 00:22:49.144 "product_name": "passthru", 00:22:49.144 
"block_size": 512, 00:22:49.144 "num_blocks": 65536, 00:22:49.144 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:49.144 "assigned_rate_limits": { 00:22:49.144 "rw_ios_per_sec": 0, 00:22:49.144 "rw_mbytes_per_sec": 0, 00:22:49.144 "r_mbytes_per_sec": 0, 00:22:49.144 "w_mbytes_per_sec": 0 00:22:49.144 }, 00:22:49.144 "claimed": true, 00:22:49.144 "claim_type": "exclusive_write", 00:22:49.144 "zoned": false, 00:22:49.144 "supported_io_types": { 00:22:49.144 "read": true, 00:22:49.144 "write": true, 00:22:49.144 "unmap": true, 00:22:49.144 "flush": true, 00:22:49.144 "reset": true, 00:22:49.144 "nvme_admin": false, 00:22:49.144 "nvme_io": false, 00:22:49.144 "nvme_io_md": false, 00:22:49.144 "write_zeroes": true, 00:22:49.144 "zcopy": true, 00:22:49.144 "get_zone_info": false, 00:22:49.144 "zone_management": false, 00:22:49.144 "zone_append": false, 00:22:49.144 "compare": false, 00:22:49.144 "compare_and_write": false, 00:22:49.144 "abort": true, 00:22:49.144 "seek_hole": false, 00:22:49.144 "seek_data": false, 00:22:49.144 "copy": true, 00:22:49.144 "nvme_iov_md": false 00:22:49.144 }, 00:22:49.144 "memory_domains": [ 00:22:49.144 { 00:22:49.144 "dma_device_id": "system", 00:22:49.144 "dma_device_type": 1 00:22:49.144 }, 00:22:49.144 { 00:22:49.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:49.144 "dma_device_type": 2 00:22:49.144 } 00:22:49.144 ], 00:22:49.144 "driver_specific": { 00:22:49.144 "passthru": { 00:22:49.144 "name": "pt1", 00:22:49.144 "base_bdev_name": "malloc1" 00:22:49.144 } 00:22:49.144 } 00:22:49.144 }' 00:22:49.144 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.144 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.144 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.144 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.144 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:49.402 06:39:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:49.660 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:49.660 "name": "pt2", 00:22:49.660 "aliases": [ 00:22:49.660 "00000000-0000-0000-0000-000000000002" 00:22:49.660 ], 00:22:49.660 "product_name": "passthru", 00:22:49.660 "block_size": 512, 00:22:49.660 "num_blocks": 65536, 00:22:49.660 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:49.660 
"assigned_rate_limits": { 00:22:49.660 "rw_ios_per_sec": 0, 00:22:49.660 "rw_mbytes_per_sec": 0, 00:22:49.660 "r_mbytes_per_sec": 0, 00:22:49.660 "w_mbytes_per_sec": 0 00:22:49.660 }, 00:22:49.660 "claimed": true, 00:22:49.660 "claim_type": "exclusive_write", 00:22:49.660 "zoned": false, 00:22:49.660 "supported_io_types": { 00:22:49.660 "read": true, 00:22:49.660 "write": true, 00:22:49.660 "unmap": true, 00:22:49.660 "flush": true, 00:22:49.660 "reset": true, 00:22:49.660 "nvme_admin": false, 00:22:49.660 "nvme_io": false, 00:22:49.660 "nvme_io_md": false, 00:22:49.660 "write_zeroes": true, 00:22:49.660 "zcopy": true, 00:22:49.660 "get_zone_info": false, 00:22:49.660 "zone_management": false, 00:22:49.660 "zone_append": false, 00:22:49.660 "compare": false, 00:22:49.660 "compare_and_write": false, 00:22:49.660 "abort": true, 00:22:49.660 "seek_hole": false, 00:22:49.660 "seek_data": false, 00:22:49.660 "copy": true, 00:22:49.660 "nvme_iov_md": false 00:22:49.660 }, 00:22:49.660 "memory_domains": [ 00:22:49.660 { 00:22:49.660 "dma_device_id": "system", 00:22:49.660 "dma_device_type": 1 00:22:49.660 }, 00:22:49.660 { 00:22:49.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:49.660 "dma_device_type": 2 00:22:49.660 } 00:22:49.660 ], 00:22:49.660 "driver_specific": { 00:22:49.660 "passthru": { 00:22:49.660 "name": "pt2", 00:22:49.660 "base_bdev_name": "malloc2" 00:22:49.660 } 00:22:49.660 } 00:22:49.660 }' 00:22:49.660 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.660 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.918 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:50.175 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:50.175 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:50.175 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:50.175 "name": "pt3", 00:22:50.175 "aliases": [ 00:22:50.175 "00000000-0000-0000-0000-000000000003" 00:22:50.175 ], 00:22:50.175 "product_name": "passthru", 00:22:50.175 "block_size": 512, 00:22:50.175 "num_blocks": 65536, 00:22:50.175 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:50.175 "assigned_rate_limits": { 00:22:50.175 "rw_ios_per_sec": 0, 00:22:50.175 "rw_mbytes_per_sec": 0, 00:22:50.175 "r_mbytes_per_sec": 0, 
00:22:50.175 "w_mbytes_per_sec": 0 00:22:50.175 }, 00:22:50.175 "claimed": true, 00:22:50.175 "claim_type": "exclusive_write", 00:22:50.175 "zoned": false, 00:22:50.175 "supported_io_types": { 00:22:50.175 "read": true, 00:22:50.175 "write": true, 00:22:50.175 "unmap": true, 00:22:50.175 "flush": true, 00:22:50.175 "reset": true, 00:22:50.175 "nvme_admin": false, 00:22:50.175 "nvme_io": false, 00:22:50.175 "nvme_io_md": false, 00:22:50.175 "write_zeroes": true, 00:22:50.175 "zcopy": true, 00:22:50.175 "get_zone_info": false, 00:22:50.175 "zone_management": false, 00:22:50.175 "zone_append": false, 00:22:50.175 "compare": false, 00:22:50.175 "compare_and_write": false, 00:22:50.175 "abort": true, 00:22:50.175 "seek_hole": false, 00:22:50.175 "seek_data": false, 00:22:50.175 "copy": true, 00:22:50.175 "nvme_iov_md": false 00:22:50.175 }, 00:22:50.175 "memory_domains": [ 00:22:50.175 { 00:22:50.175 "dma_device_id": "system", 00:22:50.175 "dma_device_type": 1 00:22:50.175 }, 00:22:50.175 { 00:22:50.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.175 "dma_device_type": 2 00:22:50.175 } 00:22:50.175 ], 00:22:50.175 "driver_specific": { 00:22:50.175 "passthru": { 00:22:50.175 "name": "pt3", 00:22:50.175 "base_bdev_name": "malloc3" 00:22:50.175 } 00:22:50.175 } 00:22:50.175 }' 00:22:50.175 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.175 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.433 06:39:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.691 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:50.691 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:50.691 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:50.691 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:50.691 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:50.691 "name": "pt4", 00:22:50.691 "aliases": [ 00:22:50.691 "00000000-0000-0000-0000-000000000004" 00:22:50.691 ], 00:22:50.691 "product_name": "passthru", 00:22:50.691 "block_size": 512, 00:22:50.691 "num_blocks": 65536, 00:22:50.691 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:50.691 "assigned_rate_limits": { 00:22:50.691 "rw_ios_per_sec": 0, 00:22:50.691 "rw_mbytes_per_sec": 0, 00:22:50.691 "r_mbytes_per_sec": 0, 00:22:50.691 "w_mbytes_per_sec": 0 00:22:50.691 }, 00:22:50.691 "claimed": true, 00:22:50.691 "claim_type": "exclusive_write", 
00:22:50.691 "zoned": false, 00:22:50.691 "supported_io_types": { 00:22:50.691 "read": true, 00:22:50.691 "write": true, 00:22:50.691 "unmap": true, 00:22:50.691 "flush": true, 00:22:50.691 "reset": true, 00:22:50.691 "nvme_admin": false, 00:22:50.691 "nvme_io": false, 00:22:50.691 "nvme_io_md": false, 00:22:50.691 "write_zeroes": true, 00:22:50.691 "zcopy": true, 00:22:50.691 "get_zone_info": false, 00:22:50.691 "zone_management": false, 00:22:50.691 "zone_append": false, 00:22:50.691 "compare": false, 00:22:50.691 "compare_and_write": false, 00:22:50.691 "abort": true, 00:22:50.691 "seek_hole": false, 00:22:50.691 "seek_data": false, 00:22:50.691 "copy": true, 00:22:50.691 "nvme_iov_md": false 00:22:50.691 }, 00:22:50.691 "memory_domains": [ 00:22:50.691 { 00:22:50.691 "dma_device_id": "system", 00:22:50.691 "dma_device_type": 1 00:22:50.691 }, 00:22:50.691 { 00:22:50.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.691 "dma_device_type": 2 00:22:50.691 } 00:22:50.691 ], 00:22:50.691 "driver_specific": { 00:22:50.691 "passthru": { 00:22:50.691 "name": "pt4", 00:22:50.691 "base_bdev_name": "malloc4" 00:22:50.691 } 00:22:50.691 } 00:22:50.691 }' 00:22:50.691 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:50.949 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:51.206 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:51.206 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:51.206 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:51.206 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:22:51.464 [2024-07-25 06:39:04.781067] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:51.464 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=6e7d88ba-b214-4424-b249-aae761a61abd 00:22:51.464 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 6e7d88ba-b214-4424-b249-aae761a61abd ']' 00:22:51.464 06:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:51.464 [2024-07-25 06:39:05.009396] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:51.464 [2024-07-25 06:39:05.009418] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:51.464 [2024-07-25 06:39:05.009466] 
bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:51.464 [2024-07-25 06:39:05.009525] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:51.464 [2024-07-25 06:39:05.009535] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22cb770 name raid_bdev1, state offline 00:22:51.723 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.723 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:22:51.723 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:22:51.723 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:22:51.723 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:22:51.723 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:51.981 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:22:51.981 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:52.240 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:22:52.240 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:52.498 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:22:52.498 06:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:52.757 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:52.757 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:53.015 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:22:53.015 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:53.015 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:53.016 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:53.274 [2024-07-25 06:39:06.593485] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:53.274 [2024-07-25 06:39:06.594715] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:53.274 [2024-07-25 06:39:06.594754] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:53.274 [2024-07-25 06:39:06.594785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:53.274 [2024-07-25 06:39:06.594825] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:53.274 [2024-07-25 06:39:06.594860] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:53.274 [2024-07-25 06:39:06.594881] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:53.274 [2024-07-25 06:39:06.594901] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:53.274 [2024-07-25 06:39:06.594917] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:53.274 [2024-07-25 06:39:06.594926] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x247ad00 name raid_bdev1, state configuring 00:22:53.274 request: 00:22:53.274 { 00:22:53.274 "name": "raid_bdev1", 00:22:53.274 "raid_level": "concat", 00:22:53.274 "base_bdevs": [ 00:22:53.274 "malloc1", 00:22:53.274 "malloc2", 00:22:53.274 "malloc3", 00:22:53.274 "malloc4" 00:22:53.274 ], 00:22:53.274 "strip_size_kb": 64, 00:22:53.274 "superblock": false, 00:22:53.274 "method": "bdev_raid_create", 00:22:53.274 "req_id": 1 00:22:53.274 } 00:22:53.274 Got JSON-RPC error response 00:22:53.274 response: 00:22:53.274 { 00:22:53.274 "code": -17, 00:22:53.274 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:53.274 } 00:22:53.274 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:22:53.275 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:22:53.275 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:22:53.275 06:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:22:53.275 06:39:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.275 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:22:53.534 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:22:53.534 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:22:53.534 06:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:53.534 [2024-07-25 06:39:07.050627] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:53.534 [2024-07-25 06:39:07.050667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.534 [2024-07-25 06:39:07.050685] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2475f60 00:22:53.534 [2024-07-25 06:39:07.050697] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.534 [2024-07-25 06:39:07.052125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.534 [2024-07-25 06:39:07.052157] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:53.534 [2024-07-25 06:39:07.052215] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:53.534 [2024-07-25 06:39:07.052238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:53.534 pt1 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.534 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.792 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.792 "name": "raid_bdev1", 00:22:53.792 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:53.792 "strip_size_kb": 64, 00:22:53.792 "state": "configuring", 00:22:53.792 "raid_level": "concat", 00:22:53.792 "superblock": true, 00:22:53.792 "num_base_bdevs": 4, 00:22:53.792 "num_base_bdevs_discovered": 1, 00:22:53.792 
"num_base_bdevs_operational": 4, 00:22:53.792 "base_bdevs_list": [ 00:22:53.792 { 00:22:53.792 "name": "pt1", 00:22:53.792 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:53.792 "is_configured": true, 00:22:53.792 "data_offset": 2048, 00:22:53.792 "data_size": 63488 00:22:53.792 }, 00:22:53.792 { 00:22:53.792 "name": null, 00:22:53.792 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:53.792 "is_configured": false, 00:22:53.792 "data_offset": 2048, 00:22:53.792 "data_size": 63488 00:22:53.792 }, 00:22:53.792 { 00:22:53.792 "name": null, 00:22:53.792 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:53.792 "is_configured": false, 00:22:53.792 "data_offset": 2048, 00:22:53.792 "data_size": 63488 00:22:53.792 }, 00:22:53.792 { 00:22:53.792 "name": null, 00:22:53.792 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:53.792 "is_configured": false, 00:22:53.792 "data_offset": 2048, 00:22:53.792 "data_size": 63488 00:22:53.792 } 00:22:53.792 ] 00:22:53.792 }' 00:22:53.792 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.792 06:39:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:54.359 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:22:54.359 06:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:54.618 [2024-07-25 06:39:08.069316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:54.618 [2024-07-25 06:39:08.069360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.618 [2024-07-25 06:39:08.069378] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22cc0a0 00:22:54.618 [2024-07-25 06:39:08.069390] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.618 [2024-07-25 06:39:08.069686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.618 [2024-07-25 06:39:08.069702] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:54.618 [2024-07-25 06:39:08.069756] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:54.618 [2024-07-25 06:39:08.069773] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:54.618 pt2 00:22:54.618 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:54.877 [2024-07-25 06:39:08.293919] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:54.877 06:39:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.877 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.135 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.135 "name": "raid_bdev1", 00:22:55.135 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:55.135 "strip_size_kb": 64, 00:22:55.135 "state": "configuring", 00:22:55.135 "raid_level": "concat", 00:22:55.135 "superblock": true, 00:22:55.135 "num_base_bdevs": 4, 00:22:55.135 "num_base_bdevs_discovered": 1, 00:22:55.135 "num_base_bdevs_operational": 4, 00:22:55.135 "base_bdevs_list": [ 00:22:55.135 { 00:22:55.135 "name": "pt1", 00:22:55.135 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:55.135 "is_configured": true, 00:22:55.135 "data_offset": 2048, 00:22:55.135 "data_size": 63488 00:22:55.135 }, 00:22:55.135 { 00:22:55.135 "name": null, 00:22:55.135 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:55.135 "is_configured": false, 00:22:55.135 "data_offset": 2048, 00:22:55.135 "data_size": 63488 00:22:55.135 }, 00:22:55.135 { 00:22:55.135 "name": null, 00:22:55.135 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:55.135 "is_configured": false, 00:22:55.135 "data_offset": 2048, 00:22:55.135 "data_size": 63488 00:22:55.135 }, 00:22:55.135 { 00:22:55.135 "name": null, 00:22:55.135 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:55.135 "is_configured": false, 00:22:55.135 "data_offset": 2048, 00:22:55.135 "data_size": 63488 00:22:55.135 } 00:22:55.135 ] 00:22:55.135 }' 00:22:55.135 06:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.135 06:39:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:55.702 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:22:55.702 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:22:55.702 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:55.960 [2024-07-25 06:39:09.320618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:55.960 [2024-07-25 06:39:09.320663] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.960 [2024-07-25 06:39:09.320681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24846b0 00:22:55.960 [2024-07-25 06:39:09.320693] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.960 [2024-07-25 06:39:09.320990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.960 [2024-07-25 06:39:09.321006] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:55.960 [2024-07-25 06:39:09.321060] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:55.960 [2024-07-25 06:39:09.321076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:55.960 pt2 00:22:55.960 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:22:55.960 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:22:55.960 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:56.219 [2024-07-25 06:39:09.549217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:56.219 [2024-07-25 06:39:09.549249] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:56.219 [2024-07-25 06:39:09.549263] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22cb450 00:22:56.219 [2024-07-25 06:39:09.549274] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:56.219 [2024-07-25 06:39:09.549525] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:56.219 [2024-07-25 06:39:09.549540] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:56.219 [2024-07-25 06:39:09.549585] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:56.219 [2024-07-25 06:39:09.549600] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:56.219 pt3 00:22:56.219 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:22:56.219 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:22:56.219 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:56.478 [2024-07-25 06:39:09.777814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:56.478 [2024-07-25 06:39:09.777849] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:56.478 [2024-07-25 06:39:09.777863] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c9080 00:22:56.478 [2024-07-25 06:39:09.777874] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:56.478 [2024-07-25 06:39:09.778125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:56.478 [2024-07-25 06:39:09.778146] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:56.478 [2024-07-25 06:39:09.778192] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:56.478 [2024-07-25 06:39:09.778207] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:56.478 [2024-07-25 06:39:09.778310] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2479950 00:22:56.478 [2024-07-25 06:39:09.778319] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:56.478 [2024-07-25 06:39:09.778469] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d2760 00:22:56.478 [2024-07-25 06:39:09.778582] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x2479950 00:22:56.478 [2024-07-25 06:39:09.778591] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2479950 00:22:56.478 [2024-07-25 06:39:09.778672] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.478 pt4 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.478 06:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.737 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.737 "name": "raid_bdev1", 00:22:56.737 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:56.737 "strip_size_kb": 64, 00:22:56.737 "state": "online", 00:22:56.737 "raid_level": "concat", 00:22:56.737 "superblock": true, 00:22:56.737 "num_base_bdevs": 4, 00:22:56.737 "num_base_bdevs_discovered": 4, 00:22:56.737 "num_base_bdevs_operational": 4, 00:22:56.737 "base_bdevs_list": [ 00:22:56.737 { 00:22:56.737 "name": "pt1", 00:22:56.737 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:56.737 "is_configured": true, 00:22:56.737 "data_offset": 2048, 00:22:56.737 "data_size": 63488 00:22:56.737 }, 00:22:56.737 { 00:22:56.737 "name": "pt2", 00:22:56.737 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:56.737 "is_configured": true, 00:22:56.737 "data_offset": 2048, 00:22:56.737 "data_size": 63488 00:22:56.737 }, 00:22:56.737 { 00:22:56.737 "name": "pt3", 00:22:56.737 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:56.737 "is_configured": true, 00:22:56.737 "data_offset": 2048, 00:22:56.737 "data_size": 63488 00:22:56.737 }, 00:22:56.737 { 00:22:56.737 "name": "pt4", 00:22:56.737 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:56.737 "is_configured": true, 00:22:56.737 "data_offset": 2048, 00:22:56.737 "data_size": 63488 00:22:56.737 } 00:22:56.737 ] 00:22:56.737 }' 00:22:56.737 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.737 06:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:57.303 06:39:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:57.303 [2024-07-25 06:39:10.832884] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:57.303 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:57.303 "name": "raid_bdev1", 00:22:57.303 "aliases": [ 00:22:57.304 "6e7d88ba-b214-4424-b249-aae761a61abd" 00:22:57.304 ], 00:22:57.304 "product_name": "Raid Volume", 00:22:57.304 "block_size": 512, 00:22:57.304 "num_blocks": 253952, 00:22:57.304 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:57.304 "assigned_rate_limits": { 00:22:57.304 "rw_ios_per_sec": 0, 00:22:57.304 "rw_mbytes_per_sec": 0, 00:22:57.304 "r_mbytes_per_sec": 0, 00:22:57.304 "w_mbytes_per_sec": 0 00:22:57.304 }, 00:22:57.304 "claimed": false, 00:22:57.304 "zoned": false, 00:22:57.304 "supported_io_types": { 00:22:57.304 "read": true, 00:22:57.304 "write": true, 00:22:57.304 "unmap": true, 00:22:57.304 "flush": true, 00:22:57.304 "reset": true, 00:22:57.304 "nvme_admin": false, 00:22:57.304 "nvme_io": false, 00:22:57.304 "nvme_io_md": false, 00:22:57.304 "write_zeroes": true, 00:22:57.304 "zcopy": false, 00:22:57.304 "get_zone_info": false, 00:22:57.304 "zone_management": false, 00:22:57.304 "zone_append": false, 00:22:57.304 "compare": false, 00:22:57.304 "compare_and_write": false, 00:22:57.304 "abort": false, 00:22:57.304 "seek_hole": false, 00:22:57.304 "seek_data": false, 00:22:57.304 "copy": false, 00:22:57.304 "nvme_iov_md": false 00:22:57.304 }, 00:22:57.304 "memory_domains": [ 00:22:57.304 { 00:22:57.304 "dma_device_id": "system", 00:22:57.304 "dma_device_type": 1 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.304 "dma_device_type": 2 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "system", 00:22:57.304 "dma_device_type": 1 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.304 "dma_device_type": 2 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "system", 00:22:57.304 "dma_device_type": 1 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.304 "dma_device_type": 2 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "system", 00:22:57.304 "dma_device_type": 1 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.304 "dma_device_type": 2 00:22:57.304 } 00:22:57.304 ], 00:22:57.304 "driver_specific": { 00:22:57.304 "raid": { 00:22:57.304 "uuid": "6e7d88ba-b214-4424-b249-aae761a61abd", 00:22:57.304 "strip_size_kb": 64, 00:22:57.304 "state": "online", 00:22:57.304 "raid_level": 
"concat", 00:22:57.304 "superblock": true, 00:22:57.304 "num_base_bdevs": 4, 00:22:57.304 "num_base_bdevs_discovered": 4, 00:22:57.304 "num_base_bdevs_operational": 4, 00:22:57.304 "base_bdevs_list": [ 00:22:57.304 { 00:22:57.304 "name": "pt1", 00:22:57.304 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:57.304 "is_configured": true, 00:22:57.304 "data_offset": 2048, 00:22:57.304 "data_size": 63488 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "name": "pt2", 00:22:57.304 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:57.304 "is_configured": true, 00:22:57.304 "data_offset": 2048, 00:22:57.304 "data_size": 63488 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "name": "pt3", 00:22:57.304 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:57.304 "is_configured": true, 00:22:57.304 "data_offset": 2048, 00:22:57.304 "data_size": 63488 00:22:57.304 }, 00:22:57.304 { 00:22:57.304 "name": "pt4", 00:22:57.304 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:57.304 "is_configured": true, 00:22:57.304 "data_offset": 2048, 00:22:57.304 "data_size": 63488 00:22:57.304 } 00:22:57.304 ] 00:22:57.304 } 00:22:57.304 } 00:22:57.304 }' 00:22:57.304 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:57.563 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:57.563 pt2 00:22:57.563 pt3 00:22:57.563 pt4' 00:22:57.563 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:57.563 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:57.563 06:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:57.821 "name": "pt1", 00:22:57.821 "aliases": [ 00:22:57.821 "00000000-0000-0000-0000-000000000001" 00:22:57.821 ], 00:22:57.821 "product_name": "passthru", 00:22:57.821 "block_size": 512, 00:22:57.821 "num_blocks": 65536, 00:22:57.821 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:57.821 "assigned_rate_limits": { 00:22:57.821 "rw_ios_per_sec": 0, 00:22:57.821 "rw_mbytes_per_sec": 0, 00:22:57.821 "r_mbytes_per_sec": 0, 00:22:57.821 "w_mbytes_per_sec": 0 00:22:57.821 }, 00:22:57.821 "claimed": true, 00:22:57.821 "claim_type": "exclusive_write", 00:22:57.821 "zoned": false, 00:22:57.821 "supported_io_types": { 00:22:57.821 "read": true, 00:22:57.821 "write": true, 00:22:57.821 "unmap": true, 00:22:57.821 "flush": true, 00:22:57.821 "reset": true, 00:22:57.821 "nvme_admin": false, 00:22:57.821 "nvme_io": false, 00:22:57.821 "nvme_io_md": false, 00:22:57.821 "write_zeroes": true, 00:22:57.821 "zcopy": true, 00:22:57.821 "get_zone_info": false, 00:22:57.821 "zone_management": false, 00:22:57.821 "zone_append": false, 00:22:57.821 "compare": false, 00:22:57.821 "compare_and_write": false, 00:22:57.821 "abort": true, 00:22:57.821 "seek_hole": false, 00:22:57.821 "seek_data": false, 00:22:57.821 "copy": true, 00:22:57.821 "nvme_iov_md": false 00:22:57.821 }, 00:22:57.821 "memory_domains": [ 00:22:57.821 { 00:22:57.821 "dma_device_id": "system", 00:22:57.821 "dma_device_type": 1 00:22:57.821 }, 00:22:57.821 { 00:22:57.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.821 "dma_device_type": 2 00:22:57.821 } 00:22:57.821 ], 00:22:57.821 
"driver_specific": { 00:22:57.821 "passthru": { 00:22:57.821 "name": "pt1", 00:22:57.821 "base_bdev_name": "malloc1" 00:22:57.821 } 00:22:57.821 } 00:22:57.821 }' 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.821 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:58.080 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:58.338 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:58.339 "name": "pt2", 00:22:58.339 "aliases": [ 00:22:58.339 "00000000-0000-0000-0000-000000000002" 00:22:58.339 ], 00:22:58.339 "product_name": "passthru", 00:22:58.339 "block_size": 512, 00:22:58.339 "num_blocks": 65536, 00:22:58.339 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:58.339 "assigned_rate_limits": { 00:22:58.339 "rw_ios_per_sec": 0, 00:22:58.339 "rw_mbytes_per_sec": 0, 00:22:58.339 "r_mbytes_per_sec": 0, 00:22:58.339 "w_mbytes_per_sec": 0 00:22:58.339 }, 00:22:58.339 "claimed": true, 00:22:58.339 "claim_type": "exclusive_write", 00:22:58.339 "zoned": false, 00:22:58.339 "supported_io_types": { 00:22:58.339 "read": true, 00:22:58.339 "write": true, 00:22:58.339 "unmap": true, 00:22:58.339 "flush": true, 00:22:58.339 "reset": true, 00:22:58.339 "nvme_admin": false, 00:22:58.339 "nvme_io": false, 00:22:58.339 "nvme_io_md": false, 00:22:58.339 "write_zeroes": true, 00:22:58.339 "zcopy": true, 00:22:58.339 "get_zone_info": false, 00:22:58.339 "zone_management": false, 00:22:58.339 "zone_append": false, 00:22:58.339 "compare": false, 00:22:58.339 "compare_and_write": false, 00:22:58.339 "abort": true, 00:22:58.339 "seek_hole": false, 00:22:58.339 "seek_data": false, 00:22:58.339 "copy": true, 00:22:58.339 "nvme_iov_md": false 00:22:58.339 }, 00:22:58.339 "memory_domains": [ 00:22:58.339 { 00:22:58.339 "dma_device_id": "system", 00:22:58.339 "dma_device_type": 1 00:22:58.339 }, 00:22:58.339 { 00:22:58.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:58.339 "dma_device_type": 2 00:22:58.339 } 00:22:58.339 ], 00:22:58.339 "driver_specific": { 00:22:58.339 "passthru": { 00:22:58.339 "name": "pt2", 00:22:58.339 "base_bdev_name": "malloc2" 00:22:58.339 } 
00:22:58.339 } 00:22:58.339 }' 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:58.339 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:58.639 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:58.639 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:58.639 06:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.639 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.639 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:58.639 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:58.639 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:58.639 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:58.911 "name": "pt3", 00:22:58.911 "aliases": [ 00:22:58.911 "00000000-0000-0000-0000-000000000003" 00:22:58.911 ], 00:22:58.911 "product_name": "passthru", 00:22:58.911 "block_size": 512, 00:22:58.911 "num_blocks": 65536, 00:22:58.911 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:58.911 "assigned_rate_limits": { 00:22:58.911 "rw_ios_per_sec": 0, 00:22:58.911 "rw_mbytes_per_sec": 0, 00:22:58.911 "r_mbytes_per_sec": 0, 00:22:58.911 "w_mbytes_per_sec": 0 00:22:58.911 }, 00:22:58.911 "claimed": true, 00:22:58.911 "claim_type": "exclusive_write", 00:22:58.911 "zoned": false, 00:22:58.911 "supported_io_types": { 00:22:58.911 "read": true, 00:22:58.911 "write": true, 00:22:58.911 "unmap": true, 00:22:58.911 "flush": true, 00:22:58.911 "reset": true, 00:22:58.911 "nvme_admin": false, 00:22:58.911 "nvme_io": false, 00:22:58.911 "nvme_io_md": false, 00:22:58.911 "write_zeroes": true, 00:22:58.911 "zcopy": true, 00:22:58.911 "get_zone_info": false, 00:22:58.911 "zone_management": false, 00:22:58.911 "zone_append": false, 00:22:58.911 "compare": false, 00:22:58.911 "compare_and_write": false, 00:22:58.911 "abort": true, 00:22:58.911 "seek_hole": false, 00:22:58.911 "seek_data": false, 00:22:58.911 "copy": true, 00:22:58.911 "nvme_iov_md": false 00:22:58.911 }, 00:22:58.911 "memory_domains": [ 00:22:58.911 { 00:22:58.911 "dma_device_id": "system", 00:22:58.911 "dma_device_type": 1 00:22:58.911 }, 00:22:58.911 { 00:22:58.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:58.911 "dma_device_type": 2 00:22:58.911 } 00:22:58.911 ], 00:22:58.911 "driver_specific": { 00:22:58.911 "passthru": { 00:22:58.911 "name": "pt3", 00:22:58.911 "base_bdev_name": "malloc3" 00:22:58.911 } 00:22:58.911 } 00:22:58.911 }' 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:58.911 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.170 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.170 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:59.170 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.170 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.170 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:59.171 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:59.171 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:59.171 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:59.429 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:59.429 "name": "pt4", 00:22:59.429 "aliases": [ 00:22:59.429 "00000000-0000-0000-0000-000000000004" 00:22:59.429 ], 00:22:59.429 "product_name": "passthru", 00:22:59.429 "block_size": 512, 00:22:59.429 "num_blocks": 65536, 00:22:59.429 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:59.429 "assigned_rate_limits": { 00:22:59.429 "rw_ios_per_sec": 0, 00:22:59.429 "rw_mbytes_per_sec": 0, 00:22:59.429 "r_mbytes_per_sec": 0, 00:22:59.429 "w_mbytes_per_sec": 0 00:22:59.429 }, 00:22:59.429 "claimed": true, 00:22:59.429 "claim_type": "exclusive_write", 00:22:59.429 "zoned": false, 00:22:59.429 "supported_io_types": { 00:22:59.429 "read": true, 00:22:59.429 "write": true, 00:22:59.429 "unmap": true, 00:22:59.429 "flush": true, 00:22:59.429 "reset": true, 00:22:59.429 "nvme_admin": false, 00:22:59.429 "nvme_io": false, 00:22:59.429 "nvme_io_md": false, 00:22:59.429 "write_zeroes": true, 00:22:59.429 "zcopy": true, 00:22:59.429 "get_zone_info": false, 00:22:59.429 "zone_management": false, 00:22:59.429 "zone_append": false, 00:22:59.429 "compare": false, 00:22:59.429 "compare_and_write": false, 00:22:59.429 "abort": true, 00:22:59.429 "seek_hole": false, 00:22:59.429 "seek_data": false, 00:22:59.429 "copy": true, 00:22:59.429 "nvme_iov_md": false 00:22:59.429 }, 00:22:59.429 "memory_domains": [ 00:22:59.429 { 00:22:59.429 "dma_device_id": "system", 00:22:59.429 "dma_device_type": 1 00:22:59.429 }, 00:22:59.429 { 00:22:59.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.429 "dma_device_type": 2 00:22:59.429 } 00:22:59.429 ], 00:22:59.429 "driver_specific": { 00:22:59.429 "passthru": { 00:22:59.429 "name": "pt4", 00:22:59.429 "base_bdev_name": "malloc4" 00:22:59.429 } 00:22:59.429 } 00:22:59.429 }' 00:22:59.429 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.429 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.429 06:39:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:59.429 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:59.688 06:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:59.688 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:22:59.947 [2024-07-25 06:39:13.419704] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 6e7d88ba-b214-4424-b249-aae761a61abd '!=' 6e7d88ba-b214-4424-b249-aae761a61abd ']' 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1196360 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1196360 ']' 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1196360 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1196360 00:22:59.947 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:59.948 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:59.948 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1196360' 00:22:59.948 killing process with pid 1196360 00:22:59.948 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1196360 00:22:59.948 [2024-07-25 06:39:13.502890] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:59.948 [2024-07-25 06:39:13.502951] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:59.948 [2024-07-25 06:39:13.503008] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:59.948 [2024-07-25 06:39:13.503018] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2479950 name raid_bdev1, state offline 
00:22:59.948 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1196360 00:23:00.206 [2024-07-25 06:39:13.535449] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:00.206 06:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:23:00.206 00:23:00.206 real 0m15.741s 00:23:00.206 user 0m28.269s 00:23:00.206 sys 0m3.007s 00:23:00.206 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:00.206 06:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:00.206 ************************************ 00:23:00.206 END TEST raid_superblock_test 00:23:00.206 ************************************ 00:23:00.206 06:39:13 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:23:00.206 06:39:13 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:00.206 06:39:13 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:00.206 06:39:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:00.464 ************************************ 00:23:00.464 START TEST raid_read_error_test 00:23:00.464 ************************************ 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 read 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local 
strip_size 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.hzS0Mg3HCm 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1199895 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1199895 /var/tmp/spdk-raid.sock 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1199895 ']' 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:00.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:00.464 06:39:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:00.464 [2024-07-25 06:39:13.877863] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:23:00.464 [2024-07-25 06:39:13.877918] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1199895 ] 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.0 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.1 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.2 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.3 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.4 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.5 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.6 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:01.7 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.0 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.1 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.2 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.3 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.4 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.5 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.6 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3d:02.7 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.0 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:00.464 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:00.464 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:00.464 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:00.464 [2024-07-25 06:39:14.000069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:00.722 [2024-07-25 06:39:14.045458] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:00.722 [2024-07-25 06:39:14.103539] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:00.722 [2024-07-25 06:39:14.103575] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:01.287 06:39:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:01.287 06:39:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:23:01.287 06:39:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:01.287 06:39:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:01.545 BaseBdev1_malloc 00:23:01.545 06:39:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:01.802 true 00:23:01.802 06:39:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:02.059 [2024-07-25 06:39:15.442785] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:02.059 [2024-07-25 06:39:15.442825] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:02.059 [2024-07-25 06:39:15.442843] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e59a60 00:23:02.059 [2024-07-25 06:39:15.442855] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:02.059 [2024-07-25 06:39:15.444330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:02.059 [2024-07-25 06:39:15.444358] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:02.059 BaseBdev1 00:23:02.059 06:39:15 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:02.059 06:39:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:02.317 BaseBdev2_malloc 00:23:02.317 06:39:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:02.574 true 00:23:02.574 06:39:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:02.574 [2024-07-25 06:39:16.124966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:02.574 [2024-07-25 06:39:16.125007] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:02.574 [2024-07-25 06:39:16.125028] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e5edc0 00:23:02.574 [2024-07-25 06:39:16.125040] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:02.574 [2024-07-25 06:39:16.126408] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:02.574 [2024-07-25 06:39:16.126435] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:02.574 BaseBdev2 00:23:02.832 06:39:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:02.832 06:39:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:02.832 BaseBdev3_malloc 00:23:02.832 06:39:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:03.089 true 00:23:03.089 06:39:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:03.346 [2024-07-25 06:39:16.790954] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:03.346 [2024-07-25 06:39:16.790993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.346 [2024-07-25 06:39:16.791011] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e5f420 00:23:03.346 [2024-07-25 06:39:16.791023] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.346 [2024-07-25 06:39:16.792413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.346 [2024-07-25 06:39:16.792439] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:03.346 BaseBdev3 00:23:03.346 06:39:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:03.346 06:39:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:03.604 BaseBdev4_malloc 00:23:03.604 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:03.861 true 00:23:03.861 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:04.119 [2024-07-25 06:39:17.477202] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:04.119 [2024-07-25 06:39:17.477242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.119 [2024-07-25 06:39:17.477261] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e629b0 00:23:04.119 [2024-07-25 06:39:17.477273] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.119 [2024-07-25 06:39:17.478634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.119 [2024-07-25 06:39:17.478661] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:04.119 BaseBdev4 00:23:04.119 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:04.377 [2024-07-25 06:39:17.701826] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:04.377 [2024-07-25 06:39:17.702978] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:04.377 [2024-07-25 06:39:17.703041] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:04.377 [2024-07-25 06:39:17.703096] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:04.377 [2024-07-25 06:39:17.703319] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e62ec0 00:23:04.377 [2024-07-25 06:39:17.703330] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:04.377 [2024-07-25 06:39:17.703502] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb54b0 00:23:04.377 [2024-07-25 06:39:17.703637] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e62ec0 00:23:04.377 [2024-07-25 06:39:17.703646] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e62ec0 00:23:04.377 [2024-07-25 06:39:17.703738] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.377 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.634 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.634 "name": "raid_bdev1", 00:23:04.634 "uuid": "cbed8592-f17b-444c-b028-73fca20de07f", 00:23:04.634 "strip_size_kb": 64, 00:23:04.634 "state": "online", 00:23:04.634 "raid_level": "concat", 00:23:04.634 "superblock": true, 00:23:04.634 "num_base_bdevs": 4, 00:23:04.634 "num_base_bdevs_discovered": 4, 00:23:04.634 "num_base_bdevs_operational": 4, 00:23:04.634 "base_bdevs_list": [ 00:23:04.634 { 00:23:04.634 "name": "BaseBdev1", 00:23:04.634 "uuid": "2ba3a2b5-3a6d-5e87-b170-725d3fcf0848", 00:23:04.634 "is_configured": true, 00:23:04.634 "data_offset": 2048, 00:23:04.634 "data_size": 63488 00:23:04.634 }, 00:23:04.634 { 00:23:04.634 "name": "BaseBdev2", 00:23:04.634 "uuid": "9cfaa9bd-acdb-5f96-b42e-193fec2b4faa", 00:23:04.634 "is_configured": true, 00:23:04.635 "data_offset": 2048, 00:23:04.635 "data_size": 63488 00:23:04.635 }, 00:23:04.635 { 00:23:04.635 "name": "BaseBdev3", 00:23:04.635 "uuid": "f97e4112-9f72-5f43-b104-e1bb4b4dfb17", 00:23:04.635 "is_configured": true, 00:23:04.635 "data_offset": 2048, 00:23:04.635 "data_size": 63488 00:23:04.635 }, 00:23:04.635 { 00:23:04.635 "name": "BaseBdev4", 00:23:04.635 "uuid": "1dec3c3b-499f-5118-993f-2ff130c5912f", 00:23:04.635 "is_configured": true, 00:23:04.635 "data_offset": 2048, 00:23:04.635 "data_size": 63488 00:23:04.635 } 00:23:04.635 ] 00:23:04.635 }' 00:23:04.635 06:39:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.635 06:39:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:05.200 06:39:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:23:05.200 06:39:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:05.200 [2024-07-25 06:39:18.616454] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e66590 00:23:06.134 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.392 06:39:19 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.392 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.650 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.650 "name": "raid_bdev1", 00:23:06.650 "uuid": "cbed8592-f17b-444c-b028-73fca20de07f", 00:23:06.650 "strip_size_kb": 64, 00:23:06.650 "state": "online", 00:23:06.650 "raid_level": "concat", 00:23:06.650 "superblock": true, 00:23:06.650 "num_base_bdevs": 4, 00:23:06.650 "num_base_bdevs_discovered": 4, 00:23:06.650 "num_base_bdevs_operational": 4, 00:23:06.650 "base_bdevs_list": [ 00:23:06.650 { 00:23:06.650 "name": "BaseBdev1", 00:23:06.650 "uuid": "2ba3a2b5-3a6d-5e87-b170-725d3fcf0848", 00:23:06.650 "is_configured": true, 00:23:06.650 "data_offset": 2048, 00:23:06.650 "data_size": 63488 00:23:06.650 }, 00:23:06.650 { 00:23:06.650 "name": "BaseBdev2", 00:23:06.650 "uuid": "9cfaa9bd-acdb-5f96-b42e-193fec2b4faa", 00:23:06.650 "is_configured": true, 00:23:06.650 "data_offset": 2048, 00:23:06.650 "data_size": 63488 00:23:06.650 }, 00:23:06.650 { 00:23:06.650 "name": "BaseBdev3", 00:23:06.650 "uuid": "f97e4112-9f72-5f43-b104-e1bb4b4dfb17", 00:23:06.650 "is_configured": true, 00:23:06.650 "data_offset": 2048, 00:23:06.650 "data_size": 63488 00:23:06.650 }, 00:23:06.650 { 00:23:06.650 "name": "BaseBdev4", 00:23:06.650 "uuid": "1dec3c3b-499f-5118-993f-2ff130c5912f", 00:23:06.650 "is_configured": true, 00:23:06.650 "data_offset": 2048, 00:23:06.650 "data_size": 63488 00:23:06.650 } 00:23:06.650 ] 00:23:06.650 }' 00:23:06.650 06:39:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.650 06:39:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.216 06:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:07.475 [2024-07-25 06:39:20.787390] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:07.475 [2024-07-25 06:39:20.787421] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:07.475 [2024-07-25 06:39:20.790322] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:07.475 [2024-07-25 06:39:20.790359] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:07.475 [2024-07-25 06:39:20.790394] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:07.475 [2024-07-25 06:39:20.790404] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e62ec0 name raid_bdev1, state offline 00:23:07.475 0 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1199895 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1199895 ']' 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1199895 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1199895 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1199895' 00:23:07.475 killing process with pid 1199895 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1199895 00:23:07.475 [2024-07-25 06:39:20.862890] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:07.475 06:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1199895 00:23:07.475 [2024-07-25 06:39:20.889638] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.hzS0Mg3HCm 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:23:07.734 00:23:07.734 real 0m7.282s 00:23:07.734 user 0m11.595s 00:23:07.734 sys 0m1.310s 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:07.734 06:39:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.734 ************************************ 00:23:07.734 END TEST raid_read_error_test 00:23:07.734 ************************************ 00:23:07.734 06:39:21 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:23:07.734 06:39:21 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:07.734 06:39:21 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:07.734 06:39:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:07.734 ************************************ 00:23:07.734 START TEST raid_write_error_test 00:23:07.734 ************************************ 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 write 00:23:07.734 06:39:21 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.mER2seF42p 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1201123 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1201123 /var/tmp/spdk-raid.sock 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:07.734 06:39:21 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1201123 ']' 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:07.734 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:07.734 06:39:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.734 [2024-07-25 06:39:21.242182] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:23:07.734 [2024-07-25 06:39:21.242237] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1201123 ] 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.0 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.1 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.2 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.3 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.4 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.5 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.6 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:01.7 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.0 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.1 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.2 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.3 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.4 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.5 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.6 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3d:02.7 cannot be used 00:23:07.993 qat_pci_device_allocate(): 
Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.0 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.993 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:07.993 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.994 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:07.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.994 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:07.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.994 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:07.994 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:07.994 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:07.994 [2024-07-25 06:39:21.378993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:07.994 [2024-07-25 06:39:21.423926] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:07.994 [2024-07-25 06:39:21.493081] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.994 [2024-07-25 06:39:21.493118] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:08.928 06:39:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:08.928 06:39:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:23:08.928 06:39:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:08.928 06:39:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:08.928 BaseBdev1_malloc 00:23:08.928 06:39:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:09.186 true 00:23:09.186 06:39:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:09.444 [2024-07-25 06:39:22.778118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:09.444 [2024-07-25 06:39:22.778164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.444 [2024-07-25 06:39:22.778182] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151ba60 00:23:09.444 [2024-07-25 06:39:22.778194] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.444 [2024-07-25 06:39:22.779665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.444 [2024-07-25 06:39:22.779691] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:09.444 BaseBdev1 00:23:09.444 06:39:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:09.444 06:39:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:09.702 BaseBdev2_malloc 00:23:09.702 06:39:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:09.702 true 00:23:09.702 06:39:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:09.969 [2024-07-25 06:39:23.436040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:09.969 [2024-07-25 06:39:23.436078] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.969 [2024-07-25 06:39:23.436097] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1520dc0 00:23:09.969 [2024-07-25 06:39:23.436108] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.969 [2024-07-25 06:39:23.437431] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.969 [2024-07-25 06:39:23.437459] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:09.969 BaseBdev2 00:23:09.969 06:39:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:09.969 06:39:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:10.229 BaseBdev3_malloc 00:23:10.229 06:39:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:10.487 true 00:23:10.487 06:39:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:10.745 [2024-07-25 06:39:24.122168] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:10.745 [2024-07-25 06:39:24.122206] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:23:10.745 [2024-07-25 06:39:24.122224] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1521420 00:23:10.745 [2024-07-25 06:39:24.122241] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:10.745 [2024-07-25 06:39:24.123603] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:10.745 [2024-07-25 06:39:24.123630] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:10.745 BaseBdev3 00:23:10.745 06:39:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:10.745 06:39:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:11.003 BaseBdev4_malloc 00:23:11.003 06:39:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:11.260 true 00:23:11.260 06:39:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:11.260 [2024-07-25 06:39:24.800287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:11.260 [2024-07-25 06:39:24.800327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:11.260 [2024-07-25 06:39:24.800347] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15249b0 00:23:11.260 [2024-07-25 06:39:24.800359] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:11.260 [2024-07-25 06:39:24.801727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:11.260 [2024-07-25 06:39:24.801753] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:11.260 BaseBdev4 00:23:11.519 06:39:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:11.519 [2024-07-25 06:39:25.024902] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:11.519 [2024-07-25 06:39:25.026033] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:11.519 [2024-07-25 06:39:25.026095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:11.519 [2024-07-25 06:39:25.026158] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:11.519 [2024-07-25 06:39:25.026368] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1524ec0 00:23:11.519 [2024-07-25 06:39:25.026378] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:11.519 [2024-07-25 06:39:25.026550] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13774b0 00:23:11.519 [2024-07-25 06:39:25.026682] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1524ec0 00:23:11.519 [2024-07-25 06:39:25.026691] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1524ec0 00:23:11.519 [2024-07-25 
06:39:25.026781] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.519 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.816 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.816 "name": "raid_bdev1", 00:23:11.816 "uuid": "f7e985c3-9a7d-4bb2-8c3e-de92475f31bc", 00:23:11.816 "strip_size_kb": 64, 00:23:11.816 "state": "online", 00:23:11.816 "raid_level": "concat", 00:23:11.816 "superblock": true, 00:23:11.816 "num_base_bdevs": 4, 00:23:11.816 "num_base_bdevs_discovered": 4, 00:23:11.816 "num_base_bdevs_operational": 4, 00:23:11.816 "base_bdevs_list": [ 00:23:11.816 { 00:23:11.816 "name": "BaseBdev1", 00:23:11.816 "uuid": "2bae25e2-771e-5176-a8dd-619a9eff90e6", 00:23:11.816 "is_configured": true, 00:23:11.816 "data_offset": 2048, 00:23:11.816 "data_size": 63488 00:23:11.816 }, 00:23:11.816 { 00:23:11.816 "name": "BaseBdev2", 00:23:11.816 "uuid": "1a2148c6-5734-5e7d-82b3-8f911c0bf767", 00:23:11.816 "is_configured": true, 00:23:11.816 "data_offset": 2048, 00:23:11.816 "data_size": 63488 00:23:11.816 }, 00:23:11.816 { 00:23:11.816 "name": "BaseBdev3", 00:23:11.816 "uuid": "a14f13d2-2cda-54f2-8320-a3f7dbcc6fd7", 00:23:11.816 "is_configured": true, 00:23:11.816 "data_offset": 2048, 00:23:11.816 "data_size": 63488 00:23:11.816 }, 00:23:11.816 { 00:23:11.816 "name": "BaseBdev4", 00:23:11.816 "uuid": "7db1abe1-f569-526b-940a-3463df8bb79f", 00:23:11.816 "is_configured": true, 00:23:11.816 "data_offset": 2048, 00:23:11.816 "data_size": 63488 00:23:11.816 } 00:23:11.816 ] 00:23:11.816 }' 00:23:11.816 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.816 06:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:12.383 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:23:12.383 06:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:12.641 [2024-07-25 06:39:25.951575] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1528590 00:23:13.576 06:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.576 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.835 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.835 "name": "raid_bdev1", 00:23:13.835 "uuid": "f7e985c3-9a7d-4bb2-8c3e-de92475f31bc", 00:23:13.835 "strip_size_kb": 64, 00:23:13.835 "state": "online", 00:23:13.835 "raid_level": "concat", 00:23:13.835 "superblock": true, 00:23:13.835 "num_base_bdevs": 4, 00:23:13.835 "num_base_bdevs_discovered": 4, 00:23:13.835 "num_base_bdevs_operational": 4, 00:23:13.835 "base_bdevs_list": [ 00:23:13.835 { 00:23:13.835 "name": "BaseBdev1", 00:23:13.835 "uuid": "2bae25e2-771e-5176-a8dd-619a9eff90e6", 00:23:13.835 "is_configured": true, 00:23:13.835 "data_offset": 2048, 00:23:13.835 "data_size": 63488 00:23:13.835 }, 00:23:13.835 { 00:23:13.835 "name": "BaseBdev2", 00:23:13.835 "uuid": "1a2148c6-5734-5e7d-82b3-8f911c0bf767", 00:23:13.835 "is_configured": true, 00:23:13.835 "data_offset": 2048, 00:23:13.835 "data_size": 63488 00:23:13.835 }, 00:23:13.835 { 00:23:13.835 "name": "BaseBdev3", 00:23:13.835 "uuid": "a14f13d2-2cda-54f2-8320-a3f7dbcc6fd7", 00:23:13.835 "is_configured": true, 00:23:13.835 "data_offset": 2048, 00:23:13.835 "data_size": 63488 00:23:13.835 }, 00:23:13.835 { 00:23:13.835 "name": "BaseBdev4", 00:23:13.835 "uuid": "7db1abe1-f569-526b-940a-3463df8bb79f", 00:23:13.835 "is_configured": true, 00:23:13.835 "data_offset": 2048, 00:23:13.835 "data_size": 63488 00:23:13.835 } 00:23:13.835 ] 00:23:13.835 }' 00:23:13.835 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:23:13.835 06:39:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:14.401 06:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:14.659 [2024-07-25 06:39:28.106145] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:14.659 [2024-07-25 06:39:28.106185] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:14.659 [2024-07-25 06:39:28.109084] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:14.659 [2024-07-25 06:39:28.109123] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.659 [2024-07-25 06:39:28.109166] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:14.659 [2024-07-25 06:39:28.109177] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1524ec0 name raid_bdev1, state offline 00:23:14.659 0 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1201123 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1201123 ']' 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1201123 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1201123 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1201123' 00:23:14.659 killing process with pid 1201123 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1201123 00:23:14.659 [2024-07-25 06:39:28.183762] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:14.659 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1201123 00:23:14.659 [2024-07-25 06:39:28.210647] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.mER2seF42p 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.47 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.47 != \0\.\0\0 ]] 00:23:14.918 00:23:14.918 real 0m7.235s 00:23:14.918 user 0m11.512s 00:23:14.918 sys 0m1.278s 00:23:14.918 06:39:28 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:14.918 06:39:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:14.918 ************************************ 00:23:14.918 END TEST raid_write_error_test 00:23:14.918 ************************************ 00:23:14.918 06:39:28 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:23:14.918 06:39:28 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:23:14.918 06:39:28 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:14.918 06:39:28 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:14.918 06:39:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:15.176 ************************************ 00:23:15.176 START TEST raid_state_function_test 00:23:15.176 ************************************ 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 false 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local 
strip_size_create_arg 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1202529 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1202529' 00:23:15.176 Process raid pid: 1202529 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1202529 /var/tmp/spdk-raid.sock 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1202529 ']' 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:15.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:15.176 06:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:15.176 [2024-07-25 06:39:28.560041] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
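(Editor's note: the trace above shows raid_state_function_test starting a bare bdev_svc application on its own RPC socket and waiting for it to listen, and the trace below then drives that target purely through rpc.py. The following is a condensed, illustrative sketch of that flow, not the literal bdev_raid.sh code: the binary path, socket path, and RPC names are taken from the trace, while the polling loop is a simplified stand-in for the waitforlisten/waitforbdev helpers in autotest_common.sh, and the create/verify sequence is compressed relative to what the test actually runs.)

# Sketch: launch bdev_svc with bdev_raid debug logging on a private RPC socket,
# wait for the socket, then reproduce the create/verify sequence seen in the trace.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
SOCK=/var/tmp/spdk-raid.sock
RPC="$SPDK/scripts/rpc.py -s $SOCK"

"$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
raid_pid=$!

# Simplified stand-in for waitforlisten(): retry a cheap RPC until the target answers.
until $RPC rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done

# Creating the raid1 volume before its base bdevs exist leaves it in the "configuring" state.
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# Add malloc base bdevs one at a time, as the trace does; the raid reaches the
# "online" state only once all four base bdevs have been created and claimed.
for name in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    $RPC bdev_malloc_create 32 512 -b "$name"
    $RPC bdev_wait_for_examine
done

# Query the state the same way verify_raid_bdev_state does: bdev_raid_get_bdevs + jq.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'

# Tear down.
$RPC bdev_raid_delete Existed_Raid
kill -9 $raid_pid
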
00:23:15.176 [2024-07-25 06:39:28.560100] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.0 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.1 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.2 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.3 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.4 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.5 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.6 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:01.7 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.0 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.1 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.2 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.3 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.4 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.5 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.6 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3d:02.7 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.0 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:15.176 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:15.176 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:15.176 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:15.176 [2024-07-25 06:39:28.696133] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:15.433 [2024-07-25 06:39:28.740530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:15.433 [2024-07-25 06:39:28.799933] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:15.433 [2024-07-25 06:39:28.799958] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:15.998 06:39:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:15.998 06:39:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:23:15.998 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:16.255 [2024-07-25 06:39:29.667863] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:16.255 [2024-07-25 06:39:29.667903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:16.255 [2024-07-25 06:39:29.667913] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:16.255 [2024-07-25 06:39:29.667923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:16.255 [2024-07-25 06:39:29.667931] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:16.255 [2024-07-25 06:39:29.667942] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:16.255 [2024-07-25 06:39:29.667950] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:16.255 [2024-07-25 06:39:29.667960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.255 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:16.512 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.512 "name": "Existed_Raid", 00:23:16.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.512 "strip_size_kb": 0, 00:23:16.512 "state": "configuring", 00:23:16.512 "raid_level": "raid1", 00:23:16.512 "superblock": false, 00:23:16.512 "num_base_bdevs": 4, 00:23:16.512 "num_base_bdevs_discovered": 0, 00:23:16.512 "num_base_bdevs_operational": 4, 00:23:16.512 "base_bdevs_list": [ 00:23:16.512 { 00:23:16.512 "name": "BaseBdev1", 00:23:16.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.512 "is_configured": false, 00:23:16.512 "data_offset": 0, 00:23:16.512 "data_size": 0 00:23:16.512 }, 00:23:16.512 { 00:23:16.512 "name": "BaseBdev2", 00:23:16.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.512 "is_configured": false, 00:23:16.512 "data_offset": 0, 00:23:16.512 "data_size": 0 00:23:16.512 }, 00:23:16.512 { 00:23:16.512 "name": "BaseBdev3", 00:23:16.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.512 "is_configured": false, 00:23:16.512 "data_offset": 0, 00:23:16.512 "data_size": 0 00:23:16.512 }, 00:23:16.512 { 00:23:16.512 "name": "BaseBdev4", 00:23:16.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.512 "is_configured": false, 00:23:16.512 "data_offset": 0, 00:23:16.512 "data_size": 0 00:23:16.512 } 00:23:16.512 ] 00:23:16.512 }' 00:23:16.512 06:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.512 06:39:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:17.077 06:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:17.335 [2024-07-25 06:39:30.698456] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:17.335 [2024-07-25 06:39:30.698483] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee9470 name Existed_Raid, state configuring 00:23:17.335 06:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:17.592 [2024-07-25 
06:39:30.927060] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:17.592 [2024-07-25 06:39:30.927087] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:17.592 [2024-07-25 06:39:30.927096] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:17.592 [2024-07-25 06:39:30.927107] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:17.592 [2024-07-25 06:39:30.927114] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:17.592 [2024-07-25 06:39:30.927124] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:17.592 [2024-07-25 06:39:30.927132] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:17.592 [2024-07-25 06:39:30.927148] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:17.592 06:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:17.848 [2024-07-25 06:39:31.165118] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:17.848 BaseBdev1 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:17.848 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:18.106 [ 00:23:18.106 { 00:23:18.106 "name": "BaseBdev1", 00:23:18.106 "aliases": [ 00:23:18.106 "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8" 00:23:18.106 ], 00:23:18.106 "product_name": "Malloc disk", 00:23:18.106 "block_size": 512, 00:23:18.106 "num_blocks": 65536, 00:23:18.106 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:18.106 "assigned_rate_limits": { 00:23:18.106 "rw_ios_per_sec": 0, 00:23:18.106 "rw_mbytes_per_sec": 0, 00:23:18.106 "r_mbytes_per_sec": 0, 00:23:18.106 "w_mbytes_per_sec": 0 00:23:18.106 }, 00:23:18.106 "claimed": true, 00:23:18.106 "claim_type": "exclusive_write", 00:23:18.106 "zoned": false, 00:23:18.106 "supported_io_types": { 00:23:18.106 "read": true, 00:23:18.106 "write": true, 00:23:18.106 "unmap": true, 00:23:18.106 "flush": true, 00:23:18.106 "reset": true, 00:23:18.106 "nvme_admin": false, 00:23:18.106 "nvme_io": false, 00:23:18.106 "nvme_io_md": false, 00:23:18.106 "write_zeroes": true, 00:23:18.106 "zcopy": true, 00:23:18.106 "get_zone_info": false, 00:23:18.106 "zone_management": false, 00:23:18.106 
"zone_append": false, 00:23:18.106 "compare": false, 00:23:18.106 "compare_and_write": false, 00:23:18.106 "abort": true, 00:23:18.106 "seek_hole": false, 00:23:18.106 "seek_data": false, 00:23:18.106 "copy": true, 00:23:18.106 "nvme_iov_md": false 00:23:18.106 }, 00:23:18.106 "memory_domains": [ 00:23:18.106 { 00:23:18.106 "dma_device_id": "system", 00:23:18.106 "dma_device_type": 1 00:23:18.106 }, 00:23:18.106 { 00:23:18.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:18.106 "dma_device_type": 2 00:23:18.106 } 00:23:18.106 ], 00:23:18.106 "driver_specific": {} 00:23:18.106 } 00:23:18.106 ] 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.106 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:18.364 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.364 "name": "Existed_Raid", 00:23:18.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.364 "strip_size_kb": 0, 00:23:18.364 "state": "configuring", 00:23:18.364 "raid_level": "raid1", 00:23:18.364 "superblock": false, 00:23:18.364 "num_base_bdevs": 4, 00:23:18.364 "num_base_bdevs_discovered": 1, 00:23:18.364 "num_base_bdevs_operational": 4, 00:23:18.364 "base_bdevs_list": [ 00:23:18.364 { 00:23:18.364 "name": "BaseBdev1", 00:23:18.364 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:18.364 "is_configured": true, 00:23:18.364 "data_offset": 0, 00:23:18.364 "data_size": 65536 00:23:18.364 }, 00:23:18.364 { 00:23:18.364 "name": "BaseBdev2", 00:23:18.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.364 "is_configured": false, 00:23:18.364 "data_offset": 0, 00:23:18.364 "data_size": 0 00:23:18.364 }, 00:23:18.364 { 00:23:18.364 "name": "BaseBdev3", 00:23:18.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.364 "is_configured": false, 00:23:18.364 "data_offset": 0, 00:23:18.364 "data_size": 0 00:23:18.364 }, 00:23:18.364 { 00:23:18.364 "name": "BaseBdev4", 00:23:18.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.364 "is_configured": false, 00:23:18.364 "data_offset": 0, 
00:23:18.364 "data_size": 0 00:23:18.364 } 00:23:18.364 ] 00:23:18.364 }' 00:23:18.364 06:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.364 06:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.929 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:19.186 [2024-07-25 06:39:32.644991] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:19.186 [2024-07-25 06:39:32.645032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee8ce0 name Existed_Raid, state configuring 00:23:19.186 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:19.444 [2024-07-25 06:39:32.873619] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:19.444 [2024-07-25 06:39:32.874981] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:19.444 [2024-07-25 06:39:32.875013] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:19.444 [2024-07-25 06:39:32.875022] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:19.444 [2024-07-25 06:39:32.875033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:19.444 [2024-07-25 06:39:32.875041] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:19.444 [2024-07-25 06:39:32.875052] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.444 06:39:32 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:19.702 06:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.702 "name": "Existed_Raid", 00:23:19.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.702 "strip_size_kb": 0, 00:23:19.702 "state": "configuring", 00:23:19.702 "raid_level": "raid1", 00:23:19.702 "superblock": false, 00:23:19.702 "num_base_bdevs": 4, 00:23:19.702 "num_base_bdevs_discovered": 1, 00:23:19.702 "num_base_bdevs_operational": 4, 00:23:19.702 "base_bdevs_list": [ 00:23:19.702 { 00:23:19.702 "name": "BaseBdev1", 00:23:19.702 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:19.702 "is_configured": true, 00:23:19.702 "data_offset": 0, 00:23:19.702 "data_size": 65536 00:23:19.702 }, 00:23:19.702 { 00:23:19.702 "name": "BaseBdev2", 00:23:19.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.702 "is_configured": false, 00:23:19.702 "data_offset": 0, 00:23:19.702 "data_size": 0 00:23:19.702 }, 00:23:19.702 { 00:23:19.702 "name": "BaseBdev3", 00:23:19.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.702 "is_configured": false, 00:23:19.702 "data_offset": 0, 00:23:19.702 "data_size": 0 00:23:19.702 }, 00:23:19.702 { 00:23:19.702 "name": "BaseBdev4", 00:23:19.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.702 "is_configured": false, 00:23:19.702 "data_offset": 0, 00:23:19.702 "data_size": 0 00:23:19.702 } 00:23:19.702 ] 00:23:19.702 }' 00:23:19.702 06:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.702 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.268 06:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:20.526 [2024-07-25 06:39:33.903435] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:20.526 BaseBdev2 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:20.526 06:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:20.783 06:39:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:21.041 [ 00:23:21.042 { 00:23:21.042 "name": "BaseBdev2", 00:23:21.042 "aliases": [ 00:23:21.042 "07ada6e3-2eda-40e3-aa56-c441082021b3" 00:23:21.042 ], 00:23:21.042 "product_name": "Malloc disk", 00:23:21.042 "block_size": 512, 00:23:21.042 "num_blocks": 65536, 00:23:21.042 "uuid": "07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:21.042 "assigned_rate_limits": { 00:23:21.042 
"rw_ios_per_sec": 0, 00:23:21.042 "rw_mbytes_per_sec": 0, 00:23:21.042 "r_mbytes_per_sec": 0, 00:23:21.042 "w_mbytes_per_sec": 0 00:23:21.042 }, 00:23:21.042 "claimed": true, 00:23:21.042 "claim_type": "exclusive_write", 00:23:21.042 "zoned": false, 00:23:21.042 "supported_io_types": { 00:23:21.042 "read": true, 00:23:21.042 "write": true, 00:23:21.042 "unmap": true, 00:23:21.042 "flush": true, 00:23:21.042 "reset": true, 00:23:21.042 "nvme_admin": false, 00:23:21.042 "nvme_io": false, 00:23:21.042 "nvme_io_md": false, 00:23:21.042 "write_zeroes": true, 00:23:21.042 "zcopy": true, 00:23:21.042 "get_zone_info": false, 00:23:21.042 "zone_management": false, 00:23:21.042 "zone_append": false, 00:23:21.042 "compare": false, 00:23:21.042 "compare_and_write": false, 00:23:21.042 "abort": true, 00:23:21.042 "seek_hole": false, 00:23:21.042 "seek_data": false, 00:23:21.042 "copy": true, 00:23:21.042 "nvme_iov_md": false 00:23:21.042 }, 00:23:21.042 "memory_domains": [ 00:23:21.042 { 00:23:21.042 "dma_device_id": "system", 00:23:21.042 "dma_device_type": 1 00:23:21.042 }, 00:23:21.042 { 00:23:21.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:21.042 "dma_device_type": 2 00:23:21.042 } 00:23:21.042 ], 00:23:21.042 "driver_specific": {} 00:23:21.042 } 00:23:21.042 ] 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.042 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:21.300 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.300 "name": "Existed_Raid", 00:23:21.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.300 "strip_size_kb": 0, 00:23:21.300 "state": "configuring", 00:23:21.300 "raid_level": "raid1", 00:23:21.300 "superblock": false, 00:23:21.300 "num_base_bdevs": 4, 00:23:21.300 "num_base_bdevs_discovered": 2, 00:23:21.300 "num_base_bdevs_operational": 4, 
00:23:21.300 "base_bdevs_list": [ 00:23:21.300 { 00:23:21.300 "name": "BaseBdev1", 00:23:21.300 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:21.300 "is_configured": true, 00:23:21.300 "data_offset": 0, 00:23:21.300 "data_size": 65536 00:23:21.300 }, 00:23:21.300 { 00:23:21.300 "name": "BaseBdev2", 00:23:21.300 "uuid": "07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:21.300 "is_configured": true, 00:23:21.300 "data_offset": 0, 00:23:21.300 "data_size": 65536 00:23:21.300 }, 00:23:21.300 { 00:23:21.300 "name": "BaseBdev3", 00:23:21.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.300 "is_configured": false, 00:23:21.300 "data_offset": 0, 00:23:21.300 "data_size": 0 00:23:21.300 }, 00:23:21.300 { 00:23:21.300 "name": "BaseBdev4", 00:23:21.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:21.300 "is_configured": false, 00:23:21.300 "data_offset": 0, 00:23:21.300 "data_size": 0 00:23:21.300 } 00:23:21.300 ] 00:23:21.300 }' 00:23:21.300 06:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.300 06:39:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:21.866 [2024-07-25 06:39:35.330442] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:21.866 BaseBdev3 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:21.866 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:22.124 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:22.382 [ 00:23:22.382 { 00:23:22.382 "name": "BaseBdev3", 00:23:22.382 "aliases": [ 00:23:22.382 "41e38f80-c35a-4bfb-971f-ec72668ac0d7" 00:23:22.382 ], 00:23:22.382 "product_name": "Malloc disk", 00:23:22.382 "block_size": 512, 00:23:22.382 "num_blocks": 65536, 00:23:22.382 "uuid": "41e38f80-c35a-4bfb-971f-ec72668ac0d7", 00:23:22.382 "assigned_rate_limits": { 00:23:22.382 "rw_ios_per_sec": 0, 00:23:22.382 "rw_mbytes_per_sec": 0, 00:23:22.382 "r_mbytes_per_sec": 0, 00:23:22.382 "w_mbytes_per_sec": 0 00:23:22.382 }, 00:23:22.382 "claimed": true, 00:23:22.382 "claim_type": "exclusive_write", 00:23:22.382 "zoned": false, 00:23:22.382 "supported_io_types": { 00:23:22.382 "read": true, 00:23:22.382 "write": true, 00:23:22.382 "unmap": true, 00:23:22.382 "flush": true, 00:23:22.382 "reset": true, 00:23:22.382 "nvme_admin": false, 00:23:22.382 "nvme_io": false, 00:23:22.382 "nvme_io_md": false, 00:23:22.382 
"write_zeroes": true, 00:23:22.382 "zcopy": true, 00:23:22.382 "get_zone_info": false, 00:23:22.382 "zone_management": false, 00:23:22.382 "zone_append": false, 00:23:22.382 "compare": false, 00:23:22.382 "compare_and_write": false, 00:23:22.382 "abort": true, 00:23:22.382 "seek_hole": false, 00:23:22.382 "seek_data": false, 00:23:22.382 "copy": true, 00:23:22.382 "nvme_iov_md": false 00:23:22.382 }, 00:23:22.382 "memory_domains": [ 00:23:22.382 { 00:23:22.382 "dma_device_id": "system", 00:23:22.382 "dma_device_type": 1 00:23:22.382 }, 00:23:22.382 { 00:23:22.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.382 "dma_device_type": 2 00:23:22.382 } 00:23:22.382 ], 00:23:22.382 "driver_specific": {} 00:23:22.382 } 00:23:22.382 ] 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:22.382 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.383 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:22.640 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.641 "name": "Existed_Raid", 00:23:22.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.641 "strip_size_kb": 0, 00:23:22.641 "state": "configuring", 00:23:22.641 "raid_level": "raid1", 00:23:22.641 "superblock": false, 00:23:22.641 "num_base_bdevs": 4, 00:23:22.641 "num_base_bdevs_discovered": 3, 00:23:22.641 "num_base_bdevs_operational": 4, 00:23:22.641 "base_bdevs_list": [ 00:23:22.641 { 00:23:22.641 "name": "BaseBdev1", 00:23:22.641 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:22.641 "is_configured": true, 00:23:22.641 "data_offset": 0, 00:23:22.641 "data_size": 65536 00:23:22.641 }, 00:23:22.641 { 00:23:22.641 "name": "BaseBdev2", 00:23:22.641 "uuid": "07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:22.641 "is_configured": true, 00:23:22.641 "data_offset": 0, 00:23:22.641 "data_size": 65536 00:23:22.641 }, 00:23:22.641 { 00:23:22.641 "name": "BaseBdev3", 
00:23:22.641 "uuid": "41e38f80-c35a-4bfb-971f-ec72668ac0d7", 00:23:22.641 "is_configured": true, 00:23:22.641 "data_offset": 0, 00:23:22.641 "data_size": 65536 00:23:22.641 }, 00:23:22.641 { 00:23:22.641 "name": "BaseBdev4", 00:23:22.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.641 "is_configured": false, 00:23:22.641 "data_offset": 0, 00:23:22.641 "data_size": 0 00:23:22.641 } 00:23:22.641 ] 00:23:22.641 }' 00:23:22.641 06:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.641 06:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:23.206 06:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:23.464 [2024-07-25 06:39:36.769396] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:23.464 [2024-07-25 06:39:36.769434] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x109c250 00:23:23.464 [2024-07-25 06:39:36.769442] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:23.464 [2024-07-25 06:39:36.769625] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1090de0 00:23:23.464 [2024-07-25 06:39:36.769750] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x109c250 00:23:23.464 [2024-07-25 06:39:36.769760] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x109c250 00:23:23.464 [2024-07-25 06:39:36.769913] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.464 BaseBdev4 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:23.464 06:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:23.464 06:39:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:23.722 [ 00:23:23.722 { 00:23:23.722 "name": "BaseBdev4", 00:23:23.723 "aliases": [ 00:23:23.723 "e5428415-b96e-4498-81c8-e30618574fd1" 00:23:23.723 ], 00:23:23.723 "product_name": "Malloc disk", 00:23:23.723 "block_size": 512, 00:23:23.723 "num_blocks": 65536, 00:23:23.723 "uuid": "e5428415-b96e-4498-81c8-e30618574fd1", 00:23:23.723 "assigned_rate_limits": { 00:23:23.723 "rw_ios_per_sec": 0, 00:23:23.723 "rw_mbytes_per_sec": 0, 00:23:23.723 "r_mbytes_per_sec": 0, 00:23:23.723 "w_mbytes_per_sec": 0 00:23:23.723 }, 00:23:23.723 "claimed": true, 00:23:23.723 "claim_type": "exclusive_write", 00:23:23.723 "zoned": false, 00:23:23.723 "supported_io_types": { 00:23:23.723 "read": true, 
00:23:23.723 "write": true, 00:23:23.723 "unmap": true, 00:23:23.723 "flush": true, 00:23:23.723 "reset": true, 00:23:23.723 "nvme_admin": false, 00:23:23.723 "nvme_io": false, 00:23:23.723 "nvme_io_md": false, 00:23:23.723 "write_zeroes": true, 00:23:23.723 "zcopy": true, 00:23:23.723 "get_zone_info": false, 00:23:23.723 "zone_management": false, 00:23:23.723 "zone_append": false, 00:23:23.723 "compare": false, 00:23:23.723 "compare_and_write": false, 00:23:23.723 "abort": true, 00:23:23.723 "seek_hole": false, 00:23:23.723 "seek_data": false, 00:23:23.723 "copy": true, 00:23:23.723 "nvme_iov_md": false 00:23:23.723 }, 00:23:23.723 "memory_domains": [ 00:23:23.723 { 00:23:23.723 "dma_device_id": "system", 00:23:23.723 "dma_device_type": 1 00:23:23.723 }, 00:23:23.723 { 00:23:23.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.723 "dma_device_type": 2 00:23:23.723 } 00:23:23.723 ], 00:23:23.723 "driver_specific": {} 00:23:23.723 } 00:23:23.723 ] 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.723 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:23.981 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.981 "name": "Existed_Raid", 00:23:23.981 "uuid": "44a4f753-6301-4e26-af14-9bd0200d6975", 00:23:23.981 "strip_size_kb": 0, 00:23:23.981 "state": "online", 00:23:23.981 "raid_level": "raid1", 00:23:23.981 "superblock": false, 00:23:23.981 "num_base_bdevs": 4, 00:23:23.981 "num_base_bdevs_discovered": 4, 00:23:23.981 "num_base_bdevs_operational": 4, 00:23:23.981 "base_bdevs_list": [ 00:23:23.981 { 00:23:23.981 "name": "BaseBdev1", 00:23:23.981 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:23.981 "is_configured": true, 00:23:23.981 "data_offset": 0, 00:23:23.981 "data_size": 65536 00:23:23.981 }, 00:23:23.981 { 00:23:23.981 "name": "BaseBdev2", 00:23:23.981 "uuid": 
"07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:23.981 "is_configured": true, 00:23:23.981 "data_offset": 0, 00:23:23.981 "data_size": 65536 00:23:23.981 }, 00:23:23.981 { 00:23:23.981 "name": "BaseBdev3", 00:23:23.981 "uuid": "41e38f80-c35a-4bfb-971f-ec72668ac0d7", 00:23:23.981 "is_configured": true, 00:23:23.981 "data_offset": 0, 00:23:23.981 "data_size": 65536 00:23:23.981 }, 00:23:23.981 { 00:23:23.981 "name": "BaseBdev4", 00:23:23.981 "uuid": "e5428415-b96e-4498-81c8-e30618574fd1", 00:23:23.981 "is_configured": true, 00:23:23.981 "data_offset": 0, 00:23:23.981 "data_size": 65536 00:23:23.981 } 00:23:23.981 ] 00:23:23.981 }' 00:23:23.981 06:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.981 06:39:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:24.547 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:24.813 [2024-07-25 06:39:38.269669] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:24.813 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:24.813 "name": "Existed_Raid", 00:23:24.813 "aliases": [ 00:23:24.813 "44a4f753-6301-4e26-af14-9bd0200d6975" 00:23:24.813 ], 00:23:24.813 "product_name": "Raid Volume", 00:23:24.813 "block_size": 512, 00:23:24.813 "num_blocks": 65536, 00:23:24.813 "uuid": "44a4f753-6301-4e26-af14-9bd0200d6975", 00:23:24.813 "assigned_rate_limits": { 00:23:24.813 "rw_ios_per_sec": 0, 00:23:24.813 "rw_mbytes_per_sec": 0, 00:23:24.813 "r_mbytes_per_sec": 0, 00:23:24.813 "w_mbytes_per_sec": 0 00:23:24.813 }, 00:23:24.813 "claimed": false, 00:23:24.813 "zoned": false, 00:23:24.813 "supported_io_types": { 00:23:24.813 "read": true, 00:23:24.813 "write": true, 00:23:24.813 "unmap": false, 00:23:24.813 "flush": false, 00:23:24.813 "reset": true, 00:23:24.813 "nvme_admin": false, 00:23:24.813 "nvme_io": false, 00:23:24.813 "nvme_io_md": false, 00:23:24.813 "write_zeroes": true, 00:23:24.813 "zcopy": false, 00:23:24.813 "get_zone_info": false, 00:23:24.813 "zone_management": false, 00:23:24.813 "zone_append": false, 00:23:24.813 "compare": false, 00:23:24.813 "compare_and_write": false, 00:23:24.813 "abort": false, 00:23:24.813 "seek_hole": false, 00:23:24.813 "seek_data": false, 00:23:24.813 "copy": false, 00:23:24.813 "nvme_iov_md": false 00:23:24.813 }, 00:23:24.813 "memory_domains": [ 00:23:24.813 { 00:23:24.813 "dma_device_id": "system", 00:23:24.813 "dma_device_type": 1 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.813 "dma_device_type": 2 00:23:24.813 }, 
00:23:24.813 { 00:23:24.813 "dma_device_id": "system", 00:23:24.813 "dma_device_type": 1 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.813 "dma_device_type": 2 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "dma_device_id": "system", 00:23:24.813 "dma_device_type": 1 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.813 "dma_device_type": 2 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "dma_device_id": "system", 00:23:24.813 "dma_device_type": 1 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.813 "dma_device_type": 2 00:23:24.813 } 00:23:24.813 ], 00:23:24.813 "driver_specific": { 00:23:24.813 "raid": { 00:23:24.813 "uuid": "44a4f753-6301-4e26-af14-9bd0200d6975", 00:23:24.813 "strip_size_kb": 0, 00:23:24.813 "state": "online", 00:23:24.813 "raid_level": "raid1", 00:23:24.813 "superblock": false, 00:23:24.813 "num_base_bdevs": 4, 00:23:24.813 "num_base_bdevs_discovered": 4, 00:23:24.813 "num_base_bdevs_operational": 4, 00:23:24.813 "base_bdevs_list": [ 00:23:24.813 { 00:23:24.813 "name": "BaseBdev1", 00:23:24.813 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:24.813 "is_configured": true, 00:23:24.813 "data_offset": 0, 00:23:24.813 "data_size": 65536 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "name": "BaseBdev2", 00:23:24.813 "uuid": "07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:24.813 "is_configured": true, 00:23:24.813 "data_offset": 0, 00:23:24.813 "data_size": 65536 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "name": "BaseBdev3", 00:23:24.813 "uuid": "41e38f80-c35a-4bfb-971f-ec72668ac0d7", 00:23:24.813 "is_configured": true, 00:23:24.813 "data_offset": 0, 00:23:24.813 "data_size": 65536 00:23:24.813 }, 00:23:24.813 { 00:23:24.813 "name": "BaseBdev4", 00:23:24.813 "uuid": "e5428415-b96e-4498-81c8-e30618574fd1", 00:23:24.813 "is_configured": true, 00:23:24.813 "data_offset": 0, 00:23:24.813 "data_size": 65536 00:23:24.813 } 00:23:24.813 ] 00:23:24.813 } 00:23:24.813 } 00:23:24.813 }' 00:23:24.813 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:24.813 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:24.813 BaseBdev2 00:23:24.814 BaseBdev3 00:23:24.814 BaseBdev4' 00:23:24.814 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:24.814 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:24.814 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:25.097 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:25.097 "name": "BaseBdev1", 00:23:25.097 "aliases": [ 00:23:25.097 "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8" 00:23:25.097 ], 00:23:25.097 "product_name": "Malloc disk", 00:23:25.097 "block_size": 512, 00:23:25.097 "num_blocks": 65536, 00:23:25.097 "uuid": "a0c280bc-cdb6-41cf-9ecf-b219b5a541b8", 00:23:25.097 "assigned_rate_limits": { 00:23:25.097 "rw_ios_per_sec": 0, 00:23:25.097 "rw_mbytes_per_sec": 0, 00:23:25.097 "r_mbytes_per_sec": 0, 00:23:25.097 "w_mbytes_per_sec": 0 00:23:25.097 }, 00:23:25.097 "claimed": true, 00:23:25.097 "claim_type": "exclusive_write", 00:23:25.097 "zoned": false, 00:23:25.097 
"supported_io_types": { 00:23:25.097 "read": true, 00:23:25.097 "write": true, 00:23:25.097 "unmap": true, 00:23:25.097 "flush": true, 00:23:25.097 "reset": true, 00:23:25.097 "nvme_admin": false, 00:23:25.097 "nvme_io": false, 00:23:25.097 "nvme_io_md": false, 00:23:25.097 "write_zeroes": true, 00:23:25.097 "zcopy": true, 00:23:25.097 "get_zone_info": false, 00:23:25.097 "zone_management": false, 00:23:25.097 "zone_append": false, 00:23:25.097 "compare": false, 00:23:25.097 "compare_and_write": false, 00:23:25.097 "abort": true, 00:23:25.097 "seek_hole": false, 00:23:25.097 "seek_data": false, 00:23:25.097 "copy": true, 00:23:25.097 "nvme_iov_md": false 00:23:25.097 }, 00:23:25.097 "memory_domains": [ 00:23:25.097 { 00:23:25.097 "dma_device_id": "system", 00:23:25.097 "dma_device_type": 1 00:23:25.097 }, 00:23:25.097 { 00:23:25.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.097 "dma_device_type": 2 00:23:25.097 } 00:23:25.097 ], 00:23:25.097 "driver_specific": {} 00:23:25.097 }' 00:23:25.097 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.097 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.355 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.613 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:25.613 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:25.613 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:25.613 06:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:25.613 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:25.613 "name": "BaseBdev2", 00:23:25.613 "aliases": [ 00:23:25.613 "07ada6e3-2eda-40e3-aa56-c441082021b3" 00:23:25.613 ], 00:23:25.613 "product_name": "Malloc disk", 00:23:25.613 "block_size": 512, 00:23:25.613 "num_blocks": 65536, 00:23:25.613 "uuid": "07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:25.613 "assigned_rate_limits": { 00:23:25.613 "rw_ios_per_sec": 0, 00:23:25.613 "rw_mbytes_per_sec": 0, 00:23:25.613 "r_mbytes_per_sec": 0, 00:23:25.613 "w_mbytes_per_sec": 0 00:23:25.613 }, 00:23:25.613 "claimed": true, 00:23:25.613 "claim_type": "exclusive_write", 00:23:25.613 "zoned": false, 00:23:25.613 "supported_io_types": { 00:23:25.613 "read": true, 00:23:25.613 "write": true, 00:23:25.613 "unmap": true, 00:23:25.613 "flush": true, 00:23:25.613 "reset": true, 00:23:25.613 
"nvme_admin": false, 00:23:25.613 "nvme_io": false, 00:23:25.613 "nvme_io_md": false, 00:23:25.613 "write_zeroes": true, 00:23:25.613 "zcopy": true, 00:23:25.613 "get_zone_info": false, 00:23:25.613 "zone_management": false, 00:23:25.613 "zone_append": false, 00:23:25.613 "compare": false, 00:23:25.613 "compare_and_write": false, 00:23:25.613 "abort": true, 00:23:25.613 "seek_hole": false, 00:23:25.613 "seek_data": false, 00:23:25.613 "copy": true, 00:23:25.613 "nvme_iov_md": false 00:23:25.613 }, 00:23:25.613 "memory_domains": [ 00:23:25.613 { 00:23:25.613 "dma_device_id": "system", 00:23:25.613 "dma_device_type": 1 00:23:25.613 }, 00:23:25.613 { 00:23:25.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.613 "dma_device_type": 2 00:23:25.613 } 00:23:25.613 ], 00:23:25.613 "driver_specific": {} 00:23:25.613 }' 00:23:25.613 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:25.871 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.129 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.129 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:26.129 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:26.129 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:26.129 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:26.387 "name": "BaseBdev3", 00:23:26.387 "aliases": [ 00:23:26.387 "41e38f80-c35a-4bfb-971f-ec72668ac0d7" 00:23:26.387 ], 00:23:26.387 "product_name": "Malloc disk", 00:23:26.387 "block_size": 512, 00:23:26.387 "num_blocks": 65536, 00:23:26.387 "uuid": "41e38f80-c35a-4bfb-971f-ec72668ac0d7", 00:23:26.387 "assigned_rate_limits": { 00:23:26.387 "rw_ios_per_sec": 0, 00:23:26.387 "rw_mbytes_per_sec": 0, 00:23:26.387 "r_mbytes_per_sec": 0, 00:23:26.387 "w_mbytes_per_sec": 0 00:23:26.387 }, 00:23:26.387 "claimed": true, 00:23:26.387 "claim_type": "exclusive_write", 00:23:26.387 "zoned": false, 00:23:26.387 "supported_io_types": { 00:23:26.387 "read": true, 00:23:26.387 "write": true, 00:23:26.387 "unmap": true, 00:23:26.387 "flush": true, 00:23:26.387 "reset": true, 00:23:26.387 "nvme_admin": false, 00:23:26.387 "nvme_io": false, 00:23:26.387 "nvme_io_md": false, 00:23:26.387 "write_zeroes": true, 00:23:26.387 "zcopy": true, 00:23:26.387 "get_zone_info": 
false, 00:23:26.387 "zone_management": false, 00:23:26.387 "zone_append": false, 00:23:26.387 "compare": false, 00:23:26.387 "compare_and_write": false, 00:23:26.387 "abort": true, 00:23:26.387 "seek_hole": false, 00:23:26.387 "seek_data": false, 00:23:26.387 "copy": true, 00:23:26.387 "nvme_iov_md": false 00:23:26.387 }, 00:23:26.387 "memory_domains": [ 00:23:26.387 { 00:23:26.387 "dma_device_id": "system", 00:23:26.387 "dma_device_type": 1 00:23:26.387 }, 00:23:26.387 { 00:23:26.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:26.387 "dma_device_type": 2 00:23:26.387 } 00:23:26.387 ], 00:23:26.387 "driver_specific": {} 00:23:26.387 }' 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:26.387 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.645 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.645 06:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:26.645 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.645 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.645 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:26.645 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:26.645 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:26.645 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:26.903 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:26.903 "name": "BaseBdev4", 00:23:26.903 "aliases": [ 00:23:26.903 "e5428415-b96e-4498-81c8-e30618574fd1" 00:23:26.903 ], 00:23:26.903 "product_name": "Malloc disk", 00:23:26.903 "block_size": 512, 00:23:26.903 "num_blocks": 65536, 00:23:26.903 "uuid": "e5428415-b96e-4498-81c8-e30618574fd1", 00:23:26.903 "assigned_rate_limits": { 00:23:26.903 "rw_ios_per_sec": 0, 00:23:26.903 "rw_mbytes_per_sec": 0, 00:23:26.903 "r_mbytes_per_sec": 0, 00:23:26.903 "w_mbytes_per_sec": 0 00:23:26.903 }, 00:23:26.903 "claimed": true, 00:23:26.903 "claim_type": "exclusive_write", 00:23:26.903 "zoned": false, 00:23:26.903 "supported_io_types": { 00:23:26.903 "read": true, 00:23:26.903 "write": true, 00:23:26.903 "unmap": true, 00:23:26.903 "flush": true, 00:23:26.903 "reset": true, 00:23:26.903 "nvme_admin": false, 00:23:26.903 "nvme_io": false, 00:23:26.903 "nvme_io_md": false, 00:23:26.903 "write_zeroes": true, 00:23:26.903 "zcopy": true, 00:23:26.903 "get_zone_info": false, 00:23:26.903 "zone_management": false, 00:23:26.903 "zone_append": false, 00:23:26.903 "compare": false, 00:23:26.903 "compare_and_write": false, 00:23:26.903 "abort": true, 
00:23:26.903 "seek_hole": false, 00:23:26.903 "seek_data": false, 00:23:26.903 "copy": true, 00:23:26.903 "nvme_iov_md": false 00:23:26.903 }, 00:23:26.903 "memory_domains": [ 00:23:26.903 { 00:23:26.903 "dma_device_id": "system", 00:23:26.903 "dma_device_type": 1 00:23:26.903 }, 00:23:26.903 { 00:23:26.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:26.903 "dma_device_type": 2 00:23:26.903 } 00:23:26.903 ], 00:23:26.903 "driver_specific": {} 00:23:26.903 }' 00:23:26.903 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.903 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.903 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:26.903 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:27.161 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:27.419 [2024-07-25 06:39:40.884306] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.419 06:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:27.677 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.677 "name": "Existed_Raid", 00:23:27.677 "uuid": "44a4f753-6301-4e26-af14-9bd0200d6975", 00:23:27.677 "strip_size_kb": 0, 00:23:27.677 "state": "online", 00:23:27.677 "raid_level": "raid1", 00:23:27.677 "superblock": false, 00:23:27.677 "num_base_bdevs": 4, 00:23:27.677 "num_base_bdevs_discovered": 3, 00:23:27.677 "num_base_bdevs_operational": 3, 00:23:27.677 "base_bdevs_list": [ 00:23:27.677 { 00:23:27.677 "name": null, 00:23:27.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.677 "is_configured": false, 00:23:27.677 "data_offset": 0, 00:23:27.677 "data_size": 65536 00:23:27.677 }, 00:23:27.677 { 00:23:27.677 "name": "BaseBdev2", 00:23:27.677 "uuid": "07ada6e3-2eda-40e3-aa56-c441082021b3", 00:23:27.677 "is_configured": true, 00:23:27.677 "data_offset": 0, 00:23:27.677 "data_size": 65536 00:23:27.677 }, 00:23:27.677 { 00:23:27.677 "name": "BaseBdev3", 00:23:27.677 "uuid": "41e38f80-c35a-4bfb-971f-ec72668ac0d7", 00:23:27.677 "is_configured": true, 00:23:27.677 "data_offset": 0, 00:23:27.677 "data_size": 65536 00:23:27.677 }, 00:23:27.677 { 00:23:27.677 "name": "BaseBdev4", 00:23:27.677 "uuid": "e5428415-b96e-4498-81c8-e30618574fd1", 00:23:27.677 "is_configured": true, 00:23:27.677 "data_offset": 0, 00:23:27.677 "data_size": 65536 00:23:27.677 } 00:23:27.677 ] 00:23:27.677 }' 00:23:27.677 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.677 06:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:28.243 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:28.243 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:28.243 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.243 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:28.501 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:28.501 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:28.501 06:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:28.759 [2024-07-25 06:39:42.136584] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:28.759 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:28.759 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:28.759 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:28.759 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.016 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:29.016 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:29.016 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:23:29.274 [2024-07-25 06:39:42.587912] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:29.274 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:29.274 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:29.274 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.274 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:29.533 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:29.533 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:29.533 06:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:23:29.533 [2024-07-25 06:39:43.055246] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:23:29.533 [2024-07-25 06:39:43.055324] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:29.533 [2024-07-25 06:39:43.065460] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:29.533 [2024-07-25 06:39:43.065489] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:29.533 [2024-07-25 06:39:43.065500] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x109c250 name Existed_Raid, state offline 00:23:29.533 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:29.533 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:29.533 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.533 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:29.791 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:29.791 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:29.791 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:23:29.791 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:23:29.791 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:29.791 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
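At this point the previous Existed_Raid has been torn down (the DEBUG lines above show the last base bdev removal driving the raid from online to offline and freeing it), and the test begins re-creating its malloc base bdevs. A minimal sketch of that step, using only the RPC calls visible in this log (32 MB malloc bdevs with a 512-byte block size, i.e. the 65536-block devices dumped throughout), is:

    # create the malloc base bdev (32 MB, 512-byte blocks)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
    # let automatic bdev examination finish before using it
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
    # confirm the bdev is registered (the harness's waitforbdev helper passes a 2000 timeout here)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000

The same create/wait/verify sequence repeats for BaseBdev3 and BaseBdev4 in the output that follows.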
00:23:30.047 BaseBdev2 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:30.047 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:30.305 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:30.563 [ 00:23:30.563 { 00:23:30.563 "name": "BaseBdev2", 00:23:30.563 "aliases": [ 00:23:30.563 "c6e939e6-1efd-4489-84c9-28564d7f0d54" 00:23:30.563 ], 00:23:30.563 "product_name": "Malloc disk", 00:23:30.563 "block_size": 512, 00:23:30.563 "num_blocks": 65536, 00:23:30.563 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:30.563 "assigned_rate_limits": { 00:23:30.563 "rw_ios_per_sec": 0, 00:23:30.563 "rw_mbytes_per_sec": 0, 00:23:30.563 "r_mbytes_per_sec": 0, 00:23:30.563 "w_mbytes_per_sec": 0 00:23:30.563 }, 00:23:30.563 "claimed": false, 00:23:30.563 "zoned": false, 00:23:30.563 "supported_io_types": { 00:23:30.563 "read": true, 00:23:30.563 "write": true, 00:23:30.563 "unmap": true, 00:23:30.563 "flush": true, 00:23:30.563 "reset": true, 00:23:30.563 "nvme_admin": false, 00:23:30.563 "nvme_io": false, 00:23:30.563 "nvme_io_md": false, 00:23:30.563 "write_zeroes": true, 00:23:30.563 "zcopy": true, 00:23:30.563 "get_zone_info": false, 00:23:30.563 "zone_management": false, 00:23:30.563 "zone_append": false, 00:23:30.563 "compare": false, 00:23:30.563 "compare_and_write": false, 00:23:30.563 "abort": true, 00:23:30.563 "seek_hole": false, 00:23:30.563 "seek_data": false, 00:23:30.563 "copy": true, 00:23:30.563 "nvme_iov_md": false 00:23:30.563 }, 00:23:30.563 "memory_domains": [ 00:23:30.563 { 00:23:30.563 "dma_device_id": "system", 00:23:30.563 "dma_device_type": 1 00:23:30.563 }, 00:23:30.563 { 00:23:30.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:30.563 "dma_device_type": 2 00:23:30.563 } 00:23:30.563 ], 00:23:30.563 "driver_specific": {} 00:23:30.563 } 00:23:30.563 ] 00:23:30.563 06:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:30.563 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:30.563 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:30.563 06:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:30.821 BaseBdev3 00:23:30.821 06:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:23:30.821 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:23:30.821 06:39:44 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:30.821 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:30.821 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:30.821 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:30.821 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:31.079 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:31.337 [ 00:23:31.337 { 00:23:31.337 "name": "BaseBdev3", 00:23:31.337 "aliases": [ 00:23:31.337 "219a0fd4-d378-40b5-874f-03fa0347f0a3" 00:23:31.337 ], 00:23:31.337 "product_name": "Malloc disk", 00:23:31.337 "block_size": 512, 00:23:31.337 "num_blocks": 65536, 00:23:31.337 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:31.337 "assigned_rate_limits": { 00:23:31.337 "rw_ios_per_sec": 0, 00:23:31.337 "rw_mbytes_per_sec": 0, 00:23:31.337 "r_mbytes_per_sec": 0, 00:23:31.337 "w_mbytes_per_sec": 0 00:23:31.337 }, 00:23:31.337 "claimed": false, 00:23:31.337 "zoned": false, 00:23:31.337 "supported_io_types": { 00:23:31.337 "read": true, 00:23:31.337 "write": true, 00:23:31.337 "unmap": true, 00:23:31.337 "flush": true, 00:23:31.337 "reset": true, 00:23:31.337 "nvme_admin": false, 00:23:31.337 "nvme_io": false, 00:23:31.337 "nvme_io_md": false, 00:23:31.337 "write_zeroes": true, 00:23:31.337 "zcopy": true, 00:23:31.337 "get_zone_info": false, 00:23:31.337 "zone_management": false, 00:23:31.337 "zone_append": false, 00:23:31.337 "compare": false, 00:23:31.337 "compare_and_write": false, 00:23:31.337 "abort": true, 00:23:31.337 "seek_hole": false, 00:23:31.337 "seek_data": false, 00:23:31.337 "copy": true, 00:23:31.337 "nvme_iov_md": false 00:23:31.337 }, 00:23:31.337 "memory_domains": [ 00:23:31.337 { 00:23:31.337 "dma_device_id": "system", 00:23:31.337 "dma_device_type": 1 00:23:31.337 }, 00:23:31.337 { 00:23:31.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.337 "dma_device_type": 2 00:23:31.337 } 00:23:31.337 ], 00:23:31.337 "driver_specific": {} 00:23:31.337 } 00:23:31.337 ] 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:31.337 BaseBdev4 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:31.337 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
[[ -z '' ]] 00:23:31.338 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:31.338 06:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:31.595 06:39:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:31.853 [ 00:23:31.853 { 00:23:31.853 "name": "BaseBdev4", 00:23:31.853 "aliases": [ 00:23:31.853 "8b3b9850-ecc4-461d-bdee-d7ae85ec6117" 00:23:31.853 ], 00:23:31.853 "product_name": "Malloc disk", 00:23:31.853 "block_size": 512, 00:23:31.853 "num_blocks": 65536, 00:23:31.853 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:31.853 "assigned_rate_limits": { 00:23:31.853 "rw_ios_per_sec": 0, 00:23:31.853 "rw_mbytes_per_sec": 0, 00:23:31.853 "r_mbytes_per_sec": 0, 00:23:31.853 "w_mbytes_per_sec": 0 00:23:31.853 }, 00:23:31.853 "claimed": false, 00:23:31.853 "zoned": false, 00:23:31.853 "supported_io_types": { 00:23:31.853 "read": true, 00:23:31.853 "write": true, 00:23:31.854 "unmap": true, 00:23:31.854 "flush": true, 00:23:31.854 "reset": true, 00:23:31.854 "nvme_admin": false, 00:23:31.854 "nvme_io": false, 00:23:31.854 "nvme_io_md": false, 00:23:31.854 "write_zeroes": true, 00:23:31.854 "zcopy": true, 00:23:31.854 "get_zone_info": false, 00:23:31.854 "zone_management": false, 00:23:31.854 "zone_append": false, 00:23:31.854 "compare": false, 00:23:31.854 "compare_and_write": false, 00:23:31.854 "abort": true, 00:23:31.854 "seek_hole": false, 00:23:31.854 "seek_data": false, 00:23:31.854 "copy": true, 00:23:31.854 "nvme_iov_md": false 00:23:31.854 }, 00:23:31.854 "memory_domains": [ 00:23:31.854 { 00:23:31.854 "dma_device_id": "system", 00:23:31.854 "dma_device_type": 1 00:23:31.854 }, 00:23:31.854 { 00:23:31.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.854 "dma_device_type": 2 00:23:31.854 } 00:23:31.854 ], 00:23:31.854 "driver_specific": {} 00:23:31.854 } 00:23:31.854 ] 00:23:31.854 06:39:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:31.854 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:31.854 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:31.854 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:32.112 [2024-07-25 06:39:45.564635] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:32.112 [2024-07-25 06:39:45.564676] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:32.112 [2024-07-25 06:39:45.564694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:32.112 [2024-07-25 06:39:45.565910] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:32.112 [2024-07-25 06:39:45.565951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:32.112 06:39:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.112 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:32.370 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.370 "name": "Existed_Raid", 00:23:32.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.370 "strip_size_kb": 0, 00:23:32.370 "state": "configuring", 00:23:32.370 "raid_level": "raid1", 00:23:32.370 "superblock": false, 00:23:32.370 "num_base_bdevs": 4, 00:23:32.370 "num_base_bdevs_discovered": 3, 00:23:32.370 "num_base_bdevs_operational": 4, 00:23:32.370 "base_bdevs_list": [ 00:23:32.370 { 00:23:32.370 "name": "BaseBdev1", 00:23:32.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.370 "is_configured": false, 00:23:32.370 "data_offset": 0, 00:23:32.370 "data_size": 0 00:23:32.370 }, 00:23:32.370 { 00:23:32.370 "name": "BaseBdev2", 00:23:32.370 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:32.370 "is_configured": true, 00:23:32.370 "data_offset": 0, 00:23:32.370 "data_size": 65536 00:23:32.370 }, 00:23:32.370 { 00:23:32.370 "name": "BaseBdev3", 00:23:32.370 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:32.370 "is_configured": true, 00:23:32.370 "data_offset": 0, 00:23:32.370 "data_size": 65536 00:23:32.370 }, 00:23:32.370 { 00:23:32.370 "name": "BaseBdev4", 00:23:32.370 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:32.370 "is_configured": true, 00:23:32.370 "data_offset": 0, 00:23:32.370 "data_size": 65536 00:23:32.370 } 00:23:32.370 ] 00:23:32.370 }' 00:23:32.370 06:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.370 06:39:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:32.936 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:33.194 [2024-07-25 06:39:46.603348] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:33.194 
06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.194 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:33.452 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.452 "name": "Existed_Raid", 00:23:33.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.452 "strip_size_kb": 0, 00:23:33.452 "state": "configuring", 00:23:33.452 "raid_level": "raid1", 00:23:33.452 "superblock": false, 00:23:33.452 "num_base_bdevs": 4, 00:23:33.452 "num_base_bdevs_discovered": 2, 00:23:33.452 "num_base_bdevs_operational": 4, 00:23:33.452 "base_bdevs_list": [ 00:23:33.452 { 00:23:33.452 "name": "BaseBdev1", 00:23:33.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.452 "is_configured": false, 00:23:33.452 "data_offset": 0, 00:23:33.452 "data_size": 0 00:23:33.452 }, 00:23:33.452 { 00:23:33.452 "name": null, 00:23:33.452 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:33.452 "is_configured": false, 00:23:33.452 "data_offset": 0, 00:23:33.452 "data_size": 65536 00:23:33.452 }, 00:23:33.452 { 00:23:33.452 "name": "BaseBdev3", 00:23:33.452 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:33.452 "is_configured": true, 00:23:33.452 "data_offset": 0, 00:23:33.452 "data_size": 65536 00:23:33.452 }, 00:23:33.452 { 00:23:33.452 "name": "BaseBdev4", 00:23:33.452 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:33.452 "is_configured": true, 00:23:33.452 "data_offset": 0, 00:23:33.452 "data_size": 65536 00:23:33.452 } 00:23:33.452 ] 00:23:33.452 }' 00:23:33.452 06:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.452 06:39:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:34.017 06:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.017 06:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:34.275 06:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:23:34.275 06:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:34.533 [2024-07-25 06:39:47.861740] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:34.533 BaseBdev1 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:34.533 06:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:34.791 [ 00:23:34.791 { 00:23:34.791 "name": "BaseBdev1", 00:23:34.791 "aliases": [ 00:23:34.791 "08c2af55-da09-432a-9912-fd430210f850" 00:23:34.791 ], 00:23:34.791 "product_name": "Malloc disk", 00:23:34.791 "block_size": 512, 00:23:34.791 "num_blocks": 65536, 00:23:34.791 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:34.791 "assigned_rate_limits": { 00:23:34.791 "rw_ios_per_sec": 0, 00:23:34.791 "rw_mbytes_per_sec": 0, 00:23:34.791 "r_mbytes_per_sec": 0, 00:23:34.791 "w_mbytes_per_sec": 0 00:23:34.791 }, 00:23:34.791 "claimed": true, 00:23:34.791 "claim_type": "exclusive_write", 00:23:34.791 "zoned": false, 00:23:34.791 "supported_io_types": { 00:23:34.791 "read": true, 00:23:34.791 "write": true, 00:23:34.791 "unmap": true, 00:23:34.791 "flush": true, 00:23:34.791 "reset": true, 00:23:34.791 "nvme_admin": false, 00:23:34.791 "nvme_io": false, 00:23:34.791 "nvme_io_md": false, 00:23:34.791 "write_zeroes": true, 00:23:34.791 "zcopy": true, 00:23:34.791 "get_zone_info": false, 00:23:34.791 "zone_management": false, 00:23:34.791 "zone_append": false, 00:23:34.791 "compare": false, 00:23:34.791 "compare_and_write": false, 00:23:34.791 "abort": true, 00:23:34.791 "seek_hole": false, 00:23:34.791 "seek_data": false, 00:23:34.791 "copy": true, 00:23:34.791 "nvme_iov_md": false 00:23:34.791 }, 00:23:34.791 "memory_domains": [ 00:23:34.791 { 00:23:34.791 "dma_device_id": "system", 00:23:34.791 "dma_device_type": 1 00:23:34.791 }, 00:23:34.791 { 00:23:34.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:34.791 "dma_device_type": 2 00:23:34.791 } 00:23:34.791 ], 00:23:34.791 "driver_specific": {} 00:23:34.791 } 00:23:34.791 ] 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:23:34.791 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:34.792 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.792 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.792 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.792 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.792 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.792 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:35.049 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:35.049 "name": "Existed_Raid", 00:23:35.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.049 "strip_size_kb": 0, 00:23:35.049 "state": "configuring", 00:23:35.049 "raid_level": "raid1", 00:23:35.049 "superblock": false, 00:23:35.049 "num_base_bdevs": 4, 00:23:35.049 "num_base_bdevs_discovered": 3, 00:23:35.049 "num_base_bdevs_operational": 4, 00:23:35.049 "base_bdevs_list": [ 00:23:35.049 { 00:23:35.049 "name": "BaseBdev1", 00:23:35.049 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:35.049 "is_configured": true, 00:23:35.049 "data_offset": 0, 00:23:35.050 "data_size": 65536 00:23:35.050 }, 00:23:35.050 { 00:23:35.050 "name": null, 00:23:35.050 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:35.050 "is_configured": false, 00:23:35.050 "data_offset": 0, 00:23:35.050 "data_size": 65536 00:23:35.050 }, 00:23:35.050 { 00:23:35.050 "name": "BaseBdev3", 00:23:35.050 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:35.050 "is_configured": true, 00:23:35.050 "data_offset": 0, 00:23:35.050 "data_size": 65536 00:23:35.050 }, 00:23:35.050 { 00:23:35.050 "name": "BaseBdev4", 00:23:35.050 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:35.050 "is_configured": true, 00:23:35.050 "data_offset": 0, 00:23:35.050 "data_size": 65536 00:23:35.050 } 00:23:35.050 ] 00:23:35.050 }' 00:23:35.050 06:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:35.050 06:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.615 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:35.615 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.873 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:23:35.873 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:23:36.131 [2024-07-25 06:39:49.574292] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.131 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:36.389 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.389 "name": "Existed_Raid", 00:23:36.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.389 "strip_size_kb": 0, 00:23:36.389 "state": "configuring", 00:23:36.389 "raid_level": "raid1", 00:23:36.389 "superblock": false, 00:23:36.389 "num_base_bdevs": 4, 00:23:36.389 "num_base_bdevs_discovered": 2, 00:23:36.389 "num_base_bdevs_operational": 4, 00:23:36.389 "base_bdevs_list": [ 00:23:36.389 { 00:23:36.389 "name": "BaseBdev1", 00:23:36.389 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:36.389 "is_configured": true, 00:23:36.389 "data_offset": 0, 00:23:36.389 "data_size": 65536 00:23:36.389 }, 00:23:36.389 { 00:23:36.389 "name": null, 00:23:36.389 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:36.389 "is_configured": false, 00:23:36.389 "data_offset": 0, 00:23:36.389 "data_size": 65536 00:23:36.389 }, 00:23:36.389 { 00:23:36.389 "name": null, 00:23:36.389 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:36.389 "is_configured": false, 00:23:36.389 "data_offset": 0, 00:23:36.389 "data_size": 65536 00:23:36.389 }, 00:23:36.389 { 00:23:36.389 "name": "BaseBdev4", 00:23:36.389 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:36.389 "is_configured": true, 00:23:36.389 "data_offset": 0, 00:23:36.389 "data_size": 65536 00:23:36.389 } 00:23:36.389 ] 00:23:36.389 }' 00:23:36.389 06:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.389 06:39:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:36.954 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.954 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:37.212 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:23:37.212 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 
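The step above exercises removing a base bdev from the still-configuring raid and adding it back. A minimal sketch of that remove/verify/re-add cycle, built only from the RPC calls and jq filters that appear in this log (the harness uses command substitution rather than a pipe; piped here for brevity), is:

    # drop the third base bdev from the raid
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
    # its slot should now report is_configured == false
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'
    # put BaseBdev3 back into Existed_Raid
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
    # the slot should report is_configured == true again
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'

The DEBUG lines that follow show BaseBdev3 being claimed again, and the raid remains in the "configuring" state because the BaseBdev2 slot is still unconfigured at this point.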
00:23:37.471 [2024-07-25 06:39:50.809591] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.471 06:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:37.787 06:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:37.787 "name": "Existed_Raid", 00:23:37.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.787 "strip_size_kb": 0, 00:23:37.787 "state": "configuring", 00:23:37.787 "raid_level": "raid1", 00:23:37.787 "superblock": false, 00:23:37.787 "num_base_bdevs": 4, 00:23:37.787 "num_base_bdevs_discovered": 3, 00:23:37.787 "num_base_bdevs_operational": 4, 00:23:37.787 "base_bdevs_list": [ 00:23:37.787 { 00:23:37.787 "name": "BaseBdev1", 00:23:37.787 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:37.787 "is_configured": true, 00:23:37.787 "data_offset": 0, 00:23:37.787 "data_size": 65536 00:23:37.787 }, 00:23:37.787 { 00:23:37.787 "name": null, 00:23:37.787 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:37.787 "is_configured": false, 00:23:37.787 "data_offset": 0, 00:23:37.787 "data_size": 65536 00:23:37.787 }, 00:23:37.787 { 00:23:37.787 "name": "BaseBdev3", 00:23:37.787 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:37.787 "is_configured": true, 00:23:37.787 "data_offset": 0, 00:23:37.787 "data_size": 65536 00:23:37.787 }, 00:23:37.787 { 00:23:37.787 "name": "BaseBdev4", 00:23:37.787 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:37.787 "is_configured": true, 00:23:37.787 "data_offset": 0, 00:23:37.787 "data_size": 65536 00:23:37.787 } 00:23:37.787 ] 00:23:37.787 }' 00:23:37.787 06:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:37.787 06:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:38.353 06:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.353 06:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:23:38.353 06:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:23:38.353 06:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:38.611 [2024-07-25 06:39:52.072916] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.612 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:38.870 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.870 "name": "Existed_Raid", 00:23:38.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.870 "strip_size_kb": 0, 00:23:38.870 "state": "configuring", 00:23:38.870 "raid_level": "raid1", 00:23:38.870 "superblock": false, 00:23:38.870 "num_base_bdevs": 4, 00:23:38.870 "num_base_bdevs_discovered": 2, 00:23:38.870 "num_base_bdevs_operational": 4, 00:23:38.870 "base_bdevs_list": [ 00:23:38.870 { 00:23:38.870 "name": null, 00:23:38.870 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:38.870 "is_configured": false, 00:23:38.870 "data_offset": 0, 00:23:38.870 "data_size": 65536 00:23:38.870 }, 00:23:38.870 { 00:23:38.870 "name": null, 00:23:38.870 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:38.870 "is_configured": false, 00:23:38.870 "data_offset": 0, 00:23:38.870 "data_size": 65536 00:23:38.870 }, 00:23:38.870 { 00:23:38.870 "name": "BaseBdev3", 00:23:38.870 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:38.870 "is_configured": true, 00:23:38.870 "data_offset": 0, 00:23:38.870 "data_size": 65536 00:23:38.870 }, 00:23:38.870 { 00:23:38.870 "name": "BaseBdev4", 00:23:38.870 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:38.870 "is_configured": true, 00:23:38.870 "data_offset": 0, 00:23:38.870 "data_size": 65536 00:23:38.870 } 00:23:38.870 ] 00:23:38.870 }' 00:23:38.870 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.870 06:39:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:23:39.434 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.434 06:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:39.691 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:23:39.691 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:23:39.948 [2024-07-25 06:39:53.277990] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.948 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:40.217 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.217 "name": "Existed_Raid", 00:23:40.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.217 "strip_size_kb": 0, 00:23:40.217 "state": "configuring", 00:23:40.217 "raid_level": "raid1", 00:23:40.217 "superblock": false, 00:23:40.217 "num_base_bdevs": 4, 00:23:40.217 "num_base_bdevs_discovered": 3, 00:23:40.217 "num_base_bdevs_operational": 4, 00:23:40.217 "base_bdevs_list": [ 00:23:40.217 { 00:23:40.217 "name": null, 00:23:40.217 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:40.217 "is_configured": false, 00:23:40.217 "data_offset": 0, 00:23:40.217 "data_size": 65536 00:23:40.217 }, 00:23:40.217 { 00:23:40.217 "name": "BaseBdev2", 00:23:40.217 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:40.217 "is_configured": true, 00:23:40.217 "data_offset": 0, 00:23:40.217 "data_size": 65536 00:23:40.217 }, 00:23:40.217 { 00:23:40.217 "name": "BaseBdev3", 00:23:40.217 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:40.217 "is_configured": true, 00:23:40.217 "data_offset": 0, 00:23:40.217 "data_size": 65536 00:23:40.217 }, 00:23:40.217 { 00:23:40.217 "name": "BaseBdev4", 
00:23:40.217 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:40.217 "is_configured": true, 00:23:40.217 "data_offset": 0, 00:23:40.217 "data_size": 65536 00:23:40.217 } 00:23:40.217 ] 00:23:40.217 }' 00:23:40.217 06:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.217 06:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:40.782 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.782 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:40.782 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:40.782 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.782 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:41.038 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 08c2af55-da09-432a-9912-fd430210f850 00:23:41.296 [2024-07-25 06:39:54.777044] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:41.296 [2024-07-25 06:39:54.777083] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x108d8a0 00:23:41.296 [2024-07-25 06:39:54.777091] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:41.296 [2024-07-25 06:39:54.777285] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10946e0 00:23:41.296 [2024-07-25 06:39:54.777407] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x108d8a0 00:23:41.296 [2024-07-25 06:39:54.777416] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x108d8a0 00:23:41.296 [2024-07-25 06:39:54.777571] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:41.296 NewBaseBdev 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:41.296 06:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:41.554 06:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:41.811 [ 00:23:41.811 { 00:23:41.811 "name": "NewBaseBdev", 00:23:41.811 "aliases": [ 00:23:41.811 
"08c2af55-da09-432a-9912-fd430210f850" 00:23:41.811 ], 00:23:41.811 "product_name": "Malloc disk", 00:23:41.811 "block_size": 512, 00:23:41.811 "num_blocks": 65536, 00:23:41.811 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:41.811 "assigned_rate_limits": { 00:23:41.811 "rw_ios_per_sec": 0, 00:23:41.811 "rw_mbytes_per_sec": 0, 00:23:41.811 "r_mbytes_per_sec": 0, 00:23:41.812 "w_mbytes_per_sec": 0 00:23:41.812 }, 00:23:41.812 "claimed": true, 00:23:41.812 "claim_type": "exclusive_write", 00:23:41.812 "zoned": false, 00:23:41.812 "supported_io_types": { 00:23:41.812 "read": true, 00:23:41.812 "write": true, 00:23:41.812 "unmap": true, 00:23:41.812 "flush": true, 00:23:41.812 "reset": true, 00:23:41.812 "nvme_admin": false, 00:23:41.812 "nvme_io": false, 00:23:41.812 "nvme_io_md": false, 00:23:41.812 "write_zeroes": true, 00:23:41.812 "zcopy": true, 00:23:41.812 "get_zone_info": false, 00:23:41.812 "zone_management": false, 00:23:41.812 "zone_append": false, 00:23:41.812 "compare": false, 00:23:41.812 "compare_and_write": false, 00:23:41.812 "abort": true, 00:23:41.812 "seek_hole": false, 00:23:41.812 "seek_data": false, 00:23:41.812 "copy": true, 00:23:41.812 "nvme_iov_md": false 00:23:41.812 }, 00:23:41.812 "memory_domains": [ 00:23:41.812 { 00:23:41.812 "dma_device_id": "system", 00:23:41.812 "dma_device_type": 1 00:23:41.812 }, 00:23:41.812 { 00:23:41.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.812 "dma_device_type": 2 00:23:41.812 } 00:23:41.812 ], 00:23:41.812 "driver_specific": {} 00:23:41.812 } 00:23:41.812 ] 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:41.812 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.069 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:42.069 "name": "Existed_Raid", 00:23:42.069 "uuid": "679070c4-f3df-44ec-a2c2-dd286dd8d9b4", 00:23:42.069 "strip_size_kb": 0, 00:23:42.069 "state": "online", 00:23:42.069 "raid_level": "raid1", 00:23:42.069 "superblock": false, 00:23:42.069 "num_base_bdevs": 4, 00:23:42.069 
"num_base_bdevs_discovered": 4, 00:23:42.069 "num_base_bdevs_operational": 4, 00:23:42.070 "base_bdevs_list": [ 00:23:42.070 { 00:23:42.070 "name": "NewBaseBdev", 00:23:42.070 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:42.070 "is_configured": true, 00:23:42.070 "data_offset": 0, 00:23:42.070 "data_size": 65536 00:23:42.070 }, 00:23:42.070 { 00:23:42.070 "name": "BaseBdev2", 00:23:42.070 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:42.070 "is_configured": true, 00:23:42.070 "data_offset": 0, 00:23:42.070 "data_size": 65536 00:23:42.070 }, 00:23:42.070 { 00:23:42.070 "name": "BaseBdev3", 00:23:42.070 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:42.070 "is_configured": true, 00:23:42.070 "data_offset": 0, 00:23:42.070 "data_size": 65536 00:23:42.070 }, 00:23:42.070 { 00:23:42.070 "name": "BaseBdev4", 00:23:42.070 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:42.070 "is_configured": true, 00:23:42.070 "data_offset": 0, 00:23:42.070 "data_size": 65536 00:23:42.070 } 00:23:42.070 ] 00:23:42.070 }' 00:23:42.070 06:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:42.070 06:39:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:42.635 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:42.893 [2024-07-25 06:39:56.277315] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:42.893 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:42.893 "name": "Existed_Raid", 00:23:42.893 "aliases": [ 00:23:42.893 "679070c4-f3df-44ec-a2c2-dd286dd8d9b4" 00:23:42.893 ], 00:23:42.893 "product_name": "Raid Volume", 00:23:42.893 "block_size": 512, 00:23:42.893 "num_blocks": 65536, 00:23:42.893 "uuid": "679070c4-f3df-44ec-a2c2-dd286dd8d9b4", 00:23:42.893 "assigned_rate_limits": { 00:23:42.893 "rw_ios_per_sec": 0, 00:23:42.893 "rw_mbytes_per_sec": 0, 00:23:42.893 "r_mbytes_per_sec": 0, 00:23:42.893 "w_mbytes_per_sec": 0 00:23:42.893 }, 00:23:42.893 "claimed": false, 00:23:42.893 "zoned": false, 00:23:42.893 "supported_io_types": { 00:23:42.893 "read": true, 00:23:42.893 "write": true, 00:23:42.893 "unmap": false, 00:23:42.893 "flush": false, 00:23:42.893 "reset": true, 00:23:42.893 "nvme_admin": false, 00:23:42.893 "nvme_io": false, 00:23:42.893 "nvme_io_md": false, 00:23:42.893 "write_zeroes": true, 00:23:42.893 "zcopy": false, 00:23:42.893 "get_zone_info": false, 00:23:42.893 "zone_management": false, 00:23:42.893 "zone_append": false, 00:23:42.893 "compare": false, 00:23:42.893 "compare_and_write": false, 00:23:42.893 "abort": 
false, 00:23:42.893 "seek_hole": false, 00:23:42.893 "seek_data": false, 00:23:42.893 "copy": false, 00:23:42.893 "nvme_iov_md": false 00:23:42.893 }, 00:23:42.893 "memory_domains": [ 00:23:42.893 { 00:23:42.893 "dma_device_id": "system", 00:23:42.893 "dma_device_type": 1 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.893 "dma_device_type": 2 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "system", 00:23:42.893 "dma_device_type": 1 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.893 "dma_device_type": 2 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "system", 00:23:42.893 "dma_device_type": 1 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.893 "dma_device_type": 2 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "system", 00:23:42.893 "dma_device_type": 1 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.893 "dma_device_type": 2 00:23:42.893 } 00:23:42.893 ], 00:23:42.893 "driver_specific": { 00:23:42.893 "raid": { 00:23:42.893 "uuid": "679070c4-f3df-44ec-a2c2-dd286dd8d9b4", 00:23:42.893 "strip_size_kb": 0, 00:23:42.893 "state": "online", 00:23:42.893 "raid_level": "raid1", 00:23:42.893 "superblock": false, 00:23:42.893 "num_base_bdevs": 4, 00:23:42.893 "num_base_bdevs_discovered": 4, 00:23:42.893 "num_base_bdevs_operational": 4, 00:23:42.893 "base_bdevs_list": [ 00:23:42.893 { 00:23:42.893 "name": "NewBaseBdev", 00:23:42.893 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:42.893 "is_configured": true, 00:23:42.893 "data_offset": 0, 00:23:42.893 "data_size": 65536 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "name": "BaseBdev2", 00:23:42.893 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:42.893 "is_configured": true, 00:23:42.893 "data_offset": 0, 00:23:42.893 "data_size": 65536 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "name": "BaseBdev3", 00:23:42.893 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:42.893 "is_configured": true, 00:23:42.893 "data_offset": 0, 00:23:42.893 "data_size": 65536 00:23:42.893 }, 00:23:42.893 { 00:23:42.893 "name": "BaseBdev4", 00:23:42.893 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:42.893 "is_configured": true, 00:23:42.893 "data_offset": 0, 00:23:42.893 "data_size": 65536 00:23:42.893 } 00:23:42.893 ] 00:23:42.893 } 00:23:42.893 } 00:23:42.893 }' 00:23:42.893 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:42.893 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:42.893 BaseBdev2 00:23:42.893 BaseBdev3 00:23:42.893 BaseBdev4' 00:23:42.893 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:42.893 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:23:42.893 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:43.151 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:43.151 "name": "NewBaseBdev", 00:23:43.151 "aliases": [ 00:23:43.151 "08c2af55-da09-432a-9912-fd430210f850" 00:23:43.151 ], 00:23:43.151 "product_name": "Malloc disk", 00:23:43.151 
"block_size": 512, 00:23:43.151 "num_blocks": 65536, 00:23:43.151 "uuid": "08c2af55-da09-432a-9912-fd430210f850", 00:23:43.151 "assigned_rate_limits": { 00:23:43.151 "rw_ios_per_sec": 0, 00:23:43.151 "rw_mbytes_per_sec": 0, 00:23:43.151 "r_mbytes_per_sec": 0, 00:23:43.151 "w_mbytes_per_sec": 0 00:23:43.151 }, 00:23:43.151 "claimed": true, 00:23:43.151 "claim_type": "exclusive_write", 00:23:43.151 "zoned": false, 00:23:43.151 "supported_io_types": { 00:23:43.151 "read": true, 00:23:43.151 "write": true, 00:23:43.151 "unmap": true, 00:23:43.151 "flush": true, 00:23:43.151 "reset": true, 00:23:43.151 "nvme_admin": false, 00:23:43.151 "nvme_io": false, 00:23:43.151 "nvme_io_md": false, 00:23:43.151 "write_zeroes": true, 00:23:43.151 "zcopy": true, 00:23:43.151 "get_zone_info": false, 00:23:43.151 "zone_management": false, 00:23:43.151 "zone_append": false, 00:23:43.151 "compare": false, 00:23:43.151 "compare_and_write": false, 00:23:43.151 "abort": true, 00:23:43.151 "seek_hole": false, 00:23:43.151 "seek_data": false, 00:23:43.151 "copy": true, 00:23:43.151 "nvme_iov_md": false 00:23:43.151 }, 00:23:43.151 "memory_domains": [ 00:23:43.151 { 00:23:43.151 "dma_device_id": "system", 00:23:43.151 "dma_device_type": 1 00:23:43.151 }, 00:23:43.151 { 00:23:43.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:43.151 "dma_device_type": 2 00:23:43.151 } 00:23:43.151 ], 00:23:43.151 "driver_specific": {} 00:23:43.151 }' 00:23:43.151 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.151 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.151 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:43.151 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:43.409 06:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:43.666 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:43.666 "name": "BaseBdev2", 00:23:43.666 "aliases": [ 00:23:43.666 "c6e939e6-1efd-4489-84c9-28564d7f0d54" 00:23:43.666 ], 00:23:43.666 "product_name": "Malloc disk", 00:23:43.666 "block_size": 512, 00:23:43.666 "num_blocks": 65536, 00:23:43.666 "uuid": "c6e939e6-1efd-4489-84c9-28564d7f0d54", 00:23:43.666 "assigned_rate_limits": { 00:23:43.666 
"rw_ios_per_sec": 0, 00:23:43.666 "rw_mbytes_per_sec": 0, 00:23:43.666 "r_mbytes_per_sec": 0, 00:23:43.666 "w_mbytes_per_sec": 0 00:23:43.666 }, 00:23:43.666 "claimed": true, 00:23:43.666 "claim_type": "exclusive_write", 00:23:43.666 "zoned": false, 00:23:43.666 "supported_io_types": { 00:23:43.666 "read": true, 00:23:43.666 "write": true, 00:23:43.666 "unmap": true, 00:23:43.666 "flush": true, 00:23:43.666 "reset": true, 00:23:43.666 "nvme_admin": false, 00:23:43.666 "nvme_io": false, 00:23:43.666 "nvme_io_md": false, 00:23:43.666 "write_zeroes": true, 00:23:43.666 "zcopy": true, 00:23:43.666 "get_zone_info": false, 00:23:43.666 "zone_management": false, 00:23:43.666 "zone_append": false, 00:23:43.666 "compare": false, 00:23:43.666 "compare_and_write": false, 00:23:43.666 "abort": true, 00:23:43.666 "seek_hole": false, 00:23:43.666 "seek_data": false, 00:23:43.666 "copy": true, 00:23:43.666 "nvme_iov_md": false 00:23:43.666 }, 00:23:43.666 "memory_domains": [ 00:23:43.666 { 00:23:43.666 "dma_device_id": "system", 00:23:43.666 "dma_device_type": 1 00:23:43.666 }, 00:23:43.666 { 00:23:43.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:43.666 "dma_device_type": 2 00:23:43.666 } 00:23:43.666 ], 00:23:43.666 "driver_specific": {} 00:23:43.666 }' 00:23:43.666 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.666 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.666 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:43.666 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.924 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.924 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:43.924 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.924 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.924 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:43.924 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.182 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.182 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:44.182 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:44.182 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:44.182 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:44.440 "name": "BaseBdev3", 00:23:44.440 "aliases": [ 00:23:44.440 "219a0fd4-d378-40b5-874f-03fa0347f0a3" 00:23:44.440 ], 00:23:44.440 "product_name": "Malloc disk", 00:23:44.440 "block_size": 512, 00:23:44.440 "num_blocks": 65536, 00:23:44.440 "uuid": "219a0fd4-d378-40b5-874f-03fa0347f0a3", 00:23:44.440 "assigned_rate_limits": { 00:23:44.440 "rw_ios_per_sec": 0, 00:23:44.440 "rw_mbytes_per_sec": 0, 00:23:44.440 "r_mbytes_per_sec": 0, 00:23:44.440 "w_mbytes_per_sec": 0 00:23:44.440 }, 00:23:44.440 "claimed": true, 
00:23:44.440 "claim_type": "exclusive_write", 00:23:44.440 "zoned": false, 00:23:44.440 "supported_io_types": { 00:23:44.440 "read": true, 00:23:44.440 "write": true, 00:23:44.440 "unmap": true, 00:23:44.440 "flush": true, 00:23:44.440 "reset": true, 00:23:44.440 "nvme_admin": false, 00:23:44.440 "nvme_io": false, 00:23:44.440 "nvme_io_md": false, 00:23:44.440 "write_zeroes": true, 00:23:44.440 "zcopy": true, 00:23:44.440 "get_zone_info": false, 00:23:44.440 "zone_management": false, 00:23:44.440 "zone_append": false, 00:23:44.440 "compare": false, 00:23:44.440 "compare_and_write": false, 00:23:44.440 "abort": true, 00:23:44.440 "seek_hole": false, 00:23:44.440 "seek_data": false, 00:23:44.440 "copy": true, 00:23:44.440 "nvme_iov_md": false 00:23:44.440 }, 00:23:44.440 "memory_domains": [ 00:23:44.440 { 00:23:44.440 "dma_device_id": "system", 00:23:44.440 "dma_device_type": 1 00:23:44.440 }, 00:23:44.440 { 00:23:44.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:44.440 "dma_device_type": 2 00:23:44.440 } 00:23:44.440 ], 00:23:44.440 "driver_specific": {} 00:23:44.440 }' 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:44.440 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:44.698 06:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.698 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.698 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:44.698 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:44.698 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:44.698 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:44.956 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:44.956 "name": "BaseBdev4", 00:23:44.956 "aliases": [ 00:23:44.956 "8b3b9850-ecc4-461d-bdee-d7ae85ec6117" 00:23:44.956 ], 00:23:44.956 "product_name": "Malloc disk", 00:23:44.956 "block_size": 512, 00:23:44.956 "num_blocks": 65536, 00:23:44.956 "uuid": "8b3b9850-ecc4-461d-bdee-d7ae85ec6117", 00:23:44.956 "assigned_rate_limits": { 00:23:44.957 "rw_ios_per_sec": 0, 00:23:44.957 "rw_mbytes_per_sec": 0, 00:23:44.957 "r_mbytes_per_sec": 0, 00:23:44.957 "w_mbytes_per_sec": 0 00:23:44.957 }, 00:23:44.957 "claimed": true, 00:23:44.957 "claim_type": "exclusive_write", 00:23:44.957 "zoned": false, 00:23:44.957 "supported_io_types": { 00:23:44.957 "read": true, 00:23:44.957 "write": true, 00:23:44.957 
"unmap": true, 00:23:44.957 "flush": true, 00:23:44.957 "reset": true, 00:23:44.957 "nvme_admin": false, 00:23:44.957 "nvme_io": false, 00:23:44.957 "nvme_io_md": false, 00:23:44.957 "write_zeroes": true, 00:23:44.957 "zcopy": true, 00:23:44.957 "get_zone_info": false, 00:23:44.957 "zone_management": false, 00:23:44.957 "zone_append": false, 00:23:44.957 "compare": false, 00:23:44.957 "compare_and_write": false, 00:23:44.957 "abort": true, 00:23:44.957 "seek_hole": false, 00:23:44.957 "seek_data": false, 00:23:44.957 "copy": true, 00:23:44.957 "nvme_iov_md": false 00:23:44.957 }, 00:23:44.957 "memory_domains": [ 00:23:44.957 { 00:23:44.957 "dma_device_id": "system", 00:23:44.957 "dma_device_type": 1 00:23:44.957 }, 00:23:44.957 { 00:23:44.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:44.957 "dma_device_type": 2 00:23:44.957 } 00:23:44.957 ], 00:23:44.957 "driver_specific": {} 00:23:44.957 }' 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:44.957 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.215 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.215 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:45.215 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:45.215 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:45.215 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:45.215 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:45.473 [2024-07-25 06:39:58.863855] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:45.473 [2024-07-25 06:39:58.863880] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:45.473 [2024-07-25 06:39:58.863933] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:45.473 [2024-07-25 06:39:58.864178] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:45.473 [2024-07-25 06:39:58.864190] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x108d8a0 name Existed_Raid, state offline 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1202529 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1202529 ']' 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1202529 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # 
'[' Linux = Linux ']' 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1202529 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1202529' 00:23:45.473 killing process with pid 1202529 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1202529 00:23:45.473 [2024-07-25 06:39:58.943834] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:45.473 06:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1202529 00:23:45.473 [2024-07-25 06:39:58.975947] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:23:45.733 00:23:45.733 real 0m30.660s 00:23:45.733 user 0m56.123s 00:23:45.733 sys 0m5.700s 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:45.733 ************************************ 00:23:45.733 END TEST raid_state_function_test 00:23:45.733 ************************************ 00:23:45.733 06:39:59 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:23:45.733 06:39:59 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:45.733 06:39:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:45.733 06:39:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:45.733 ************************************ 00:23:45.733 START TEST raid_state_function_test_sb 00:23:45.733 ************************************ 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 true 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:45.733 06:39:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1208232 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1208232' 00:23:45.733 Process raid pid: 1208232 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1208232 /var/tmp/spdk-raid.sock 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1208232 ']' 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:45.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:45.733 06:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:45.992 [2024-07-25 06:39:59.314314] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:23:45.992 [2024-07-25 06:39:59.314378] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.0 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.1 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.2 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.3 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.4 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.5 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.6 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:01.7 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.0 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.1 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.2 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.3 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.4 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.5 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.6 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3d:02.7 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.0 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.1 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.2 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.3 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.4 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.5 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.6 cannot be used 00:23:45.992 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:01.7 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.0 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.1 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.2 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.3 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.4 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.5 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.6 cannot be used 00:23:45.992 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:23:45.992 EAL: Requested device 0000:3f:02.7 cannot be used 00:23:45.992 [2024-07-25 06:39:59.451668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:45.992 [2024-07-25 06:39:59.494587] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:46.250 [2024-07-25 06:39:59.552629] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:46.250 [2024-07-25 06:39:59.552662] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:46.816 06:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:46.816 06:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:23:46.816 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:47.074 [2024-07-25 06:40:00.377105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:47.074 [2024-07-25 06:40:00.377153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:47.074 [2024-07-25 06:40:00.377165] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:47.074 [2024-07-25 06:40:00.377176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:47.074 [2024-07-25 06:40:00.377184] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:47.074 [2024-07-25 06:40:00.377194] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:47.074 [2024-07-25 06:40:00.377202] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:47.074 [2024-07-25 06:40:00.377212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:47.074 06:40:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.074 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:47.333 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.333 "name": "Existed_Raid", 00:23:47.333 "uuid": "416345de-ac08-42b5-8578-e434ae037636", 00:23:47.333 "strip_size_kb": 0, 00:23:47.333 "state": "configuring", 00:23:47.333 "raid_level": "raid1", 00:23:47.333 "superblock": true, 00:23:47.333 "num_base_bdevs": 4, 00:23:47.333 "num_base_bdevs_discovered": 0, 00:23:47.333 "num_base_bdevs_operational": 4, 00:23:47.333 "base_bdevs_list": [ 00:23:47.333 { 00:23:47.333 "name": "BaseBdev1", 00:23:47.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.333 "is_configured": false, 00:23:47.333 "data_offset": 0, 00:23:47.333 "data_size": 0 00:23:47.333 }, 00:23:47.333 { 00:23:47.333 "name": "BaseBdev2", 00:23:47.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.333 "is_configured": false, 00:23:47.333 "data_offset": 0, 00:23:47.333 "data_size": 0 00:23:47.333 }, 00:23:47.333 { 00:23:47.333 "name": "BaseBdev3", 00:23:47.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.333 "is_configured": false, 00:23:47.333 "data_offset": 0, 00:23:47.333 "data_size": 0 00:23:47.333 }, 00:23:47.333 { 00:23:47.333 "name": "BaseBdev4", 00:23:47.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.333 "is_configured": false, 00:23:47.333 "data_offset": 0, 00:23:47.333 "data_size": 0 00:23:47.333 } 00:23:47.333 ] 00:23:47.333 }' 00:23:47.333 06:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.333 06:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:47.899 06:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:47.899 [2024-07-25 06:40:01.411692] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:47.899 [2024-07-25 06:40:01.411722] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b32470 name Existed_Raid, state configuring 00:23:47.899 06:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 
'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:48.156 [2024-07-25 06:40:01.640314] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:48.157 [2024-07-25 06:40:01.640349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:48.157 [2024-07-25 06:40:01.640359] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:48.157 [2024-07-25 06:40:01.640370] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:48.157 [2024-07-25 06:40:01.640378] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:48.157 [2024-07-25 06:40:01.640388] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:48.157 [2024-07-25 06:40:01.640396] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:48.157 [2024-07-25 06:40:01.640406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:48.157 06:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:48.415 [2024-07-25 06:40:01.878411] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:48.415 BaseBdev1 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:48.415 06:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:48.672 06:40:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:48.930 [ 00:23:48.930 { 00:23:48.930 "name": "BaseBdev1", 00:23:48.930 "aliases": [ 00:23:48.930 "662b3ce3-b68f-4d60-8a56-e9219218c6c6" 00:23:48.930 ], 00:23:48.930 "product_name": "Malloc disk", 00:23:48.930 "block_size": 512, 00:23:48.930 "num_blocks": 65536, 00:23:48.930 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:48.930 "assigned_rate_limits": { 00:23:48.930 "rw_ios_per_sec": 0, 00:23:48.930 "rw_mbytes_per_sec": 0, 00:23:48.930 "r_mbytes_per_sec": 0, 00:23:48.930 "w_mbytes_per_sec": 0 00:23:48.930 }, 00:23:48.930 "claimed": true, 00:23:48.930 "claim_type": "exclusive_write", 00:23:48.930 "zoned": false, 00:23:48.930 "supported_io_types": { 00:23:48.930 "read": true, 00:23:48.930 "write": true, 00:23:48.930 "unmap": true, 00:23:48.930 "flush": true, 00:23:48.930 "reset": true, 00:23:48.930 "nvme_admin": false, 00:23:48.930 "nvme_io": false, 00:23:48.930 "nvme_io_md": false, 00:23:48.930 "write_zeroes": true, 00:23:48.930 
"zcopy": true, 00:23:48.930 "get_zone_info": false, 00:23:48.930 "zone_management": false, 00:23:48.930 "zone_append": false, 00:23:48.930 "compare": false, 00:23:48.930 "compare_and_write": false, 00:23:48.930 "abort": true, 00:23:48.930 "seek_hole": false, 00:23:48.930 "seek_data": false, 00:23:48.930 "copy": true, 00:23:48.930 "nvme_iov_md": false 00:23:48.930 }, 00:23:48.930 "memory_domains": [ 00:23:48.930 { 00:23:48.930 "dma_device_id": "system", 00:23:48.930 "dma_device_type": 1 00:23:48.930 }, 00:23:48.930 { 00:23:48.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.930 "dma_device_type": 2 00:23:48.930 } 00:23:48.930 ], 00:23:48.930 "driver_specific": {} 00:23:48.930 } 00:23:48.930 ] 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.930 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:49.188 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.188 "name": "Existed_Raid", 00:23:49.188 "uuid": "ab140f5e-07c8-4a27-a81f-511d5ed1eb74", 00:23:49.188 "strip_size_kb": 0, 00:23:49.188 "state": "configuring", 00:23:49.188 "raid_level": "raid1", 00:23:49.188 "superblock": true, 00:23:49.188 "num_base_bdevs": 4, 00:23:49.188 "num_base_bdevs_discovered": 1, 00:23:49.188 "num_base_bdevs_operational": 4, 00:23:49.188 "base_bdevs_list": [ 00:23:49.188 { 00:23:49.188 "name": "BaseBdev1", 00:23:49.188 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:49.188 "is_configured": true, 00:23:49.188 "data_offset": 2048, 00:23:49.188 "data_size": 63488 00:23:49.188 }, 00:23:49.188 { 00:23:49.188 "name": "BaseBdev2", 00:23:49.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.188 "is_configured": false, 00:23:49.188 "data_offset": 0, 00:23:49.188 "data_size": 0 00:23:49.188 }, 00:23:49.188 { 00:23:49.188 "name": "BaseBdev3", 00:23:49.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.188 "is_configured": false, 00:23:49.188 "data_offset": 0, 00:23:49.188 "data_size": 0 00:23:49.188 }, 00:23:49.188 { 00:23:49.188 "name": 
"BaseBdev4", 00:23:49.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.188 "is_configured": false, 00:23:49.188 "data_offset": 0, 00:23:49.188 "data_size": 0 00:23:49.188 } 00:23:49.188 ] 00:23:49.188 }' 00:23:49.188 06:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.188 06:40:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:49.754 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:50.015 [2024-07-25 06:40:03.370330] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:50.015 [2024-07-25 06:40:03.370373] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b31ce0 name Existed_Raid, state configuring 00:23:50.015 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:50.273 [2024-07-25 06:40:03.598972] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:50.273 [2024-07-25 06:40:03.600356] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:50.273 [2024-07-25 06:40:03.600390] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:50.273 [2024-07-25 06:40:03.600400] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:50.273 [2024-07-25 06:40:03.600410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:50.273 [2024-07-25 06:40:03.600418] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:50.273 [2024-07-25 06:40:03.600428] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.273 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:50.564 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.564 "name": "Existed_Raid", 00:23:50.564 "uuid": "8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:50.564 "strip_size_kb": 0, 00:23:50.564 "state": "configuring", 00:23:50.564 "raid_level": "raid1", 00:23:50.564 "superblock": true, 00:23:50.564 "num_base_bdevs": 4, 00:23:50.564 "num_base_bdevs_discovered": 1, 00:23:50.564 "num_base_bdevs_operational": 4, 00:23:50.564 "base_bdevs_list": [ 00:23:50.564 { 00:23:50.564 "name": "BaseBdev1", 00:23:50.564 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:50.564 "is_configured": true, 00:23:50.564 "data_offset": 2048, 00:23:50.564 "data_size": 63488 00:23:50.564 }, 00:23:50.564 { 00:23:50.564 "name": "BaseBdev2", 00:23:50.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.564 "is_configured": false, 00:23:50.564 "data_offset": 0, 00:23:50.564 "data_size": 0 00:23:50.564 }, 00:23:50.564 { 00:23:50.564 "name": "BaseBdev3", 00:23:50.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.564 "is_configured": false, 00:23:50.564 "data_offset": 0, 00:23:50.564 "data_size": 0 00:23:50.564 }, 00:23:50.564 { 00:23:50.564 "name": "BaseBdev4", 00:23:50.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.564 "is_configured": false, 00:23:50.564 "data_offset": 0, 00:23:50.564 "data_size": 0 00:23:50.564 } 00:23:50.564 ] 00:23:50.564 }' 00:23:50.564 06:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.564 06:40:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:51.129 [2024-07-25 06:40:04.624756] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:51.129 BaseBdev2 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:51.129 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:51.386 06:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:51.643 [ 00:23:51.643 { 00:23:51.643 "name": "BaseBdev2", 00:23:51.643 "aliases": [ 00:23:51.643 "f428c629-6713-42d0-adc8-23f119fe0123" 
00:23:51.643 ], 00:23:51.643 "product_name": "Malloc disk", 00:23:51.643 "block_size": 512, 00:23:51.643 "num_blocks": 65536, 00:23:51.643 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:51.643 "assigned_rate_limits": { 00:23:51.643 "rw_ios_per_sec": 0, 00:23:51.643 "rw_mbytes_per_sec": 0, 00:23:51.643 "r_mbytes_per_sec": 0, 00:23:51.643 "w_mbytes_per_sec": 0 00:23:51.643 }, 00:23:51.643 "claimed": true, 00:23:51.643 "claim_type": "exclusive_write", 00:23:51.643 "zoned": false, 00:23:51.643 "supported_io_types": { 00:23:51.643 "read": true, 00:23:51.643 "write": true, 00:23:51.643 "unmap": true, 00:23:51.643 "flush": true, 00:23:51.643 "reset": true, 00:23:51.643 "nvme_admin": false, 00:23:51.643 "nvme_io": false, 00:23:51.643 "nvme_io_md": false, 00:23:51.643 "write_zeroes": true, 00:23:51.643 "zcopy": true, 00:23:51.643 "get_zone_info": false, 00:23:51.643 "zone_management": false, 00:23:51.643 "zone_append": false, 00:23:51.643 "compare": false, 00:23:51.643 "compare_and_write": false, 00:23:51.643 "abort": true, 00:23:51.643 "seek_hole": false, 00:23:51.643 "seek_data": false, 00:23:51.643 "copy": true, 00:23:51.643 "nvme_iov_md": false 00:23:51.643 }, 00:23:51.643 "memory_domains": [ 00:23:51.643 { 00:23:51.643 "dma_device_id": "system", 00:23:51.643 "dma_device_type": 1 00:23:51.643 }, 00:23:51.643 { 00:23:51.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:51.643 "dma_device_type": 2 00:23:51.643 } 00:23:51.643 ], 00:23:51.643 "driver_specific": {} 00:23:51.643 } 00:23:51.643 ] 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:51.643 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.644 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:51.901 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:51.901 "name": "Existed_Raid", 00:23:51.901 "uuid": 
"8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:51.901 "strip_size_kb": 0, 00:23:51.901 "state": "configuring", 00:23:51.901 "raid_level": "raid1", 00:23:51.901 "superblock": true, 00:23:51.901 "num_base_bdevs": 4, 00:23:51.901 "num_base_bdevs_discovered": 2, 00:23:51.901 "num_base_bdevs_operational": 4, 00:23:51.901 "base_bdevs_list": [ 00:23:51.901 { 00:23:51.901 "name": "BaseBdev1", 00:23:51.901 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:51.901 "is_configured": true, 00:23:51.901 "data_offset": 2048, 00:23:51.901 "data_size": 63488 00:23:51.901 }, 00:23:51.901 { 00:23:51.901 "name": "BaseBdev2", 00:23:51.901 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:51.901 "is_configured": true, 00:23:51.901 "data_offset": 2048, 00:23:51.901 "data_size": 63488 00:23:51.901 }, 00:23:51.901 { 00:23:51.901 "name": "BaseBdev3", 00:23:51.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.901 "is_configured": false, 00:23:51.901 "data_offset": 0, 00:23:51.901 "data_size": 0 00:23:51.901 }, 00:23:51.901 { 00:23:51.901 "name": "BaseBdev4", 00:23:51.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.901 "is_configured": false, 00:23:51.901 "data_offset": 0, 00:23:51.901 "data_size": 0 00:23:51.901 } 00:23:51.901 ] 00:23:51.901 }' 00:23:51.901 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:51.901 06:40:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:52.463 06:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:52.719 [2024-07-25 06:40:06.107893] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:52.719 BaseBdev3 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:52.719 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:52.976 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:53.234 [ 00:23:53.234 { 00:23:53.234 "name": "BaseBdev3", 00:23:53.234 "aliases": [ 00:23:53.234 "f0296caf-5aba-4369-abe1-0c7d9b0217c6" 00:23:53.234 ], 00:23:53.234 "product_name": "Malloc disk", 00:23:53.234 "block_size": 512, 00:23:53.234 "num_blocks": 65536, 00:23:53.234 "uuid": "f0296caf-5aba-4369-abe1-0c7d9b0217c6", 00:23:53.234 "assigned_rate_limits": { 00:23:53.234 "rw_ios_per_sec": 0, 00:23:53.234 "rw_mbytes_per_sec": 0, 00:23:53.234 "r_mbytes_per_sec": 0, 00:23:53.234 "w_mbytes_per_sec": 0 00:23:53.234 }, 00:23:53.234 "claimed": true, 00:23:53.234 "claim_type": 
"exclusive_write", 00:23:53.234 "zoned": false, 00:23:53.234 "supported_io_types": { 00:23:53.234 "read": true, 00:23:53.234 "write": true, 00:23:53.234 "unmap": true, 00:23:53.234 "flush": true, 00:23:53.234 "reset": true, 00:23:53.234 "nvme_admin": false, 00:23:53.234 "nvme_io": false, 00:23:53.234 "nvme_io_md": false, 00:23:53.234 "write_zeroes": true, 00:23:53.234 "zcopy": true, 00:23:53.234 "get_zone_info": false, 00:23:53.234 "zone_management": false, 00:23:53.234 "zone_append": false, 00:23:53.234 "compare": false, 00:23:53.234 "compare_and_write": false, 00:23:53.234 "abort": true, 00:23:53.234 "seek_hole": false, 00:23:53.234 "seek_data": false, 00:23:53.234 "copy": true, 00:23:53.234 "nvme_iov_md": false 00:23:53.234 }, 00:23:53.234 "memory_domains": [ 00:23:53.234 { 00:23:53.234 "dma_device_id": "system", 00:23:53.234 "dma_device_type": 1 00:23:53.234 }, 00:23:53.234 { 00:23:53.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:53.234 "dma_device_type": 2 00:23:53.234 } 00:23:53.234 ], 00:23:53.234 "driver_specific": {} 00:23:53.234 } 00:23:53.234 ] 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.234 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.235 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.235 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:53.493 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.493 "name": "Existed_Raid", 00:23:53.493 "uuid": "8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:53.493 "strip_size_kb": 0, 00:23:53.493 "state": "configuring", 00:23:53.493 "raid_level": "raid1", 00:23:53.493 "superblock": true, 00:23:53.493 "num_base_bdevs": 4, 00:23:53.493 "num_base_bdevs_discovered": 3, 00:23:53.493 "num_base_bdevs_operational": 4, 00:23:53.493 "base_bdevs_list": [ 00:23:53.493 { 00:23:53.493 "name": "BaseBdev1", 00:23:53.493 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:53.493 
"is_configured": true, 00:23:53.493 "data_offset": 2048, 00:23:53.493 "data_size": 63488 00:23:53.493 }, 00:23:53.493 { 00:23:53.493 "name": "BaseBdev2", 00:23:53.493 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:53.493 "is_configured": true, 00:23:53.493 "data_offset": 2048, 00:23:53.493 "data_size": 63488 00:23:53.493 }, 00:23:53.493 { 00:23:53.493 "name": "BaseBdev3", 00:23:53.493 "uuid": "f0296caf-5aba-4369-abe1-0c7d9b0217c6", 00:23:53.493 "is_configured": true, 00:23:53.493 "data_offset": 2048, 00:23:53.493 "data_size": 63488 00:23:53.493 }, 00:23:53.493 { 00:23:53.493 "name": "BaseBdev4", 00:23:53.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.493 "is_configured": false, 00:23:53.493 "data_offset": 0, 00:23:53.493 "data_size": 0 00:23:53.493 } 00:23:53.493 ] 00:23:53.493 }' 00:23:53.493 06:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.493 06:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:54.059 06:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:54.059 [2024-07-25 06:40:07.603028] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:54.059 [2024-07-25 06:40:07.603200] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce5250 00:23:54.059 [2024-07-25 06:40:07.603214] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:54.059 [2024-07-25 06:40:07.603383] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b31030 00:23:54.059 [2024-07-25 06:40:07.603510] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce5250 00:23:54.059 [2024-07-25 06:40:07.603520] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ce5250 00:23:54.059 [2024-07-25 06:40:07.603604] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.059 BaseBdev4 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:54.317 06:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:54.574 [ 00:23:54.574 { 00:23:54.574 "name": "BaseBdev4", 00:23:54.574 "aliases": [ 00:23:54.574 "dd32b5c5-b861-471e-9d08-fa564ab6fd92" 00:23:54.574 ], 00:23:54.574 "product_name": "Malloc disk", 00:23:54.574 "block_size": 512, 00:23:54.574 "num_blocks": 65536, 00:23:54.574 
"uuid": "dd32b5c5-b861-471e-9d08-fa564ab6fd92", 00:23:54.574 "assigned_rate_limits": { 00:23:54.574 "rw_ios_per_sec": 0, 00:23:54.574 "rw_mbytes_per_sec": 0, 00:23:54.574 "r_mbytes_per_sec": 0, 00:23:54.574 "w_mbytes_per_sec": 0 00:23:54.574 }, 00:23:54.574 "claimed": true, 00:23:54.574 "claim_type": "exclusive_write", 00:23:54.574 "zoned": false, 00:23:54.574 "supported_io_types": { 00:23:54.574 "read": true, 00:23:54.574 "write": true, 00:23:54.574 "unmap": true, 00:23:54.574 "flush": true, 00:23:54.574 "reset": true, 00:23:54.574 "nvme_admin": false, 00:23:54.574 "nvme_io": false, 00:23:54.574 "nvme_io_md": false, 00:23:54.574 "write_zeroes": true, 00:23:54.574 "zcopy": true, 00:23:54.574 "get_zone_info": false, 00:23:54.574 "zone_management": false, 00:23:54.574 "zone_append": false, 00:23:54.574 "compare": false, 00:23:54.574 "compare_and_write": false, 00:23:54.574 "abort": true, 00:23:54.574 "seek_hole": false, 00:23:54.574 "seek_data": false, 00:23:54.574 "copy": true, 00:23:54.574 "nvme_iov_md": false 00:23:54.574 }, 00:23:54.574 "memory_domains": [ 00:23:54.574 { 00:23:54.574 "dma_device_id": "system", 00:23:54.574 "dma_device_type": 1 00:23:54.574 }, 00:23:54.574 { 00:23:54.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:54.574 "dma_device_type": 2 00:23:54.574 } 00:23:54.574 ], 00:23:54.574 "driver_specific": {} 00:23:54.574 } 00:23:54.574 ] 00:23:54.574 06:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:23:54.574 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:54.574 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.575 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:54.833 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.833 "name": "Existed_Raid", 00:23:54.833 "uuid": "8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:54.833 "strip_size_kb": 0, 00:23:54.833 "state": "online", 00:23:54.833 "raid_level": "raid1", 00:23:54.833 "superblock": 
true, 00:23:54.833 "num_base_bdevs": 4, 00:23:54.833 "num_base_bdevs_discovered": 4, 00:23:54.833 "num_base_bdevs_operational": 4, 00:23:54.833 "base_bdevs_list": [ 00:23:54.833 { 00:23:54.833 "name": "BaseBdev1", 00:23:54.833 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:54.833 "is_configured": true, 00:23:54.833 "data_offset": 2048, 00:23:54.833 "data_size": 63488 00:23:54.833 }, 00:23:54.833 { 00:23:54.833 "name": "BaseBdev2", 00:23:54.833 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:54.833 "is_configured": true, 00:23:54.833 "data_offset": 2048, 00:23:54.833 "data_size": 63488 00:23:54.833 }, 00:23:54.833 { 00:23:54.833 "name": "BaseBdev3", 00:23:54.833 "uuid": "f0296caf-5aba-4369-abe1-0c7d9b0217c6", 00:23:54.833 "is_configured": true, 00:23:54.833 "data_offset": 2048, 00:23:54.833 "data_size": 63488 00:23:54.833 }, 00:23:54.833 { 00:23:54.833 "name": "BaseBdev4", 00:23:54.833 "uuid": "dd32b5c5-b861-471e-9d08-fa564ab6fd92", 00:23:54.833 "is_configured": true, 00:23:54.833 "data_offset": 2048, 00:23:54.833 "data_size": 63488 00:23:54.833 } 00:23:54.833 ] 00:23:54.833 }' 00:23:54.833 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.833 06:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:55.398 06:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:55.656 [2024-07-25 06:40:09.071205] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:55.656 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:55.656 "name": "Existed_Raid", 00:23:55.656 "aliases": [ 00:23:55.656 "8bad9edb-4530-4c33-9585-2a7fc18ade4d" 00:23:55.656 ], 00:23:55.656 "product_name": "Raid Volume", 00:23:55.656 "block_size": 512, 00:23:55.656 "num_blocks": 63488, 00:23:55.656 "uuid": "8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:55.656 "assigned_rate_limits": { 00:23:55.656 "rw_ios_per_sec": 0, 00:23:55.656 "rw_mbytes_per_sec": 0, 00:23:55.656 "r_mbytes_per_sec": 0, 00:23:55.656 "w_mbytes_per_sec": 0 00:23:55.656 }, 00:23:55.656 "claimed": false, 00:23:55.656 "zoned": false, 00:23:55.656 "supported_io_types": { 00:23:55.656 "read": true, 00:23:55.656 "write": true, 00:23:55.656 "unmap": false, 00:23:55.656 "flush": false, 00:23:55.656 "reset": true, 00:23:55.656 "nvme_admin": false, 00:23:55.656 "nvme_io": false, 00:23:55.656 "nvme_io_md": false, 00:23:55.656 "write_zeroes": true, 00:23:55.656 "zcopy": false, 00:23:55.656 "get_zone_info": false, 00:23:55.656 "zone_management": false, 00:23:55.656 "zone_append": 
false, 00:23:55.656 "compare": false, 00:23:55.656 "compare_and_write": false, 00:23:55.656 "abort": false, 00:23:55.656 "seek_hole": false, 00:23:55.656 "seek_data": false, 00:23:55.656 "copy": false, 00:23:55.656 "nvme_iov_md": false 00:23:55.656 }, 00:23:55.656 "memory_domains": [ 00:23:55.656 { 00:23:55.656 "dma_device_id": "system", 00:23:55.656 "dma_device_type": 1 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.656 "dma_device_type": 2 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "system", 00:23:55.656 "dma_device_type": 1 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.656 "dma_device_type": 2 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "system", 00:23:55.656 "dma_device_type": 1 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.656 "dma_device_type": 2 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "system", 00:23:55.656 "dma_device_type": 1 00:23:55.656 }, 00:23:55.656 { 00:23:55.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.656 "dma_device_type": 2 00:23:55.657 } 00:23:55.657 ], 00:23:55.657 "driver_specific": { 00:23:55.657 "raid": { 00:23:55.657 "uuid": "8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:55.657 "strip_size_kb": 0, 00:23:55.657 "state": "online", 00:23:55.657 "raid_level": "raid1", 00:23:55.657 "superblock": true, 00:23:55.657 "num_base_bdevs": 4, 00:23:55.657 "num_base_bdevs_discovered": 4, 00:23:55.657 "num_base_bdevs_operational": 4, 00:23:55.657 "base_bdevs_list": [ 00:23:55.657 { 00:23:55.657 "name": "BaseBdev1", 00:23:55.657 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:55.657 "is_configured": true, 00:23:55.657 "data_offset": 2048, 00:23:55.657 "data_size": 63488 00:23:55.657 }, 00:23:55.657 { 00:23:55.657 "name": "BaseBdev2", 00:23:55.657 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:55.657 "is_configured": true, 00:23:55.657 "data_offset": 2048, 00:23:55.657 "data_size": 63488 00:23:55.657 }, 00:23:55.657 { 00:23:55.657 "name": "BaseBdev3", 00:23:55.657 "uuid": "f0296caf-5aba-4369-abe1-0c7d9b0217c6", 00:23:55.657 "is_configured": true, 00:23:55.657 "data_offset": 2048, 00:23:55.657 "data_size": 63488 00:23:55.657 }, 00:23:55.657 { 00:23:55.657 "name": "BaseBdev4", 00:23:55.657 "uuid": "dd32b5c5-b861-471e-9d08-fa564ab6fd92", 00:23:55.657 "is_configured": true, 00:23:55.657 "data_offset": 2048, 00:23:55.657 "data_size": 63488 00:23:55.657 } 00:23:55.657 ] 00:23:55.657 } 00:23:55.657 } 00:23:55.657 }' 00:23:55.657 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:55.657 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:55.657 BaseBdev2 00:23:55.657 BaseBdev3 00:23:55.657 BaseBdev4' 00:23:55.657 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:55.657 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:55.657 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:55.914 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:55.914 "name": "BaseBdev1", 00:23:55.914 "aliases": [ 00:23:55.914 
"662b3ce3-b68f-4d60-8a56-e9219218c6c6" 00:23:55.914 ], 00:23:55.914 "product_name": "Malloc disk", 00:23:55.914 "block_size": 512, 00:23:55.914 "num_blocks": 65536, 00:23:55.914 "uuid": "662b3ce3-b68f-4d60-8a56-e9219218c6c6", 00:23:55.914 "assigned_rate_limits": { 00:23:55.914 "rw_ios_per_sec": 0, 00:23:55.914 "rw_mbytes_per_sec": 0, 00:23:55.914 "r_mbytes_per_sec": 0, 00:23:55.914 "w_mbytes_per_sec": 0 00:23:55.914 }, 00:23:55.914 "claimed": true, 00:23:55.914 "claim_type": "exclusive_write", 00:23:55.914 "zoned": false, 00:23:55.914 "supported_io_types": { 00:23:55.914 "read": true, 00:23:55.914 "write": true, 00:23:55.914 "unmap": true, 00:23:55.914 "flush": true, 00:23:55.914 "reset": true, 00:23:55.914 "nvme_admin": false, 00:23:55.914 "nvme_io": false, 00:23:55.914 "nvme_io_md": false, 00:23:55.914 "write_zeroes": true, 00:23:55.914 "zcopy": true, 00:23:55.914 "get_zone_info": false, 00:23:55.914 "zone_management": false, 00:23:55.914 "zone_append": false, 00:23:55.914 "compare": false, 00:23:55.914 "compare_and_write": false, 00:23:55.914 "abort": true, 00:23:55.914 "seek_hole": false, 00:23:55.914 "seek_data": false, 00:23:55.914 "copy": true, 00:23:55.914 "nvme_iov_md": false 00:23:55.914 }, 00:23:55.914 "memory_domains": [ 00:23:55.914 { 00:23:55.914 "dma_device_id": "system", 00:23:55.914 "dma_device_type": 1 00:23:55.914 }, 00:23:55.914 { 00:23:55.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:55.914 "dma_device_type": 2 00:23:55.914 } 00:23:55.914 ], 00:23:55.914 "driver_specific": {} 00:23:55.914 }' 00:23:55.914 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:55.914 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:55.914 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:55.914 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:56.171 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:56.429 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:56.429 "name": "BaseBdev2", 00:23:56.429 "aliases": [ 00:23:56.429 "f428c629-6713-42d0-adc8-23f119fe0123" 00:23:56.429 ], 00:23:56.429 "product_name": "Malloc disk", 00:23:56.429 "block_size": 512, 
00:23:56.429 "num_blocks": 65536, 00:23:56.429 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:56.429 "assigned_rate_limits": { 00:23:56.429 "rw_ios_per_sec": 0, 00:23:56.429 "rw_mbytes_per_sec": 0, 00:23:56.429 "r_mbytes_per_sec": 0, 00:23:56.429 "w_mbytes_per_sec": 0 00:23:56.429 }, 00:23:56.429 "claimed": true, 00:23:56.429 "claim_type": "exclusive_write", 00:23:56.429 "zoned": false, 00:23:56.429 "supported_io_types": { 00:23:56.429 "read": true, 00:23:56.429 "write": true, 00:23:56.429 "unmap": true, 00:23:56.429 "flush": true, 00:23:56.429 "reset": true, 00:23:56.429 "nvme_admin": false, 00:23:56.429 "nvme_io": false, 00:23:56.429 "nvme_io_md": false, 00:23:56.429 "write_zeroes": true, 00:23:56.429 "zcopy": true, 00:23:56.429 "get_zone_info": false, 00:23:56.429 "zone_management": false, 00:23:56.429 "zone_append": false, 00:23:56.429 "compare": false, 00:23:56.429 "compare_and_write": false, 00:23:56.429 "abort": true, 00:23:56.429 "seek_hole": false, 00:23:56.429 "seek_data": false, 00:23:56.429 "copy": true, 00:23:56.429 "nvme_iov_md": false 00:23:56.429 }, 00:23:56.429 "memory_domains": [ 00:23:56.429 { 00:23:56.429 "dma_device_id": "system", 00:23:56.429 "dma_device_type": 1 00:23:56.429 }, 00:23:56.429 { 00:23:56.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.429 "dma_device_type": 2 00:23:56.429 } 00:23:56.429 ], 00:23:56.429 "driver_specific": {} 00:23:56.429 }' 00:23:56.429 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:56.687 06:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:56.687 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:56.944 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:56.944 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:56.944 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:56.944 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:56.944 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:57.201 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:57.201 "name": "BaseBdev3", 00:23:57.201 "aliases": [ 00:23:57.201 "f0296caf-5aba-4369-abe1-0c7d9b0217c6" 00:23:57.201 ], 00:23:57.201 "product_name": "Malloc disk", 00:23:57.201 "block_size": 512, 00:23:57.201 "num_blocks": 65536, 00:23:57.201 "uuid": "f0296caf-5aba-4369-abe1-0c7d9b0217c6", 00:23:57.201 "assigned_rate_limits": { 
00:23:57.201 "rw_ios_per_sec": 0, 00:23:57.201 "rw_mbytes_per_sec": 0, 00:23:57.201 "r_mbytes_per_sec": 0, 00:23:57.201 "w_mbytes_per_sec": 0 00:23:57.201 }, 00:23:57.201 "claimed": true, 00:23:57.201 "claim_type": "exclusive_write", 00:23:57.201 "zoned": false, 00:23:57.201 "supported_io_types": { 00:23:57.201 "read": true, 00:23:57.201 "write": true, 00:23:57.201 "unmap": true, 00:23:57.201 "flush": true, 00:23:57.201 "reset": true, 00:23:57.201 "nvme_admin": false, 00:23:57.201 "nvme_io": false, 00:23:57.201 "nvme_io_md": false, 00:23:57.201 "write_zeroes": true, 00:23:57.201 "zcopy": true, 00:23:57.201 "get_zone_info": false, 00:23:57.202 "zone_management": false, 00:23:57.202 "zone_append": false, 00:23:57.202 "compare": false, 00:23:57.202 "compare_and_write": false, 00:23:57.202 "abort": true, 00:23:57.202 "seek_hole": false, 00:23:57.202 "seek_data": false, 00:23:57.202 "copy": true, 00:23:57.202 "nvme_iov_md": false 00:23:57.202 }, 00:23:57.202 "memory_domains": [ 00:23:57.202 { 00:23:57.202 "dma_device_id": "system", 00:23:57.202 "dma_device_type": 1 00:23:57.202 }, 00:23:57.202 { 00:23:57.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:57.202 "dma_device_type": 2 00:23:57.202 } 00:23:57.202 ], 00:23:57.202 "driver_specific": {} 00:23:57.202 }' 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.202 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:57.459 06:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:57.717 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:57.717 "name": "BaseBdev4", 00:23:57.717 "aliases": [ 00:23:57.717 "dd32b5c5-b861-471e-9d08-fa564ab6fd92" 00:23:57.717 ], 00:23:57.717 "product_name": "Malloc disk", 00:23:57.717 "block_size": 512, 00:23:57.717 "num_blocks": 65536, 00:23:57.717 "uuid": "dd32b5c5-b861-471e-9d08-fa564ab6fd92", 00:23:57.717 "assigned_rate_limits": { 00:23:57.717 "rw_ios_per_sec": 0, 00:23:57.717 "rw_mbytes_per_sec": 0, 00:23:57.717 "r_mbytes_per_sec": 0, 00:23:57.717 
"w_mbytes_per_sec": 0 00:23:57.717 }, 00:23:57.717 "claimed": true, 00:23:57.717 "claim_type": "exclusive_write", 00:23:57.717 "zoned": false, 00:23:57.717 "supported_io_types": { 00:23:57.717 "read": true, 00:23:57.717 "write": true, 00:23:57.717 "unmap": true, 00:23:57.717 "flush": true, 00:23:57.717 "reset": true, 00:23:57.717 "nvme_admin": false, 00:23:57.717 "nvme_io": false, 00:23:57.717 "nvme_io_md": false, 00:23:57.717 "write_zeroes": true, 00:23:57.717 "zcopy": true, 00:23:57.717 "get_zone_info": false, 00:23:57.717 "zone_management": false, 00:23:57.717 "zone_append": false, 00:23:57.717 "compare": false, 00:23:57.717 "compare_and_write": false, 00:23:57.717 "abort": true, 00:23:57.717 "seek_hole": false, 00:23:57.717 "seek_data": false, 00:23:57.717 "copy": true, 00:23:57.717 "nvme_iov_md": false 00:23:57.717 }, 00:23:57.717 "memory_domains": [ 00:23:57.717 { 00:23:57.717 "dma_device_id": "system", 00:23:57.717 "dma_device_type": 1 00:23:57.717 }, 00:23:57.717 { 00:23:57.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:57.717 "dma_device_type": 2 00:23:57.717 } 00:23:57.717 ], 00:23:57.717 "driver_specific": {} 00:23:57.717 }' 00:23:57.717 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.717 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.717 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:57.717 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.717 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:57.975 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:58.232 [2024-07-25 06:40:11.689882] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.232 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:58.490 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:58.490 "name": "Existed_Raid", 00:23:58.490 "uuid": "8bad9edb-4530-4c33-9585-2a7fc18ade4d", 00:23:58.490 "strip_size_kb": 0, 00:23:58.490 "state": "online", 00:23:58.490 "raid_level": "raid1", 00:23:58.490 "superblock": true, 00:23:58.490 "num_base_bdevs": 4, 00:23:58.490 "num_base_bdevs_discovered": 3, 00:23:58.490 "num_base_bdevs_operational": 3, 00:23:58.490 "base_bdevs_list": [ 00:23:58.490 { 00:23:58.490 "name": null, 00:23:58.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.490 "is_configured": false, 00:23:58.490 "data_offset": 2048, 00:23:58.490 "data_size": 63488 00:23:58.490 }, 00:23:58.490 { 00:23:58.490 "name": "BaseBdev2", 00:23:58.490 "uuid": "f428c629-6713-42d0-adc8-23f119fe0123", 00:23:58.490 "is_configured": true, 00:23:58.490 "data_offset": 2048, 00:23:58.490 "data_size": 63488 00:23:58.490 }, 00:23:58.490 { 00:23:58.490 "name": "BaseBdev3", 00:23:58.490 "uuid": "f0296caf-5aba-4369-abe1-0c7d9b0217c6", 00:23:58.490 "is_configured": true, 00:23:58.490 "data_offset": 2048, 00:23:58.490 "data_size": 63488 00:23:58.490 }, 00:23:58.490 { 00:23:58.490 "name": "BaseBdev4", 00:23:58.490 "uuid": "dd32b5c5-b861-471e-9d08-fa564ab6fd92", 00:23:58.490 "is_configured": true, 00:23:58.490 "data_offset": 2048, 00:23:58.490 "data_size": 63488 00:23:58.490 } 00:23:58.490 ] 00:23:58.490 }' 00:23:58.490 06:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:58.490 06:40:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:59.054 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:59.054 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:59.054 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.054 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:59.312 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:59.312 
06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:59.312 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:59.570 [2024-07-25 06:40:12.873994] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:59.570 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:59.570 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:59.570 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.570 06:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:59.570 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:59.570 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:59.570 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:23:59.828 [2024-07-25 06:40:13.273018] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:59.828 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:59.828 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:59.828 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.828 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:00.085 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:00.085 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:00.085 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:24:00.343 [2024-07-25 06:40:13.732220] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:24:00.343 [2024-07-25 06:40:13.732302] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:00.343 [2024-07-25 06:40:13.742468] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.343 [2024-07-25 06:40:13.742497] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.343 [2024-07-25 06:40:13.742508] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce5250 name Existed_Raid, state offline 00:24:00.343 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:00.343 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:00.343 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.343 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:00.600 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:00.600 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:00.600 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:24:00.600 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:24:00.600 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:00.600 06:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:00.857 BaseBdev2 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:00.857 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:01.114 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:01.114 [ 00:24:01.114 { 00:24:01.114 "name": "BaseBdev2", 00:24:01.114 "aliases": [ 00:24:01.114 "00e4a481-2d69-4ea5-b0fa-abebad9e8444" 00:24:01.114 ], 00:24:01.114 "product_name": "Malloc disk", 00:24:01.114 "block_size": 512, 00:24:01.114 "num_blocks": 65536, 00:24:01.114 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:01.114 "assigned_rate_limits": { 00:24:01.114 "rw_ios_per_sec": 0, 00:24:01.114 "rw_mbytes_per_sec": 0, 00:24:01.114 "r_mbytes_per_sec": 0, 00:24:01.114 "w_mbytes_per_sec": 0 00:24:01.114 }, 00:24:01.114 "claimed": false, 00:24:01.114 "zoned": false, 00:24:01.114 "supported_io_types": { 00:24:01.114 "read": true, 00:24:01.114 "write": true, 00:24:01.114 "unmap": true, 00:24:01.114 "flush": true, 00:24:01.114 "reset": true, 00:24:01.114 "nvme_admin": false, 00:24:01.114 "nvme_io": false, 00:24:01.114 "nvme_io_md": false, 00:24:01.114 "write_zeroes": true, 00:24:01.114 "zcopy": true, 00:24:01.114 "get_zone_info": false, 00:24:01.114 "zone_management": false, 00:24:01.114 "zone_append": false, 00:24:01.114 "compare": false, 00:24:01.114 "compare_and_write": false, 00:24:01.114 "abort": true, 00:24:01.114 "seek_hole": false, 00:24:01.114 "seek_data": false, 00:24:01.114 "copy": true, 00:24:01.114 "nvme_iov_md": false 00:24:01.114 }, 00:24:01.114 "memory_domains": [ 00:24:01.114 { 00:24:01.114 "dma_device_id": "system", 00:24:01.114 "dma_device_type": 1 00:24:01.114 }, 00:24:01.114 { 00:24:01.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:24:01.114 "dma_device_type": 2 00:24:01.114 } 00:24:01.114 ], 00:24:01.114 "driver_specific": {} 00:24:01.114 } 00:24:01.114 ] 00:24:01.114 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:24:01.114 06:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:01.114 06:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:01.114 06:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:01.372 BaseBdev3 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:01.372 06:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:01.630 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:24:01.888 [ 00:24:01.888 { 00:24:01.888 "name": "BaseBdev3", 00:24:01.888 "aliases": [ 00:24:01.888 "3cfeac6f-d300-437b-a6ed-003bd6f7a774" 00:24:01.888 ], 00:24:01.888 "product_name": "Malloc disk", 00:24:01.888 "block_size": 512, 00:24:01.888 "num_blocks": 65536, 00:24:01.888 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:01.888 "assigned_rate_limits": { 00:24:01.888 "rw_ios_per_sec": 0, 00:24:01.888 "rw_mbytes_per_sec": 0, 00:24:01.888 "r_mbytes_per_sec": 0, 00:24:01.888 "w_mbytes_per_sec": 0 00:24:01.888 }, 00:24:01.888 "claimed": false, 00:24:01.888 "zoned": false, 00:24:01.888 "supported_io_types": { 00:24:01.888 "read": true, 00:24:01.888 "write": true, 00:24:01.888 "unmap": true, 00:24:01.888 "flush": true, 00:24:01.888 "reset": true, 00:24:01.888 "nvme_admin": false, 00:24:01.888 "nvme_io": false, 00:24:01.888 "nvme_io_md": false, 00:24:01.888 "write_zeroes": true, 00:24:01.888 "zcopy": true, 00:24:01.888 "get_zone_info": false, 00:24:01.888 "zone_management": false, 00:24:01.888 "zone_append": false, 00:24:01.888 "compare": false, 00:24:01.888 "compare_and_write": false, 00:24:01.888 "abort": true, 00:24:01.888 "seek_hole": false, 00:24:01.888 "seek_data": false, 00:24:01.888 "copy": true, 00:24:01.888 "nvme_iov_md": false 00:24:01.888 }, 00:24:01.888 "memory_domains": [ 00:24:01.888 { 00:24:01.888 "dma_device_id": "system", 00:24:01.888 "dma_device_type": 1 00:24:01.888 }, 00:24:01.888 { 00:24:01.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.888 "dma_device_type": 2 00:24:01.888 } 00:24:01.888 ], 00:24:01.888 "driver_specific": {} 00:24:01.888 } 00:24:01.888 ] 00:24:01.888 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:24:01.888 
06:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:01.888 06:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:01.888 06:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:24:02.146 BaseBdev4 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:02.146 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:02.405 06:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:24:02.664 [ 00:24:02.664 { 00:24:02.664 "name": "BaseBdev4", 00:24:02.664 "aliases": [ 00:24:02.664 "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff" 00:24:02.664 ], 00:24:02.664 "product_name": "Malloc disk", 00:24:02.664 "block_size": 512, 00:24:02.664 "num_blocks": 65536, 00:24:02.664 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:02.664 "assigned_rate_limits": { 00:24:02.664 "rw_ios_per_sec": 0, 00:24:02.664 "rw_mbytes_per_sec": 0, 00:24:02.664 "r_mbytes_per_sec": 0, 00:24:02.664 "w_mbytes_per_sec": 0 00:24:02.664 }, 00:24:02.664 "claimed": false, 00:24:02.664 "zoned": false, 00:24:02.664 "supported_io_types": { 00:24:02.664 "read": true, 00:24:02.664 "write": true, 00:24:02.664 "unmap": true, 00:24:02.664 "flush": true, 00:24:02.664 "reset": true, 00:24:02.664 "nvme_admin": false, 00:24:02.664 "nvme_io": false, 00:24:02.664 "nvme_io_md": false, 00:24:02.664 "write_zeroes": true, 00:24:02.664 "zcopy": true, 00:24:02.664 "get_zone_info": false, 00:24:02.664 "zone_management": false, 00:24:02.664 "zone_append": false, 00:24:02.664 "compare": false, 00:24:02.664 "compare_and_write": false, 00:24:02.664 "abort": true, 00:24:02.664 "seek_hole": false, 00:24:02.664 "seek_data": false, 00:24:02.664 "copy": true, 00:24:02.664 "nvme_iov_md": false 00:24:02.664 }, 00:24:02.664 "memory_domains": [ 00:24:02.664 { 00:24:02.664 "dma_device_id": "system", 00:24:02.664 "dma_device_type": 1 00:24:02.664 }, 00:24:02.664 { 00:24:02.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:02.664 "dma_device_type": 2 00:24:02.664 } 00:24:02.664 ], 00:24:02.664 "driver_specific": {} 00:24:02.664 } 00:24:02.664 ] 00:24:02.664 06:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:24:02.664 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:02.664 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:02.664 06:40:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:02.922 [2024-07-25 06:40:16.221514] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:02.923 [2024-07-25 06:40:16.221555] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:02.923 [2024-07-25 06:40:16.221575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:02.923 [2024-07-25 06:40:16.222800] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:02.923 [2024-07-25 06:40:16.222841] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.923 "name": "Existed_Raid", 00:24:02.923 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:02.923 "strip_size_kb": 0, 00:24:02.923 "state": "configuring", 00:24:02.923 "raid_level": "raid1", 00:24:02.923 "superblock": true, 00:24:02.923 "num_base_bdevs": 4, 00:24:02.923 "num_base_bdevs_discovered": 3, 00:24:02.923 "num_base_bdevs_operational": 4, 00:24:02.923 "base_bdevs_list": [ 00:24:02.923 { 00:24:02.923 "name": "BaseBdev1", 00:24:02.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.923 "is_configured": false, 00:24:02.923 "data_offset": 0, 00:24:02.923 "data_size": 0 00:24:02.923 }, 00:24:02.923 { 00:24:02.923 "name": "BaseBdev2", 00:24:02.923 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:02.923 "is_configured": true, 00:24:02.923 "data_offset": 2048, 00:24:02.923 "data_size": 63488 00:24:02.923 }, 00:24:02.923 { 00:24:02.923 "name": "BaseBdev3", 00:24:02.923 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:02.923 "is_configured": true, 00:24:02.923 "data_offset": 2048, 
00:24:02.923 "data_size": 63488 00:24:02.923 }, 00:24:02.923 { 00:24:02.923 "name": "BaseBdev4", 00:24:02.923 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:02.923 "is_configured": true, 00:24:02.923 "data_offset": 2048, 00:24:02.923 "data_size": 63488 00:24:02.923 } 00:24:02.923 ] 00:24:02.923 }' 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.923 06:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:03.516 06:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:03.516 [2024-07-25 06:40:17.007546] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.516 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:03.775 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.775 "name": "Existed_Raid", 00:24:03.775 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:03.775 "strip_size_kb": 0, 00:24:03.775 "state": "configuring", 00:24:03.775 "raid_level": "raid1", 00:24:03.775 "superblock": true, 00:24:03.775 "num_base_bdevs": 4, 00:24:03.775 "num_base_bdevs_discovered": 2, 00:24:03.775 "num_base_bdevs_operational": 4, 00:24:03.775 "base_bdevs_list": [ 00:24:03.775 { 00:24:03.775 "name": "BaseBdev1", 00:24:03.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.775 "is_configured": false, 00:24:03.775 "data_offset": 0, 00:24:03.775 "data_size": 0 00:24:03.775 }, 00:24:03.775 { 00:24:03.775 "name": null, 00:24:03.775 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:03.775 "is_configured": false, 00:24:03.775 "data_offset": 2048, 00:24:03.775 "data_size": 63488 00:24:03.775 }, 00:24:03.775 { 00:24:03.775 "name": "BaseBdev3", 00:24:03.775 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:03.775 "is_configured": true, 00:24:03.775 "data_offset": 2048, 00:24:03.775 "data_size": 63488 00:24:03.775 }, 00:24:03.775 
{ 00:24:03.775 "name": "BaseBdev4", 00:24:03.775 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:03.775 "is_configured": true, 00:24:03.775 "data_offset": 2048, 00:24:03.775 "data_size": 63488 00:24:03.775 } 00:24:03.775 ] 00:24:03.775 }' 00:24:03.775 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.775 06:40:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:04.341 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.341 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:24:04.600 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:24:04.600 06:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:04.600 [2024-07-25 06:40:18.125562] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:04.600 BaseBdev1 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:04.600 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:04.858 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:05.116 [ 00:24:05.116 { 00:24:05.116 "name": "BaseBdev1", 00:24:05.116 "aliases": [ 00:24:05.116 "facc2333-e70c-4be9-a991-556cd966c719" 00:24:05.116 ], 00:24:05.116 "product_name": "Malloc disk", 00:24:05.116 "block_size": 512, 00:24:05.116 "num_blocks": 65536, 00:24:05.116 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:05.116 "assigned_rate_limits": { 00:24:05.116 "rw_ios_per_sec": 0, 00:24:05.116 "rw_mbytes_per_sec": 0, 00:24:05.116 "r_mbytes_per_sec": 0, 00:24:05.116 "w_mbytes_per_sec": 0 00:24:05.116 }, 00:24:05.116 "claimed": true, 00:24:05.116 "claim_type": "exclusive_write", 00:24:05.116 "zoned": false, 00:24:05.116 "supported_io_types": { 00:24:05.116 "read": true, 00:24:05.116 "write": true, 00:24:05.116 "unmap": true, 00:24:05.116 "flush": true, 00:24:05.116 "reset": true, 00:24:05.116 "nvme_admin": false, 00:24:05.116 "nvme_io": false, 00:24:05.116 "nvme_io_md": false, 00:24:05.116 "write_zeroes": true, 00:24:05.116 "zcopy": true, 00:24:05.116 "get_zone_info": false, 00:24:05.116 "zone_management": false, 00:24:05.116 "zone_append": false, 00:24:05.116 "compare": false, 00:24:05.116 "compare_and_write": false, 
00:24:05.116 "abort": true, 00:24:05.116 "seek_hole": false, 00:24:05.116 "seek_data": false, 00:24:05.116 "copy": true, 00:24:05.116 "nvme_iov_md": false 00:24:05.116 }, 00:24:05.116 "memory_domains": [ 00:24:05.116 { 00:24:05.116 "dma_device_id": "system", 00:24:05.116 "dma_device_type": 1 00:24:05.116 }, 00:24:05.116 { 00:24:05.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:05.116 "dma_device_type": 2 00:24:05.116 } 00:24:05.116 ], 00:24:05.116 "driver_specific": {} 00:24:05.116 } 00:24:05.116 ] 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.116 06:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:05.680 06:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.680 "name": "Existed_Raid", 00:24:05.680 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:05.680 "strip_size_kb": 0, 00:24:05.680 "state": "configuring", 00:24:05.680 "raid_level": "raid1", 00:24:05.680 "superblock": true, 00:24:05.680 "num_base_bdevs": 4, 00:24:05.680 "num_base_bdevs_discovered": 3, 00:24:05.680 "num_base_bdevs_operational": 4, 00:24:05.680 "base_bdevs_list": [ 00:24:05.680 { 00:24:05.680 "name": "BaseBdev1", 00:24:05.680 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:05.680 "is_configured": true, 00:24:05.680 "data_offset": 2048, 00:24:05.680 "data_size": 63488 00:24:05.680 }, 00:24:05.680 { 00:24:05.680 "name": null, 00:24:05.680 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:05.680 "is_configured": false, 00:24:05.680 "data_offset": 2048, 00:24:05.680 "data_size": 63488 00:24:05.680 }, 00:24:05.680 { 00:24:05.680 "name": "BaseBdev3", 00:24:05.680 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:05.680 "is_configured": true, 00:24:05.680 "data_offset": 2048, 00:24:05.680 "data_size": 63488 00:24:05.680 }, 00:24:05.680 { 00:24:05.680 "name": "BaseBdev4", 00:24:05.680 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:05.681 "is_configured": true, 00:24:05.681 "data_offset": 2048, 00:24:05.681 "data_size": 63488 00:24:05.681 } 
00:24:05.681 ] 00:24:05.681 }' 00:24:05.681 06:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.681 06:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:06.245 06:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:06.245 06:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.502 06:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:24:06.502 06:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:24:06.760 [2024-07-25 06:40:20.110967] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.760 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:07.017 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.017 "name": "Existed_Raid", 00:24:07.017 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:07.017 "strip_size_kb": 0, 00:24:07.017 "state": "configuring", 00:24:07.017 "raid_level": "raid1", 00:24:07.018 "superblock": true, 00:24:07.018 "num_base_bdevs": 4, 00:24:07.018 "num_base_bdevs_discovered": 2, 00:24:07.018 "num_base_bdevs_operational": 4, 00:24:07.018 "base_bdevs_list": [ 00:24:07.018 { 00:24:07.018 "name": "BaseBdev1", 00:24:07.018 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:07.018 "is_configured": true, 00:24:07.018 "data_offset": 2048, 00:24:07.018 "data_size": 63488 00:24:07.018 }, 00:24:07.018 { 00:24:07.018 "name": null, 00:24:07.018 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:07.018 "is_configured": false, 00:24:07.018 "data_offset": 2048, 00:24:07.018 "data_size": 63488 00:24:07.018 }, 00:24:07.018 { 00:24:07.018 "name": null, 00:24:07.018 
"uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:07.018 "is_configured": false, 00:24:07.018 "data_offset": 2048, 00:24:07.018 "data_size": 63488 00:24:07.018 }, 00:24:07.018 { 00:24:07.018 "name": "BaseBdev4", 00:24:07.018 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:07.018 "is_configured": true, 00:24:07.018 "data_offset": 2048, 00:24:07.018 "data_size": 63488 00:24:07.018 } 00:24:07.018 ] 00:24:07.018 }' 00:24:07.018 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.018 06:40:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:07.582 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.582 06:40:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:07.582 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:24:07.582 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:24:07.840 [2024-07-25 06:40:21.181806] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.840 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:08.098 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:08.098 "name": "Existed_Raid", 00:24:08.098 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:08.098 "strip_size_kb": 0, 00:24:08.098 "state": "configuring", 00:24:08.098 "raid_level": "raid1", 00:24:08.098 "superblock": true, 00:24:08.098 "num_base_bdevs": 4, 00:24:08.098 "num_base_bdevs_discovered": 3, 00:24:08.098 "num_base_bdevs_operational": 4, 00:24:08.098 "base_bdevs_list": [ 00:24:08.098 { 00:24:08.098 "name": "BaseBdev1", 00:24:08.098 "uuid": 
"facc2333-e70c-4be9-a991-556cd966c719", 00:24:08.098 "is_configured": true, 00:24:08.098 "data_offset": 2048, 00:24:08.098 "data_size": 63488 00:24:08.098 }, 00:24:08.098 { 00:24:08.098 "name": null, 00:24:08.098 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:08.098 "is_configured": false, 00:24:08.098 "data_offset": 2048, 00:24:08.098 "data_size": 63488 00:24:08.098 }, 00:24:08.098 { 00:24:08.098 "name": "BaseBdev3", 00:24:08.098 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:08.098 "is_configured": true, 00:24:08.098 "data_offset": 2048, 00:24:08.098 "data_size": 63488 00:24:08.098 }, 00:24:08.098 { 00:24:08.098 "name": "BaseBdev4", 00:24:08.098 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:08.098 "is_configured": true, 00:24:08.098 "data_offset": 2048, 00:24:08.098 "data_size": 63488 00:24:08.098 } 00:24:08.098 ] 00:24:08.098 }' 00:24:08.098 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:08.098 06:40:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:08.663 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:08.663 06:40:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.663 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:24:08.663 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:08.921 [2024-07-25 06:40:22.316842] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.921 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:09.487 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.487 "name": "Existed_Raid", 00:24:09.487 "uuid": 
"564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:09.487 "strip_size_kb": 0, 00:24:09.487 "state": "configuring", 00:24:09.487 "raid_level": "raid1", 00:24:09.487 "superblock": true, 00:24:09.487 "num_base_bdevs": 4, 00:24:09.487 "num_base_bdevs_discovered": 2, 00:24:09.487 "num_base_bdevs_operational": 4, 00:24:09.487 "base_bdevs_list": [ 00:24:09.487 { 00:24:09.487 "name": null, 00:24:09.487 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:09.487 "is_configured": false, 00:24:09.487 "data_offset": 2048, 00:24:09.487 "data_size": 63488 00:24:09.487 }, 00:24:09.487 { 00:24:09.487 "name": null, 00:24:09.487 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:09.487 "is_configured": false, 00:24:09.487 "data_offset": 2048, 00:24:09.487 "data_size": 63488 00:24:09.487 }, 00:24:09.487 { 00:24:09.487 "name": "BaseBdev3", 00:24:09.487 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:09.487 "is_configured": true, 00:24:09.487 "data_offset": 2048, 00:24:09.487 "data_size": 63488 00:24:09.487 }, 00:24:09.487 { 00:24:09.487 "name": "BaseBdev4", 00:24:09.487 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:09.487 "is_configured": true, 00:24:09.487 "data_offset": 2048, 00:24:09.487 "data_size": 63488 00:24:09.487 } 00:24:09.487 ] 00:24:09.487 }' 00:24:09.487 06:40:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.487 06:40:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:10.051 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.051 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:10.308 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:24:10.308 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:24:10.566 [2024-07-25 06:40:23.866704] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:10.566 06:40:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.566 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:10.566 "name": "Existed_Raid", 00:24:10.566 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:10.566 "strip_size_kb": 0, 00:24:10.566 "state": "configuring", 00:24:10.566 "raid_level": "raid1", 00:24:10.566 "superblock": true, 00:24:10.566 "num_base_bdevs": 4, 00:24:10.566 "num_base_bdevs_discovered": 3, 00:24:10.566 "num_base_bdevs_operational": 4, 00:24:10.566 "base_bdevs_list": [ 00:24:10.566 { 00:24:10.566 "name": null, 00:24:10.566 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:10.566 "is_configured": false, 00:24:10.566 "data_offset": 2048, 00:24:10.566 "data_size": 63488 00:24:10.566 }, 00:24:10.566 { 00:24:10.566 "name": "BaseBdev2", 00:24:10.566 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:10.566 "is_configured": true, 00:24:10.566 "data_offset": 2048, 00:24:10.566 "data_size": 63488 00:24:10.566 }, 00:24:10.566 { 00:24:10.566 "name": "BaseBdev3", 00:24:10.566 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:10.566 "is_configured": true, 00:24:10.566 "data_offset": 2048, 00:24:10.566 "data_size": 63488 00:24:10.566 }, 00:24:10.566 { 00:24:10.566 "name": "BaseBdev4", 00:24:10.566 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:10.566 "is_configured": true, 00:24:10.566 "data_offset": 2048, 00:24:10.566 "data_size": 63488 00:24:10.566 } 00:24:10.566 ] 00:24:10.566 }' 00:24:10.566 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:10.566 06:40:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:11.497 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.497 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:24:11.497 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:24:11.497 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.497 06:40:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:24:11.755 06:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u facc2333-e70c-4be9-a991-556cd966c719 00:24:12.013 [2024-07-25 06:40:25.353872] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:24:12.013 [2024-07-25 06:40:25.354028] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd8450 00:24:12.013 [2024-07-25 06:40:25.354040] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:12.013 [2024-07-25 06:40:25.354217] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd8790 00:24:12.013 [2024-07-25 06:40:25.354339] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x1cd8450 00:24:12.013 [2024-07-25 06:40:25.354349] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cd8450 00:24:12.013 [2024-07-25 06:40:25.354437] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:12.013 NewBaseBdev 00:24:12.013 06:40:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:24:12.013 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:24:12.013 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:12.013 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:24:12.013 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:12.014 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:12.014 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:12.580 06:40:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:24:12.580 [ 00:24:12.580 { 00:24:12.580 "name": "NewBaseBdev", 00:24:12.580 "aliases": [ 00:24:12.580 "facc2333-e70c-4be9-a991-556cd966c719" 00:24:12.580 ], 00:24:12.580 "product_name": "Malloc disk", 00:24:12.580 "block_size": 512, 00:24:12.580 "num_blocks": 65536, 00:24:12.580 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:12.580 "assigned_rate_limits": { 00:24:12.580 "rw_ios_per_sec": 0, 00:24:12.580 "rw_mbytes_per_sec": 0, 00:24:12.580 "r_mbytes_per_sec": 0, 00:24:12.580 "w_mbytes_per_sec": 0 00:24:12.580 }, 00:24:12.580 "claimed": true, 00:24:12.580 "claim_type": "exclusive_write", 00:24:12.580 "zoned": false, 00:24:12.580 "supported_io_types": { 00:24:12.580 "read": true, 00:24:12.580 "write": true, 00:24:12.580 "unmap": true, 00:24:12.580 "flush": true, 00:24:12.580 "reset": true, 00:24:12.580 "nvme_admin": false, 00:24:12.580 "nvme_io": false, 00:24:12.580 "nvme_io_md": false, 00:24:12.580 "write_zeroes": true, 00:24:12.580 "zcopy": true, 00:24:12.580 "get_zone_info": false, 00:24:12.580 "zone_management": false, 00:24:12.580 "zone_append": false, 00:24:12.580 "compare": false, 00:24:12.580 "compare_and_write": false, 00:24:12.580 "abort": true, 00:24:12.580 "seek_hole": false, 00:24:12.580 "seek_data": false, 00:24:12.580 "copy": true, 00:24:12.580 "nvme_iov_md": false 00:24:12.580 }, 00:24:12.580 "memory_domains": [ 00:24:12.580 { 00:24:12.580 "dma_device_id": "system", 00:24:12.580 "dma_device_type": 1 00:24:12.580 }, 00:24:12.580 { 00:24:12.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:12.580 "dma_device_type": 2 00:24:12.580 } 00:24:12.580 ], 00:24:12.580 "driver_specific": {} 00:24:12.580 } 00:24:12.580 ] 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.580 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:12.838 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.838 "name": "Existed_Raid", 00:24:12.838 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:12.838 "strip_size_kb": 0, 00:24:12.838 "state": "online", 00:24:12.838 "raid_level": "raid1", 00:24:12.838 "superblock": true, 00:24:12.838 "num_base_bdevs": 4, 00:24:12.838 "num_base_bdevs_discovered": 4, 00:24:12.838 "num_base_bdevs_operational": 4, 00:24:12.838 "base_bdevs_list": [ 00:24:12.838 { 00:24:12.838 "name": "NewBaseBdev", 00:24:12.838 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:12.838 "is_configured": true, 00:24:12.838 "data_offset": 2048, 00:24:12.838 "data_size": 63488 00:24:12.838 }, 00:24:12.838 { 00:24:12.838 "name": "BaseBdev2", 00:24:12.839 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:12.839 "is_configured": true, 00:24:12.839 "data_offset": 2048, 00:24:12.839 "data_size": 63488 00:24:12.839 }, 00:24:12.839 { 00:24:12.839 "name": "BaseBdev3", 00:24:12.839 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:12.839 "is_configured": true, 00:24:12.839 "data_offset": 2048, 00:24:12.839 "data_size": 63488 00:24:12.839 }, 00:24:12.839 { 00:24:12.839 "name": "BaseBdev4", 00:24:12.839 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:12.839 "is_configured": true, 00:24:12.839 "data_offset": 2048, 00:24:12.839 "data_size": 63488 00:24:12.839 } 00:24:12.839 ] 00:24:12.839 }' 00:24:12.839 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.839 06:40:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:24:13.404 06:40:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:13.404 06:40:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:13.662 [2024-07-25 06:40:27.122864] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:13.662 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:13.662 "name": "Existed_Raid", 00:24:13.662 "aliases": [ 00:24:13.662 "564da6d7-98ad-42ba-9f33-93aefd2fec6f" 00:24:13.662 ], 00:24:13.662 "product_name": "Raid Volume", 00:24:13.662 "block_size": 512, 00:24:13.662 "num_blocks": 63488, 00:24:13.662 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:13.662 "assigned_rate_limits": { 00:24:13.662 "rw_ios_per_sec": 0, 00:24:13.662 "rw_mbytes_per_sec": 0, 00:24:13.662 "r_mbytes_per_sec": 0, 00:24:13.662 "w_mbytes_per_sec": 0 00:24:13.662 }, 00:24:13.662 "claimed": false, 00:24:13.662 "zoned": false, 00:24:13.662 "supported_io_types": { 00:24:13.662 "read": true, 00:24:13.662 "write": true, 00:24:13.662 "unmap": false, 00:24:13.662 "flush": false, 00:24:13.662 "reset": true, 00:24:13.662 "nvme_admin": false, 00:24:13.662 "nvme_io": false, 00:24:13.662 "nvme_io_md": false, 00:24:13.662 "write_zeroes": true, 00:24:13.662 "zcopy": false, 00:24:13.662 "get_zone_info": false, 00:24:13.662 "zone_management": false, 00:24:13.662 "zone_append": false, 00:24:13.662 "compare": false, 00:24:13.662 "compare_and_write": false, 00:24:13.662 "abort": false, 00:24:13.663 "seek_hole": false, 00:24:13.663 "seek_data": false, 00:24:13.663 "copy": false, 00:24:13.663 "nvme_iov_md": false 00:24:13.663 }, 00:24:13.663 "memory_domains": [ 00:24:13.663 { 00:24:13.663 "dma_device_id": "system", 00:24:13.663 "dma_device_type": 1 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.663 "dma_device_type": 2 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "system", 00:24:13.663 "dma_device_type": 1 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.663 "dma_device_type": 2 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "system", 00:24:13.663 "dma_device_type": 1 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.663 "dma_device_type": 2 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "system", 00:24:13.663 "dma_device_type": 1 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.663 "dma_device_type": 2 00:24:13.663 } 00:24:13.663 ], 00:24:13.663 "driver_specific": { 00:24:13.663 "raid": { 00:24:13.663 "uuid": "564da6d7-98ad-42ba-9f33-93aefd2fec6f", 00:24:13.663 "strip_size_kb": 0, 00:24:13.663 "state": "online", 00:24:13.663 "raid_level": "raid1", 00:24:13.663 "superblock": true, 00:24:13.663 "num_base_bdevs": 4, 00:24:13.663 "num_base_bdevs_discovered": 4, 00:24:13.663 "num_base_bdevs_operational": 4, 00:24:13.663 "base_bdevs_list": [ 00:24:13.663 { 00:24:13.663 "name": "NewBaseBdev", 00:24:13.663 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:13.663 "is_configured": true, 00:24:13.663 "data_offset": 2048, 00:24:13.663 "data_size": 63488 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "name": "BaseBdev2", 00:24:13.663 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:13.663 "is_configured": true, 00:24:13.663 "data_offset": 2048, 00:24:13.663 
"data_size": 63488 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "name": "BaseBdev3", 00:24:13.663 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:13.663 "is_configured": true, 00:24:13.663 "data_offset": 2048, 00:24:13.663 "data_size": 63488 00:24:13.663 }, 00:24:13.663 { 00:24:13.663 "name": "BaseBdev4", 00:24:13.663 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:13.663 "is_configured": true, 00:24:13.663 "data_offset": 2048, 00:24:13.663 "data_size": 63488 00:24:13.663 } 00:24:13.663 ] 00:24:13.663 } 00:24:13.663 } 00:24:13.663 }' 00:24:13.663 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:13.663 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:24:13.663 BaseBdev2 00:24:13.663 BaseBdev3 00:24:13.663 BaseBdev4' 00:24:13.663 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:13.663 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:24:13.663 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:13.921 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:13.921 "name": "NewBaseBdev", 00:24:13.921 "aliases": [ 00:24:13.921 "facc2333-e70c-4be9-a991-556cd966c719" 00:24:13.921 ], 00:24:13.921 "product_name": "Malloc disk", 00:24:13.921 "block_size": 512, 00:24:13.921 "num_blocks": 65536, 00:24:13.921 "uuid": "facc2333-e70c-4be9-a991-556cd966c719", 00:24:13.921 "assigned_rate_limits": { 00:24:13.921 "rw_ios_per_sec": 0, 00:24:13.921 "rw_mbytes_per_sec": 0, 00:24:13.921 "r_mbytes_per_sec": 0, 00:24:13.921 "w_mbytes_per_sec": 0 00:24:13.921 }, 00:24:13.921 "claimed": true, 00:24:13.921 "claim_type": "exclusive_write", 00:24:13.921 "zoned": false, 00:24:13.921 "supported_io_types": { 00:24:13.921 "read": true, 00:24:13.921 "write": true, 00:24:13.922 "unmap": true, 00:24:13.922 "flush": true, 00:24:13.922 "reset": true, 00:24:13.922 "nvme_admin": false, 00:24:13.922 "nvme_io": false, 00:24:13.922 "nvme_io_md": false, 00:24:13.922 "write_zeroes": true, 00:24:13.922 "zcopy": true, 00:24:13.922 "get_zone_info": false, 00:24:13.922 "zone_management": false, 00:24:13.922 "zone_append": false, 00:24:13.922 "compare": false, 00:24:13.922 "compare_and_write": false, 00:24:13.922 "abort": true, 00:24:13.922 "seek_hole": false, 00:24:13.922 "seek_data": false, 00:24:13.922 "copy": true, 00:24:13.922 "nvme_iov_md": false 00:24:13.922 }, 00:24:13.922 "memory_domains": [ 00:24:13.922 { 00:24:13.922 "dma_device_id": "system", 00:24:13.922 "dma_device_type": 1 00:24:13.922 }, 00:24:13.922 { 00:24:13.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.922 "dma_device_type": 2 00:24:13.922 } 00:24:13.922 ], 00:24:13.922 "driver_specific": {} 00:24:13.922 }' 00:24:13.922 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:13.922 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.180 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.438 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:14.438 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:14.438 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:14.438 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:14.696 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:14.696 "name": "BaseBdev2", 00:24:14.696 "aliases": [ 00:24:14.696 "00e4a481-2d69-4ea5-b0fa-abebad9e8444" 00:24:14.696 ], 00:24:14.696 "product_name": "Malloc disk", 00:24:14.696 "block_size": 512, 00:24:14.696 "num_blocks": 65536, 00:24:14.696 "uuid": "00e4a481-2d69-4ea5-b0fa-abebad9e8444", 00:24:14.696 "assigned_rate_limits": { 00:24:14.696 "rw_ios_per_sec": 0, 00:24:14.696 "rw_mbytes_per_sec": 0, 00:24:14.696 "r_mbytes_per_sec": 0, 00:24:14.696 "w_mbytes_per_sec": 0 00:24:14.696 }, 00:24:14.696 "claimed": true, 00:24:14.696 "claim_type": "exclusive_write", 00:24:14.696 "zoned": false, 00:24:14.696 "supported_io_types": { 00:24:14.696 "read": true, 00:24:14.696 "write": true, 00:24:14.696 "unmap": true, 00:24:14.697 "flush": true, 00:24:14.697 "reset": true, 00:24:14.697 "nvme_admin": false, 00:24:14.697 "nvme_io": false, 00:24:14.697 "nvme_io_md": false, 00:24:14.697 "write_zeroes": true, 00:24:14.697 "zcopy": true, 00:24:14.697 "get_zone_info": false, 00:24:14.697 "zone_management": false, 00:24:14.697 "zone_append": false, 00:24:14.697 "compare": false, 00:24:14.697 "compare_and_write": false, 00:24:14.697 "abort": true, 00:24:14.697 "seek_hole": false, 00:24:14.697 "seek_data": false, 00:24:14.697 "copy": true, 00:24:14.697 "nvme_iov_md": false 00:24:14.697 }, 00:24:14.697 "memory_domains": [ 00:24:14.697 { 00:24:14.697 "dma_device_id": "system", 00:24:14.697 "dma_device_type": 1 00:24:14.697 }, 00:24:14.697 { 00:24:14.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.697 "dma_device_type": 2 00:24:14.697 } 00:24:14.697 ], 00:24:14.697 "driver_specific": {} 00:24:14.697 }' 00:24:14.697 06:40:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 
null == null ]] 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:14.697 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.955 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.955 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:14.955 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:14.955 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:14.955 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:15.213 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:15.213 "name": "BaseBdev3", 00:24:15.213 "aliases": [ 00:24:15.213 "3cfeac6f-d300-437b-a6ed-003bd6f7a774" 00:24:15.213 ], 00:24:15.213 "product_name": "Malloc disk", 00:24:15.213 "block_size": 512, 00:24:15.213 "num_blocks": 65536, 00:24:15.213 "uuid": "3cfeac6f-d300-437b-a6ed-003bd6f7a774", 00:24:15.213 "assigned_rate_limits": { 00:24:15.213 "rw_ios_per_sec": 0, 00:24:15.213 "rw_mbytes_per_sec": 0, 00:24:15.213 "r_mbytes_per_sec": 0, 00:24:15.213 "w_mbytes_per_sec": 0 00:24:15.213 }, 00:24:15.213 "claimed": true, 00:24:15.213 "claim_type": "exclusive_write", 00:24:15.213 "zoned": false, 00:24:15.213 "supported_io_types": { 00:24:15.213 "read": true, 00:24:15.213 "write": true, 00:24:15.213 "unmap": true, 00:24:15.213 "flush": true, 00:24:15.213 "reset": true, 00:24:15.213 "nvme_admin": false, 00:24:15.213 "nvme_io": false, 00:24:15.213 "nvme_io_md": false, 00:24:15.213 "write_zeroes": true, 00:24:15.213 "zcopy": true, 00:24:15.213 "get_zone_info": false, 00:24:15.213 "zone_management": false, 00:24:15.213 "zone_append": false, 00:24:15.213 "compare": false, 00:24:15.213 "compare_and_write": false, 00:24:15.213 "abort": true, 00:24:15.213 "seek_hole": false, 00:24:15.213 "seek_data": false, 00:24:15.213 "copy": true, 00:24:15.213 "nvme_iov_md": false 00:24:15.213 }, 00:24:15.213 "memory_domains": [ 00:24:15.213 { 00:24:15.213 "dma_device_id": "system", 00:24:15.213 "dma_device_type": 1 00:24:15.213 }, 00:24:15.213 { 00:24:15.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:15.213 "dma_device_type": 2 00:24:15.213 } 00:24:15.214 ], 00:24:15.214 "driver_specific": {} 00:24:15.214 }' 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:15.214 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:24:15.472 06:40:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:15.730 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:15.730 "name": "BaseBdev4", 00:24:15.730 "aliases": [ 00:24:15.730 "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff" 00:24:15.730 ], 00:24:15.730 "product_name": "Malloc disk", 00:24:15.730 "block_size": 512, 00:24:15.730 "num_blocks": 65536, 00:24:15.730 "uuid": "6eb17f05-c1d3-43c4-b907-d436cb1bd2ff", 00:24:15.730 "assigned_rate_limits": { 00:24:15.730 "rw_ios_per_sec": 0, 00:24:15.730 "rw_mbytes_per_sec": 0, 00:24:15.730 "r_mbytes_per_sec": 0, 00:24:15.730 "w_mbytes_per_sec": 0 00:24:15.730 }, 00:24:15.730 "claimed": true, 00:24:15.730 "claim_type": "exclusive_write", 00:24:15.730 "zoned": false, 00:24:15.730 "supported_io_types": { 00:24:15.730 "read": true, 00:24:15.730 "write": true, 00:24:15.730 "unmap": true, 00:24:15.730 "flush": true, 00:24:15.730 "reset": true, 00:24:15.730 "nvme_admin": false, 00:24:15.730 "nvme_io": false, 00:24:15.730 "nvme_io_md": false, 00:24:15.730 "write_zeroes": true, 00:24:15.730 "zcopy": true, 00:24:15.730 "get_zone_info": false, 00:24:15.730 "zone_management": false, 00:24:15.730 "zone_append": false, 00:24:15.730 "compare": false, 00:24:15.730 "compare_and_write": false, 00:24:15.730 "abort": true, 00:24:15.730 "seek_hole": false, 00:24:15.730 "seek_data": false, 00:24:15.730 "copy": true, 00:24:15.730 "nvme_iov_md": false 00:24:15.730 }, 00:24:15.730 "memory_domains": [ 00:24:15.730 { 00:24:15.730 "dma_device_id": "system", 00:24:15.730 "dma_device_type": 1 00:24:15.730 }, 00:24:15.730 { 00:24:15.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:15.730 "dma_device_type": 2 00:24:15.730 } 00:24:15.730 ], 00:24:15.730 "driver_specific": {} 00:24:15.730 }' 00:24:15.730 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.730 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.730 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:15.730 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.730 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.988 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:15.988 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.989 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.989 06:40:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:15.989 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.989 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.989 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:15.989 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:16.246 [2024-07-25 06:40:29.697435] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:16.247 [2024-07-25 06:40:29.697458] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:16.247 [2024-07-25 06:40:29.697506] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:16.247 [2024-07-25 06:40:29.697747] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:16.247 [2024-07-25 06:40:29.697758] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd8450 name Existed_Raid, state offline 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1208232 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1208232 ']' 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1208232 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1208232 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1208232' 00:24:16.247 killing process with pid 1208232 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1208232 00:24:16.247 [2024-07-25 06:40:29.775656] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:16.247 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1208232 00:24:16.567 [2024-07-25 06:40:29.807552] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:16.567 06:40:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:24:16.567 00:24:16.567 real 0m30.739s 00:24:16.567 user 0m56.455s 00:24:16.567 sys 0m5.475s 00:24:16.567 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:16.567 06:40:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:16.567 ************************************ 00:24:16.567 END TEST raid_state_function_test_sb 00:24:16.567 ************************************ 00:24:16.567 06:40:30 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:24:16.567 06:40:30 
bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:24:16.567 06:40:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:16.567 06:40:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:16.567 ************************************ 00:24:16.567 START TEST raid_superblock_test 00:24:16.567 ************************************ 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 4 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1214020 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1214020 /var/tmp/spdk-raid.sock 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1214020 ']' 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:16.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:16.567 06:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:16.826 [2024-07-25 06:40:30.135073] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
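The raid_superblock_test run starting here drives the same RPC socket as the state-function test above: it launches the minimal bdev_svc application with bdev_raid debug logging, waits for the socket, builds passthru bdevs with fixed UUIDs on top of malloc bdevs, and then repeatedly inspects raid state with bdev_raid_get_bdevs piped through jq. The following is a minimal bash sketch of that setup flow, not the harness itself: the rpc.py, jq, bdev_svc, bdev_malloc_create, bdev_passthru_create and bdev_get_bdevs invocations are the ones traced in this log, while the SPDK_DIR/RPC_PY variables, the rpc_get_methods polling loop (a stand-in for the harness's waitforlisten helper) and the single pt1/malloc1 pair are illustrative assumptions.

    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC_SOCK=/var/tmp/spdk-raid.sock
    RPC_PY="$SPDK_DIR/scripts/rpc.py -s $RPC_SOCK"

    # start the minimal bdev application with bdev_raid debug logging, in the background
    "$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$RPC_SOCK" -L bdev_raid &
    raid_pid=$!

    # poll until the RPC socket answers (assumption: stands in for the harness's waitforlisten)
    until "$SPDK_DIR/scripts/rpc.py" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.1
    done

    # a 32 MiB malloc bdev with 512-byte blocks (65536 blocks), wrapped by a passthru bdev
    # with a fixed UUID -- the pt1/malloc1 pairing used at the start of raid_superblock_test
    $RPC_PY bdev_malloc_create 32 512 -b malloc1
    $RPC_PY bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001

    # confirm the passthru bdev registered; the test later assembles the raid volume from
    # such passthru bdevs and checks it via: bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    $RPC_PY bdev_get_bdevs -b pt1 -t 2000

    # stop the bdev application when done
    kill "$raid_pid"
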
00:24:16.826 [2024-07-25 06:40:30.135122] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1214020 ] 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.0 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.1 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.2 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.3 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.4 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.5 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.6 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:01.7 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.0 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.1 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.2 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.3 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.4 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.5 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.6 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3d:02.7 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.0 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.1 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.2 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.3 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.4 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.5 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.6 cannot be used 00:24:16.826 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:01.7 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.0 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.1 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.2 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.3 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.4 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.5 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.6 cannot be used 00:24:16.826 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:16.826 EAL: Requested device 0000:3f:02.7 cannot be used 00:24:16.826 [2024-07-25 06:40:30.258043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:16.826 [2024-07-25 06:40:30.302616] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:16.826 [2024-07-25 06:40:30.365780] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:16.826 [2024-07-25 06:40:30.365823] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:24:17.760 malloc1 00:24:17.760 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:18.018 [2024-07-25 06:40:31.473605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:18.018 [2024-07-25 06:40:31.473651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:18.018 [2024-07-25 06:40:31.473670] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x228fd70 00:24:18.018 [2024-07-25 06:40:31.473687] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:18.018 [2024-07-25 06:40:31.475160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:18.018 [2024-07-25 06:40:31.475187] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:18.018 pt1 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:18.018 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:24:18.276 malloc2 00:24:18.276 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:18.533 [2024-07-25 06:40:31.927048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:18.533 [2024-07-25 06:40:31.927093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:18.533 [2024-07-25 06:40:31.927110] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20de790 00:24:18.533 [2024-07-25 06:40:31.927121] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:18.533 [2024-07-25 06:40:31.928429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:18.533 [2024-07-25 06:40:31.928456] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:18.533 pt2 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:18.533 06:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:24:18.791 malloc3 00:24:18.791 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:19.048 [2024-07-25 06:40:32.388488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:19.048 [2024-07-25 06:40:32.388528] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.048 [2024-07-25 06:40:32.388544] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22838c0 00:24:19.048 [2024-07-25 06:40:32.388556] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:19.048 [2024-07-25 06:40:32.389797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.048 [2024-07-25 06:40:32.389822] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:19.048 pt3 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:19.048 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:24:19.307 malloc4 00:24:19.307 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:19.307 [2024-07-25 06:40:32.849832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:19.307 [2024-07-25 06:40:32.849872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.307 [2024-07-25 06:40:32.849887] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2286300 00:24:19.307 [2024-07-25 06:40:32.849897] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:19.307 [2024-07-25 06:40:32.851149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.307 [2024-07-25 06:40:32.851175] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:19.307 pt4 00:24:19.565 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:24:19.565 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:24:19.565 06:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:24:19.565 [2024-07-25 06:40:33.078455] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:19.565 [2024-07-25 06:40:33.079516] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:19.565 [2024-07-25 06:40:33.079565] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:19.565 [2024-07-25 06:40:33.079605] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:19.565 [2024-07-25 06:40:33.079765] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20d6770 00:24:19.565 [2024-07-25 06:40:33.079775] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:19.565 [2024-07-25 06:40:33.079940] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22829f0 00:24:19.565 [2024-07-25 06:40:33.080074] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20d6770 00:24:19.565 [2024-07-25 06:40:33.080084] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20d6770 00:24:19.565 [2024-07-25 06:40:33.080173] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.565 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.824 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.824 "name": "raid_bdev1", 00:24:19.824 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:19.824 "strip_size_kb": 0, 00:24:19.824 "state": "online", 00:24:19.824 "raid_level": "raid1", 00:24:19.824 "superblock": true, 00:24:19.824 "num_base_bdevs": 4, 00:24:19.824 "num_base_bdevs_discovered": 4, 00:24:19.824 "num_base_bdevs_operational": 4, 00:24:19.824 "base_bdevs_list": [ 00:24:19.824 { 00:24:19.824 "name": "pt1", 00:24:19.824 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:19.824 "is_configured": true, 00:24:19.824 "data_offset": 2048, 00:24:19.824 "data_size": 63488 00:24:19.824 }, 00:24:19.824 { 00:24:19.824 
"name": "pt2", 00:24:19.824 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:19.824 "is_configured": true, 00:24:19.824 "data_offset": 2048, 00:24:19.824 "data_size": 63488 00:24:19.824 }, 00:24:19.824 { 00:24:19.824 "name": "pt3", 00:24:19.824 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:19.824 "is_configured": true, 00:24:19.824 "data_offset": 2048, 00:24:19.824 "data_size": 63488 00:24:19.824 }, 00:24:19.824 { 00:24:19.824 "name": "pt4", 00:24:19.824 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:19.824 "is_configured": true, 00:24:19.824 "data_offset": 2048, 00:24:19.824 "data_size": 63488 00:24:19.824 } 00:24:19.824 ] 00:24:19.824 }' 00:24:19.824 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.824 06:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:20.388 06:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:20.645 [2024-07-25 06:40:34.109444] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:20.645 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:20.645 "name": "raid_bdev1", 00:24:20.645 "aliases": [ 00:24:20.645 "79c5bd43-3208-4f02-88a5-ed603d17858f" 00:24:20.645 ], 00:24:20.645 "product_name": "Raid Volume", 00:24:20.645 "block_size": 512, 00:24:20.645 "num_blocks": 63488, 00:24:20.645 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:20.645 "assigned_rate_limits": { 00:24:20.645 "rw_ios_per_sec": 0, 00:24:20.645 "rw_mbytes_per_sec": 0, 00:24:20.645 "r_mbytes_per_sec": 0, 00:24:20.645 "w_mbytes_per_sec": 0 00:24:20.645 }, 00:24:20.645 "claimed": false, 00:24:20.645 "zoned": false, 00:24:20.645 "supported_io_types": { 00:24:20.645 "read": true, 00:24:20.645 "write": true, 00:24:20.645 "unmap": false, 00:24:20.645 "flush": false, 00:24:20.645 "reset": true, 00:24:20.645 "nvme_admin": false, 00:24:20.645 "nvme_io": false, 00:24:20.645 "nvme_io_md": false, 00:24:20.645 "write_zeroes": true, 00:24:20.645 "zcopy": false, 00:24:20.645 "get_zone_info": false, 00:24:20.645 "zone_management": false, 00:24:20.645 "zone_append": false, 00:24:20.645 "compare": false, 00:24:20.645 "compare_and_write": false, 00:24:20.645 "abort": false, 00:24:20.645 "seek_hole": false, 00:24:20.645 "seek_data": false, 00:24:20.645 "copy": false, 00:24:20.645 "nvme_iov_md": false 00:24:20.645 }, 00:24:20.645 "memory_domains": [ 00:24:20.645 { 00:24:20.645 "dma_device_id": "system", 00:24:20.645 "dma_device_type": 1 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.645 "dma_device_type": 2 00:24:20.645 }, 00:24:20.645 { 
00:24:20.645 "dma_device_id": "system", 00:24:20.645 "dma_device_type": 1 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.645 "dma_device_type": 2 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "dma_device_id": "system", 00:24:20.645 "dma_device_type": 1 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.645 "dma_device_type": 2 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "dma_device_id": "system", 00:24:20.645 "dma_device_type": 1 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.645 "dma_device_type": 2 00:24:20.645 } 00:24:20.645 ], 00:24:20.645 "driver_specific": { 00:24:20.645 "raid": { 00:24:20.645 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:20.645 "strip_size_kb": 0, 00:24:20.645 "state": "online", 00:24:20.645 "raid_level": "raid1", 00:24:20.645 "superblock": true, 00:24:20.645 "num_base_bdevs": 4, 00:24:20.645 "num_base_bdevs_discovered": 4, 00:24:20.645 "num_base_bdevs_operational": 4, 00:24:20.645 "base_bdevs_list": [ 00:24:20.645 { 00:24:20.645 "name": "pt1", 00:24:20.645 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:20.645 "is_configured": true, 00:24:20.645 "data_offset": 2048, 00:24:20.645 "data_size": 63488 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "name": "pt2", 00:24:20.645 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:20.645 "is_configured": true, 00:24:20.645 "data_offset": 2048, 00:24:20.645 "data_size": 63488 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "name": "pt3", 00:24:20.645 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:20.645 "is_configured": true, 00:24:20.645 "data_offset": 2048, 00:24:20.645 "data_size": 63488 00:24:20.645 }, 00:24:20.645 { 00:24:20.645 "name": "pt4", 00:24:20.645 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:20.645 "is_configured": true, 00:24:20.645 "data_offset": 2048, 00:24:20.645 "data_size": 63488 00:24:20.645 } 00:24:20.645 ] 00:24:20.645 } 00:24:20.645 } 00:24:20.645 }' 00:24:20.645 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:20.645 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:20.645 pt2 00:24:20.645 pt3 00:24:20.645 pt4' 00:24:20.645 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:20.645 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:20.645 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:20.903 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:20.903 "name": "pt1", 00:24:20.903 "aliases": [ 00:24:20.903 "00000000-0000-0000-0000-000000000001" 00:24:20.903 ], 00:24:20.903 "product_name": "passthru", 00:24:20.903 "block_size": 512, 00:24:20.903 "num_blocks": 65536, 00:24:20.903 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:20.903 "assigned_rate_limits": { 00:24:20.903 "rw_ios_per_sec": 0, 00:24:20.903 "rw_mbytes_per_sec": 0, 00:24:20.903 "r_mbytes_per_sec": 0, 00:24:20.903 "w_mbytes_per_sec": 0 00:24:20.903 }, 00:24:20.903 "claimed": true, 00:24:20.903 "claim_type": "exclusive_write", 00:24:20.903 "zoned": false, 00:24:20.903 "supported_io_types": { 00:24:20.903 "read": true, 00:24:20.903 "write": true, 
00:24:20.903 "unmap": true, 00:24:20.903 "flush": true, 00:24:20.903 "reset": true, 00:24:20.903 "nvme_admin": false, 00:24:20.903 "nvme_io": false, 00:24:20.903 "nvme_io_md": false, 00:24:20.903 "write_zeroes": true, 00:24:20.903 "zcopy": true, 00:24:20.903 "get_zone_info": false, 00:24:20.903 "zone_management": false, 00:24:20.903 "zone_append": false, 00:24:20.903 "compare": false, 00:24:20.903 "compare_and_write": false, 00:24:20.903 "abort": true, 00:24:20.903 "seek_hole": false, 00:24:20.903 "seek_data": false, 00:24:20.903 "copy": true, 00:24:20.903 "nvme_iov_md": false 00:24:20.903 }, 00:24:20.903 "memory_domains": [ 00:24:20.903 { 00:24:20.903 "dma_device_id": "system", 00:24:20.903 "dma_device_type": 1 00:24:20.903 }, 00:24:20.903 { 00:24:20.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.903 "dma_device_type": 2 00:24:20.903 } 00:24:20.903 ], 00:24:20.903 "driver_specific": { 00:24:20.903 "passthru": { 00:24:20.903 "name": "pt1", 00:24:20.903 "base_bdev_name": "malloc1" 00:24:20.903 } 00:24:20.903 } 00:24:20.903 }' 00:24:20.903 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.903 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.161 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.420 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:21.420 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:21.420 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:21.420 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:21.420 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:21.420 "name": "pt2", 00:24:21.420 "aliases": [ 00:24:21.420 "00000000-0000-0000-0000-000000000002" 00:24:21.420 ], 00:24:21.420 "product_name": "passthru", 00:24:21.420 "block_size": 512, 00:24:21.420 "num_blocks": 65536, 00:24:21.420 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:21.420 "assigned_rate_limits": { 00:24:21.420 "rw_ios_per_sec": 0, 00:24:21.420 "rw_mbytes_per_sec": 0, 00:24:21.420 "r_mbytes_per_sec": 0, 00:24:21.420 "w_mbytes_per_sec": 0 00:24:21.420 }, 00:24:21.420 "claimed": true, 00:24:21.420 "claim_type": "exclusive_write", 00:24:21.420 "zoned": false, 00:24:21.420 "supported_io_types": { 00:24:21.420 "read": true, 00:24:21.420 "write": true, 00:24:21.420 "unmap": true, 00:24:21.420 "flush": true, 00:24:21.420 "reset": true, 00:24:21.420 "nvme_admin": false, 00:24:21.420 
"nvme_io": false, 00:24:21.420 "nvme_io_md": false, 00:24:21.420 "write_zeroes": true, 00:24:21.420 "zcopy": true, 00:24:21.420 "get_zone_info": false, 00:24:21.420 "zone_management": false, 00:24:21.420 "zone_append": false, 00:24:21.420 "compare": false, 00:24:21.420 "compare_and_write": false, 00:24:21.420 "abort": true, 00:24:21.420 "seek_hole": false, 00:24:21.420 "seek_data": false, 00:24:21.420 "copy": true, 00:24:21.420 "nvme_iov_md": false 00:24:21.420 }, 00:24:21.420 "memory_domains": [ 00:24:21.420 { 00:24:21.420 "dma_device_id": "system", 00:24:21.420 "dma_device_type": 1 00:24:21.420 }, 00:24:21.420 { 00:24:21.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:21.420 "dma_device_type": 2 00:24:21.420 } 00:24:21.420 ], 00:24:21.420 "driver_specific": { 00:24:21.420 "passthru": { 00:24:21.420 "name": "pt2", 00:24:21.420 "base_bdev_name": "malloc2" 00:24:21.420 } 00:24:21.420 } 00:24:21.420 }' 00:24:21.678 06:40:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:21.678 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.936 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.936 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:21.936 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:21.936 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:24:21.936 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:22.193 "name": "pt3", 00:24:22.193 "aliases": [ 00:24:22.193 "00000000-0000-0000-0000-000000000003" 00:24:22.193 ], 00:24:22.193 "product_name": "passthru", 00:24:22.193 "block_size": 512, 00:24:22.193 "num_blocks": 65536, 00:24:22.193 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:22.193 "assigned_rate_limits": { 00:24:22.193 "rw_ios_per_sec": 0, 00:24:22.193 "rw_mbytes_per_sec": 0, 00:24:22.193 "r_mbytes_per_sec": 0, 00:24:22.193 "w_mbytes_per_sec": 0 00:24:22.193 }, 00:24:22.193 "claimed": true, 00:24:22.193 "claim_type": "exclusive_write", 00:24:22.193 "zoned": false, 00:24:22.193 "supported_io_types": { 00:24:22.193 "read": true, 00:24:22.193 "write": true, 00:24:22.193 "unmap": true, 00:24:22.193 "flush": true, 00:24:22.193 "reset": true, 00:24:22.193 "nvme_admin": false, 00:24:22.193 "nvme_io": false, 00:24:22.193 "nvme_io_md": false, 00:24:22.193 "write_zeroes": true, 00:24:22.193 "zcopy": true, 00:24:22.193 
"get_zone_info": false, 00:24:22.193 "zone_management": false, 00:24:22.193 "zone_append": false, 00:24:22.193 "compare": false, 00:24:22.193 "compare_and_write": false, 00:24:22.193 "abort": true, 00:24:22.193 "seek_hole": false, 00:24:22.193 "seek_data": false, 00:24:22.193 "copy": true, 00:24:22.193 "nvme_iov_md": false 00:24:22.193 }, 00:24:22.193 "memory_domains": [ 00:24:22.193 { 00:24:22.193 "dma_device_id": "system", 00:24:22.193 "dma_device_type": 1 00:24:22.193 }, 00:24:22.193 { 00:24:22.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:22.193 "dma_device_type": 2 00:24:22.193 } 00:24:22.193 ], 00:24:22.193 "driver_specific": { 00:24:22.193 "passthru": { 00:24:22.193 "name": "pt3", 00:24:22.193 "base_bdev_name": "malloc3" 00:24:22.193 } 00:24:22.193 } 00:24:22.193 }' 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:22.193 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:22.451 06:40:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:24:22.708 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:22.708 "name": "pt4", 00:24:22.708 "aliases": [ 00:24:22.708 "00000000-0000-0000-0000-000000000004" 00:24:22.708 ], 00:24:22.708 "product_name": "passthru", 00:24:22.708 "block_size": 512, 00:24:22.708 "num_blocks": 65536, 00:24:22.708 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:22.708 "assigned_rate_limits": { 00:24:22.708 "rw_ios_per_sec": 0, 00:24:22.708 "rw_mbytes_per_sec": 0, 00:24:22.708 "r_mbytes_per_sec": 0, 00:24:22.708 "w_mbytes_per_sec": 0 00:24:22.708 }, 00:24:22.708 "claimed": true, 00:24:22.708 "claim_type": "exclusive_write", 00:24:22.708 "zoned": false, 00:24:22.708 "supported_io_types": { 00:24:22.708 "read": true, 00:24:22.708 "write": true, 00:24:22.708 "unmap": true, 00:24:22.708 "flush": true, 00:24:22.708 "reset": true, 00:24:22.708 "nvme_admin": false, 00:24:22.708 "nvme_io": false, 00:24:22.708 "nvme_io_md": false, 00:24:22.708 "write_zeroes": true, 00:24:22.708 "zcopy": true, 00:24:22.708 "get_zone_info": false, 00:24:22.708 "zone_management": false, 00:24:22.708 "zone_append": false, 00:24:22.708 "compare": false, 
00:24:22.708 "compare_and_write": false, 00:24:22.709 "abort": true, 00:24:22.709 "seek_hole": false, 00:24:22.709 "seek_data": false, 00:24:22.709 "copy": true, 00:24:22.709 "nvme_iov_md": false 00:24:22.709 }, 00:24:22.709 "memory_domains": [ 00:24:22.709 { 00:24:22.709 "dma_device_id": "system", 00:24:22.709 "dma_device_type": 1 00:24:22.709 }, 00:24:22.709 { 00:24:22.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:22.709 "dma_device_type": 2 00:24:22.709 } 00:24:22.709 ], 00:24:22.709 "driver_specific": { 00:24:22.709 "passthru": { 00:24:22.709 "name": "pt4", 00:24:22.709 "base_bdev_name": "malloc4" 00:24:22.709 } 00:24:22.709 } 00:24:22.709 }' 00:24:22.709 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:22.709 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:22.709 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:22.709 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:22.709 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:22.967 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:24:23.225 [2024-07-25 06:40:36.656171] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:23.225 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=79c5bd43-3208-4f02-88a5-ed603d17858f 00:24:23.225 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 79c5bd43-3208-4f02-88a5-ed603d17858f ']' 00:24:23.225 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:23.483 [2024-07-25 06:40:36.884458] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:23.483 [2024-07-25 06:40:36.884478] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:23.483 [2024-07-25 06:40:36.884528] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:23.483 [2024-07-25 06:40:36.884606] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:23.483 [2024-07-25 06:40:36.884617] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d6770 name raid_bdev1, state offline 00:24:23.483 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.483 06:40:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:24:23.740 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:24:23.740 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:24:23.740 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:24:23.740 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:23.998 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:24:23.998 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:24.255 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:24:24.255 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:24:24.255 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:24:24.255 06:40:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:24.513 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:24.513 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:24.771 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:24:25.030 [2024-07-25 06:40:38.480592] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:25.030 [2024-07-25 06:40:38.481829] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:25.030 [2024-07-25 06:40:38.481869] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:24:25.030 [2024-07-25 06:40:38.481900] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:24:25.030 [2024-07-25 06:40:38.481941] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:25.030 [2024-07-25 06:40:38.481977] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:25.030 [2024-07-25 06:40:38.481998] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:24:25.030 [2024-07-25 06:40:38.482017] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:24:25.030 [2024-07-25 06:40:38.482034] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:25.030 [2024-07-25 06:40:38.482043] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2285d00 name raid_bdev1, state configuring 00:24:25.030 request: 00:24:25.030 { 00:24:25.030 "name": "raid_bdev1", 00:24:25.030 "raid_level": "raid1", 00:24:25.030 "base_bdevs": [ 00:24:25.030 "malloc1", 00:24:25.030 "malloc2", 00:24:25.030 "malloc3", 00:24:25.030 "malloc4" 00:24:25.030 ], 00:24:25.030 "superblock": false, 00:24:25.030 "method": "bdev_raid_create", 00:24:25.030 "req_id": 1 00:24:25.030 } 00:24:25.030 Got JSON-RPC error response 00:24:25.030 response: 00:24:25.030 { 00:24:25.030 "code": -17, 00:24:25.030 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:25.030 } 00:24:25.030 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:24:25.030 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:24:25.030 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:24:25.030 06:40:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:24:25.030 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.030 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:24:25.288 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:24:25.288 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:24:25.288 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:25.546 [2024-07-25 06:40:38.865556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:25.546 [2024-07-25 06:40:38.865594] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:25.546 [2024-07-25 06:40:38.865613] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2280f60 00:24:25.546 [2024-07-25 06:40:38.865624] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:25.546 [2024-07-25 06:40:38.867014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:25.546 [2024-07-25 06:40:38.867041] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:25.546 [2024-07-25 06:40:38.867099] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:25.546 [2024-07-25 06:40:38.867123] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:25.546 pt1 00:24:25.546 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:24:25.546 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.546 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:25.546 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.547 06:40:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.805 06:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.805 "name": "raid_bdev1", 00:24:25.805 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:25.805 "strip_size_kb": 0, 00:24:25.805 "state": "configuring", 00:24:25.805 "raid_level": "raid1", 00:24:25.805 "superblock": true, 00:24:25.805 "num_base_bdevs": 4, 00:24:25.805 "num_base_bdevs_discovered": 1, 00:24:25.805 "num_base_bdevs_operational": 4, 00:24:25.805 "base_bdevs_list": [ 00:24:25.805 { 00:24:25.805 "name": "pt1", 00:24:25.805 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:25.805 "is_configured": true, 00:24:25.805 "data_offset": 2048, 00:24:25.805 "data_size": 63488 00:24:25.805 }, 00:24:25.805 { 00:24:25.805 "name": null, 00:24:25.805 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:25.805 "is_configured": false, 00:24:25.805 "data_offset": 2048, 00:24:25.805 "data_size": 63488 00:24:25.805 }, 00:24:25.805 { 00:24:25.805 
"name": null, 00:24:25.805 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:25.805 "is_configured": false, 00:24:25.805 "data_offset": 2048, 00:24:25.805 "data_size": 63488 00:24:25.805 }, 00:24:25.805 { 00:24:25.805 "name": null, 00:24:25.805 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:25.805 "is_configured": false, 00:24:25.805 "data_offset": 2048, 00:24:25.805 "data_size": 63488 00:24:25.805 } 00:24:25.805 ] 00:24:25.805 }' 00:24:25.805 06:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.805 06:40:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:26.371 06:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:24:26.371 06:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:26.371 [2024-07-25 06:40:39.904303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:26.371 [2024-07-25 06:40:39.904352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:26.371 [2024-07-25 06:40:39.904369] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x228f6b0 00:24:26.371 [2024-07-25 06:40:39.904381] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:26.371 [2024-07-25 06:40:39.904709] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:26.371 [2024-07-25 06:40:39.904726] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:26.371 [2024-07-25 06:40:39.904782] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:26.371 [2024-07-25 06:40:39.904800] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:26.371 pt2 00:24:26.371 06:40:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:26.629 [2024-07-25 06:40:40.128932] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.629 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.887 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.887 "name": "raid_bdev1", 00:24:26.887 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:26.887 "strip_size_kb": 0, 00:24:26.887 "state": "configuring", 00:24:26.887 "raid_level": "raid1", 00:24:26.887 "superblock": true, 00:24:26.887 "num_base_bdevs": 4, 00:24:26.887 "num_base_bdevs_discovered": 1, 00:24:26.887 "num_base_bdevs_operational": 4, 00:24:26.887 "base_bdevs_list": [ 00:24:26.887 { 00:24:26.887 "name": "pt1", 00:24:26.887 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:26.887 "is_configured": true, 00:24:26.887 "data_offset": 2048, 00:24:26.887 "data_size": 63488 00:24:26.887 }, 00:24:26.887 { 00:24:26.887 "name": null, 00:24:26.887 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:26.887 "is_configured": false, 00:24:26.887 "data_offset": 2048, 00:24:26.887 "data_size": 63488 00:24:26.887 }, 00:24:26.887 { 00:24:26.887 "name": null, 00:24:26.887 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:26.887 "is_configured": false, 00:24:26.887 "data_offset": 2048, 00:24:26.887 "data_size": 63488 00:24:26.887 }, 00:24:26.887 { 00:24:26.887 "name": null, 00:24:26.887 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:26.887 "is_configured": false, 00:24:26.887 "data_offset": 2048, 00:24:26.887 "data_size": 63488 00:24:26.887 } 00:24:26.887 ] 00:24:26.887 }' 00:24:26.887 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.887 06:40:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:27.453 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:24:27.453 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:24:27.453 06:40:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:27.711 [2024-07-25 06:40:41.159592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:27.711 [2024-07-25 06:40:41.159648] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:27.711 [2024-07-25 06:40:41.159667] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d9ab0 00:24:27.711 [2024-07-25 06:40:41.159678] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:27.711 [2024-07-25 06:40:41.160013] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:27.711 [2024-07-25 06:40:41.160031] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:27.711 [2024-07-25 06:40:41.160091] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:27.711 [2024-07-25 06:40:41.160110] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:27.711 pt2 00:24:27.711 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:24:27.711 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:24:27.711 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:27.970 [2024-07-25 06:40:41.384197] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:27.970 [2024-07-25 06:40:41.384242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:27.970 [2024-07-25 06:40:41.384259] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d9d40 00:24:27.970 [2024-07-25 06:40:41.384270] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:27.970 [2024-07-25 06:40:41.384571] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:27.970 [2024-07-25 06:40:41.384587] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:27.970 [2024-07-25 06:40:41.384640] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:24:27.970 [2024-07-25 06:40:41.384658] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:27.970 pt3 00:24:27.970 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:24:27.970 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:24:27.970 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:28.228 [2024-07-25 06:40:41.608767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:28.228 [2024-07-25 06:40:41.608798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:28.228 [2024-07-25 06:40:41.608813] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d9560 00:24:28.228 [2024-07-25 06:40:41.608823] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:28.228 [2024-07-25 06:40:41.609096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:28.228 [2024-07-25 06:40:41.609110] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:28.228 [2024-07-25 06:40:41.609165] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:28.228 [2024-07-25 06:40:41.609183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:28.228 [2024-07-25 06:40:41.609296] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2284af0 00:24:28.228 [2024-07-25 06:40:41.609306] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:28.228 [2024-07-25 06:40:41.609465] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20dd760 00:24:28.228 [2024-07-25 06:40:41.609589] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2284af0 00:24:28.228 [2024-07-25 06:40:41.609598] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2284af0 00:24:28.228 [2024-07-25 06:40:41.609687] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:28.228 pt4 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:24:28.228 
06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.228 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.486 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:28.486 "name": "raid_bdev1", 00:24:28.486 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:28.486 "strip_size_kb": 0, 00:24:28.486 "state": "online", 00:24:28.486 "raid_level": "raid1", 00:24:28.486 "superblock": true, 00:24:28.486 "num_base_bdevs": 4, 00:24:28.486 "num_base_bdevs_discovered": 4, 00:24:28.486 "num_base_bdevs_operational": 4, 00:24:28.486 "base_bdevs_list": [ 00:24:28.486 { 00:24:28.486 "name": "pt1", 00:24:28.486 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:28.486 "is_configured": true, 00:24:28.486 "data_offset": 2048, 00:24:28.486 "data_size": 63488 00:24:28.486 }, 00:24:28.486 { 00:24:28.486 "name": "pt2", 00:24:28.486 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:28.486 "is_configured": true, 00:24:28.486 "data_offset": 2048, 00:24:28.486 "data_size": 63488 00:24:28.486 }, 00:24:28.486 { 00:24:28.486 "name": "pt3", 00:24:28.486 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:28.486 "is_configured": true, 00:24:28.486 "data_offset": 2048, 00:24:28.486 "data_size": 63488 00:24:28.486 }, 00:24:28.486 { 00:24:28.486 "name": "pt4", 00:24:28.486 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:28.486 "is_configured": true, 00:24:28.486 "data_offset": 2048, 00:24:28.486 "data_size": 63488 00:24:28.486 } 00:24:28.486 ] 00:24:28.486 }' 00:24:28.486 06:40:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:28.486 06:40:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:29.053 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:29.054 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:29.320 [2024-07-25 06:40:42.631760] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:29.320 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:29.320 "name": "raid_bdev1", 00:24:29.320 "aliases": [ 00:24:29.320 "79c5bd43-3208-4f02-88a5-ed603d17858f" 00:24:29.321 ], 00:24:29.321 "product_name": "Raid Volume", 00:24:29.321 "block_size": 512, 00:24:29.321 "num_blocks": 63488, 00:24:29.321 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:29.321 "assigned_rate_limits": { 00:24:29.321 "rw_ios_per_sec": 0, 00:24:29.321 "rw_mbytes_per_sec": 0, 00:24:29.321 "r_mbytes_per_sec": 0, 00:24:29.321 "w_mbytes_per_sec": 0 00:24:29.321 }, 00:24:29.321 "claimed": false, 00:24:29.321 "zoned": false, 00:24:29.321 "supported_io_types": { 00:24:29.321 "read": true, 00:24:29.321 "write": true, 00:24:29.321 "unmap": false, 00:24:29.321 "flush": false, 00:24:29.321 "reset": true, 00:24:29.321 "nvme_admin": false, 00:24:29.321 "nvme_io": false, 00:24:29.321 "nvme_io_md": false, 00:24:29.321 "write_zeroes": true, 00:24:29.321 "zcopy": false, 00:24:29.321 "get_zone_info": false, 00:24:29.321 "zone_management": false, 00:24:29.321 "zone_append": false, 00:24:29.321 "compare": false, 00:24:29.321 "compare_and_write": false, 00:24:29.321 "abort": false, 00:24:29.321 "seek_hole": false, 00:24:29.321 "seek_data": false, 00:24:29.321 "copy": false, 00:24:29.321 "nvme_iov_md": false 00:24:29.321 }, 00:24:29.321 "memory_domains": [ 00:24:29.321 { 00:24:29.321 "dma_device_id": "system", 00:24:29.321 "dma_device_type": 1 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.321 "dma_device_type": 2 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "system", 00:24:29.321 "dma_device_type": 1 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.321 "dma_device_type": 2 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "system", 00:24:29.321 "dma_device_type": 1 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.321 "dma_device_type": 2 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "system", 00:24:29.321 "dma_device_type": 1 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.321 "dma_device_type": 2 00:24:29.321 } 00:24:29.321 ], 00:24:29.321 "driver_specific": { 00:24:29.321 "raid": { 00:24:29.321 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:29.321 "strip_size_kb": 0, 00:24:29.321 "state": "online", 00:24:29.321 "raid_level": "raid1", 00:24:29.321 "superblock": true, 00:24:29.321 "num_base_bdevs": 4, 00:24:29.321 "num_base_bdevs_discovered": 4, 00:24:29.321 "num_base_bdevs_operational": 4, 00:24:29.321 "base_bdevs_list": [ 00:24:29.321 { 00:24:29.321 "name": "pt1", 00:24:29.321 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:29.321 "is_configured": true, 00:24:29.321 "data_offset": 2048, 00:24:29.321 "data_size": 63488 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "name": "pt2", 00:24:29.321 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:24:29.321 "is_configured": true, 00:24:29.321 "data_offset": 2048, 00:24:29.321 "data_size": 63488 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "name": "pt3", 00:24:29.321 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:29.321 "is_configured": true, 00:24:29.321 "data_offset": 2048, 00:24:29.321 "data_size": 63488 00:24:29.321 }, 00:24:29.321 { 00:24:29.321 "name": "pt4", 00:24:29.321 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:29.321 "is_configured": true, 00:24:29.321 "data_offset": 2048, 00:24:29.321 "data_size": 63488 00:24:29.321 } 00:24:29.321 ] 00:24:29.321 } 00:24:29.321 } 00:24:29.321 }' 00:24:29.321 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:29.321 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:29.321 pt2 00:24:29.321 pt3 00:24:29.321 pt4' 00:24:29.321 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:29.321 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:29.321 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:29.612 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:29.612 "name": "pt1", 00:24:29.612 "aliases": [ 00:24:29.612 "00000000-0000-0000-0000-000000000001" 00:24:29.612 ], 00:24:29.612 "product_name": "passthru", 00:24:29.612 "block_size": 512, 00:24:29.612 "num_blocks": 65536, 00:24:29.612 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:29.612 "assigned_rate_limits": { 00:24:29.612 "rw_ios_per_sec": 0, 00:24:29.612 "rw_mbytes_per_sec": 0, 00:24:29.612 "r_mbytes_per_sec": 0, 00:24:29.612 "w_mbytes_per_sec": 0 00:24:29.612 }, 00:24:29.612 "claimed": true, 00:24:29.612 "claim_type": "exclusive_write", 00:24:29.612 "zoned": false, 00:24:29.612 "supported_io_types": { 00:24:29.612 "read": true, 00:24:29.612 "write": true, 00:24:29.612 "unmap": true, 00:24:29.612 "flush": true, 00:24:29.612 "reset": true, 00:24:29.612 "nvme_admin": false, 00:24:29.612 "nvme_io": false, 00:24:29.612 "nvme_io_md": false, 00:24:29.612 "write_zeroes": true, 00:24:29.612 "zcopy": true, 00:24:29.612 "get_zone_info": false, 00:24:29.612 "zone_management": false, 00:24:29.612 "zone_append": false, 00:24:29.612 "compare": false, 00:24:29.612 "compare_and_write": false, 00:24:29.612 "abort": true, 00:24:29.612 "seek_hole": false, 00:24:29.612 "seek_data": false, 00:24:29.612 "copy": true, 00:24:29.612 "nvme_iov_md": false 00:24:29.612 }, 00:24:29.612 "memory_domains": [ 00:24:29.612 { 00:24:29.612 "dma_device_id": "system", 00:24:29.612 "dma_device_type": 1 00:24:29.612 }, 00:24:29.612 { 00:24:29.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:29.612 "dma_device_type": 2 00:24:29.612 } 00:24:29.612 ], 00:24:29.612 "driver_specific": { 00:24:29.612 "passthru": { 00:24:29.612 "name": "pt1", 00:24:29.612 "base_bdev_name": "malloc1" 00:24:29.612 } 00:24:29.612 } 00:24:29.612 }' 00:24:29.612 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:29.612 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:29.612 06:40:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:29.612 06:40:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:29.612 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:29.612 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:29.612 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:29.612 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:29.870 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:30.128 "name": "pt2", 00:24:30.128 "aliases": [ 00:24:30.128 "00000000-0000-0000-0000-000000000002" 00:24:30.128 ], 00:24:30.128 "product_name": "passthru", 00:24:30.128 "block_size": 512, 00:24:30.128 "num_blocks": 65536, 00:24:30.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:30.128 "assigned_rate_limits": { 00:24:30.128 "rw_ios_per_sec": 0, 00:24:30.128 "rw_mbytes_per_sec": 0, 00:24:30.128 "r_mbytes_per_sec": 0, 00:24:30.128 "w_mbytes_per_sec": 0 00:24:30.128 }, 00:24:30.128 "claimed": true, 00:24:30.128 "claim_type": "exclusive_write", 00:24:30.128 "zoned": false, 00:24:30.128 "supported_io_types": { 00:24:30.128 "read": true, 00:24:30.128 "write": true, 00:24:30.128 "unmap": true, 00:24:30.128 "flush": true, 00:24:30.128 "reset": true, 00:24:30.128 "nvme_admin": false, 00:24:30.128 "nvme_io": false, 00:24:30.128 "nvme_io_md": false, 00:24:30.128 "write_zeroes": true, 00:24:30.128 "zcopy": true, 00:24:30.128 "get_zone_info": false, 00:24:30.128 "zone_management": false, 00:24:30.128 "zone_append": false, 00:24:30.128 "compare": false, 00:24:30.128 "compare_and_write": false, 00:24:30.128 "abort": true, 00:24:30.128 "seek_hole": false, 00:24:30.128 "seek_data": false, 00:24:30.128 "copy": true, 00:24:30.128 "nvme_iov_md": false 00:24:30.128 }, 00:24:30.128 "memory_domains": [ 00:24:30.128 { 00:24:30.128 "dma_device_id": "system", 00:24:30.128 "dma_device_type": 1 00:24:30.128 }, 00:24:30.128 { 00:24:30.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:30.128 "dma_device_type": 2 00:24:30.128 } 00:24:30.128 ], 00:24:30.128 "driver_specific": { 00:24:30.128 "passthru": { 00:24:30.128 "name": "pt2", 00:24:30.128 "base_bdev_name": "malloc2" 00:24:30.128 } 00:24:30.128 } 00:24:30.128 }' 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:30.128 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.385 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.385 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:30.385 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.385 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.385 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:30.386 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:30.386 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:30.386 06:40:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:24:30.643 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:30.643 "name": "pt3", 00:24:30.643 "aliases": [ 00:24:30.643 "00000000-0000-0000-0000-000000000003" 00:24:30.643 ], 00:24:30.643 "product_name": "passthru", 00:24:30.643 "block_size": 512, 00:24:30.643 "num_blocks": 65536, 00:24:30.643 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:30.643 "assigned_rate_limits": { 00:24:30.643 "rw_ios_per_sec": 0, 00:24:30.643 "rw_mbytes_per_sec": 0, 00:24:30.643 "r_mbytes_per_sec": 0, 00:24:30.643 "w_mbytes_per_sec": 0 00:24:30.643 }, 00:24:30.643 "claimed": true, 00:24:30.643 "claim_type": "exclusive_write", 00:24:30.643 "zoned": false, 00:24:30.643 "supported_io_types": { 00:24:30.643 "read": true, 00:24:30.643 "write": true, 00:24:30.643 "unmap": true, 00:24:30.643 "flush": true, 00:24:30.643 "reset": true, 00:24:30.643 "nvme_admin": false, 00:24:30.643 "nvme_io": false, 00:24:30.643 "nvme_io_md": false, 00:24:30.643 "write_zeroes": true, 00:24:30.643 "zcopy": true, 00:24:30.643 "get_zone_info": false, 00:24:30.643 "zone_management": false, 00:24:30.643 "zone_append": false, 00:24:30.643 "compare": false, 00:24:30.643 "compare_and_write": false, 00:24:30.643 "abort": true, 00:24:30.643 "seek_hole": false, 00:24:30.643 "seek_data": false, 00:24:30.643 "copy": true, 00:24:30.643 "nvme_iov_md": false 00:24:30.643 }, 00:24:30.643 "memory_domains": [ 00:24:30.643 { 00:24:30.643 "dma_device_id": "system", 00:24:30.643 "dma_device_type": 1 00:24:30.643 }, 00:24:30.643 { 00:24:30.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:30.643 "dma_device_type": 2 00:24:30.643 } 00:24:30.643 ], 00:24:30.643 "driver_specific": { 00:24:30.643 "passthru": { 00:24:30.643 "name": "pt3", 00:24:30.643 "base_bdev_name": "malloc3" 00:24:30.643 } 00:24:30.643 } 00:24:30.643 }' 00:24:30.643 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:30.643 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:30.643 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:30.643 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:24:30.901 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:31.159 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:31.159 "name": "pt4", 00:24:31.159 "aliases": [ 00:24:31.159 "00000000-0000-0000-0000-000000000004" 00:24:31.159 ], 00:24:31.159 "product_name": "passthru", 00:24:31.159 "block_size": 512, 00:24:31.159 "num_blocks": 65536, 00:24:31.159 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:31.159 "assigned_rate_limits": { 00:24:31.159 "rw_ios_per_sec": 0, 00:24:31.159 "rw_mbytes_per_sec": 0, 00:24:31.159 "r_mbytes_per_sec": 0, 00:24:31.159 "w_mbytes_per_sec": 0 00:24:31.159 }, 00:24:31.159 "claimed": true, 00:24:31.159 "claim_type": "exclusive_write", 00:24:31.159 "zoned": false, 00:24:31.159 "supported_io_types": { 00:24:31.159 "read": true, 00:24:31.159 "write": true, 00:24:31.159 "unmap": true, 00:24:31.159 "flush": true, 00:24:31.159 "reset": true, 00:24:31.159 "nvme_admin": false, 00:24:31.159 "nvme_io": false, 00:24:31.159 "nvme_io_md": false, 00:24:31.159 "write_zeroes": true, 00:24:31.159 "zcopy": true, 00:24:31.159 "get_zone_info": false, 00:24:31.159 "zone_management": false, 00:24:31.159 "zone_append": false, 00:24:31.159 "compare": false, 00:24:31.159 "compare_and_write": false, 00:24:31.159 "abort": true, 00:24:31.159 "seek_hole": false, 00:24:31.159 "seek_data": false, 00:24:31.159 "copy": true, 00:24:31.159 "nvme_iov_md": false 00:24:31.159 }, 00:24:31.159 "memory_domains": [ 00:24:31.159 { 00:24:31.159 "dma_device_id": "system", 00:24:31.159 "dma_device_type": 1 00:24:31.159 }, 00:24:31.159 { 00:24:31.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:31.159 "dma_device_type": 2 00:24:31.159 } 00:24:31.159 ], 00:24:31.159 "driver_specific": { 00:24:31.159 "passthru": { 00:24:31.159 "name": "pt4", 00:24:31.159 "base_bdev_name": "malloc4" 00:24:31.159 } 00:24:31.159 } 00:24:31.159 }' 00:24:31.159 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:31.159 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:31.159 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:31.159 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:31.417 06:40:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:31.417 06:40:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:24:31.675 [2024-07-25 06:40:45.170427] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:31.675 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 79c5bd43-3208-4f02-88a5-ed603d17858f '!=' 79c5bd43-3208-4f02-88a5-ed603d17858f ']' 00:24:31.675 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:24:31.675 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:31.675 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:31.675 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:31.933 [2024-07-25 06:40:45.398776] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.933 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.192 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.192 "name": "raid_bdev1", 00:24:32.192 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:32.192 "strip_size_kb": 0, 00:24:32.192 "state": "online", 00:24:32.192 "raid_level": "raid1", 00:24:32.192 "superblock": true, 00:24:32.192 "num_base_bdevs": 4, 00:24:32.192 "num_base_bdevs_discovered": 3, 00:24:32.192 "num_base_bdevs_operational": 3, 00:24:32.192 
"base_bdevs_list": [ 00:24:32.192 { 00:24:32.192 "name": null, 00:24:32.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:32.192 "is_configured": false, 00:24:32.192 "data_offset": 2048, 00:24:32.192 "data_size": 63488 00:24:32.192 }, 00:24:32.192 { 00:24:32.192 "name": "pt2", 00:24:32.192 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:32.192 "is_configured": true, 00:24:32.192 "data_offset": 2048, 00:24:32.192 "data_size": 63488 00:24:32.192 }, 00:24:32.192 { 00:24:32.192 "name": "pt3", 00:24:32.192 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:32.192 "is_configured": true, 00:24:32.192 "data_offset": 2048, 00:24:32.192 "data_size": 63488 00:24:32.192 }, 00:24:32.192 { 00:24:32.192 "name": "pt4", 00:24:32.192 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:32.192 "is_configured": true, 00:24:32.192 "data_offset": 2048, 00:24:32.192 "data_size": 63488 00:24:32.192 } 00:24:32.192 ] 00:24:32.192 }' 00:24:32.192 06:40:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.192 06:40:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:32.758 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:33.016 [2024-07-25 06:40:46.441501] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:33.016 [2024-07-25 06:40:46.441526] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:33.016 [2024-07-25 06:40:46.441581] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:33.016 [2024-07-25 06:40:46.441647] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:33.016 [2024-07-25 06:40:46.441657] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2284af0 name raid_bdev1, state offline 00:24:33.016 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.016 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:24:33.273 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:24:33.273 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:24:33.273 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:24:33.273 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:24:33.273 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:33.531 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:24:33.531 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:24:33.531 06:40:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:24:33.789 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:34.047 [2024-07-25 06:40:47.536308] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:34.047 [2024-07-25 06:40:47.536348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:34.047 [2024-07-25 06:40:47.536364] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2284d70 00:24:34.047 [2024-07-25 06:40:47.536375] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:34.047 [2024-07-25 06:40:47.537848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:34.047 [2024-07-25 06:40:47.537877] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:34.047 [2024-07-25 06:40:47.537936] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:34.047 [2024-07-25 06:40:47.537960] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:34.047 pt2 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.047 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.305 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.305 "name": "raid_bdev1", 00:24:34.305 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:34.305 "strip_size_kb": 0, 00:24:34.305 "state": "configuring", 00:24:34.305 "raid_level": "raid1", 00:24:34.305 "superblock": true, 
00:24:34.305 "num_base_bdevs": 4, 00:24:34.305 "num_base_bdevs_discovered": 1, 00:24:34.305 "num_base_bdevs_operational": 3, 00:24:34.305 "base_bdevs_list": [ 00:24:34.305 { 00:24:34.305 "name": null, 00:24:34.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.305 "is_configured": false, 00:24:34.305 "data_offset": 2048, 00:24:34.305 "data_size": 63488 00:24:34.305 }, 00:24:34.305 { 00:24:34.305 "name": "pt2", 00:24:34.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:34.305 "is_configured": true, 00:24:34.305 "data_offset": 2048, 00:24:34.305 "data_size": 63488 00:24:34.305 }, 00:24:34.305 { 00:24:34.305 "name": null, 00:24:34.305 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:34.305 "is_configured": false, 00:24:34.305 "data_offset": 2048, 00:24:34.305 "data_size": 63488 00:24:34.305 }, 00:24:34.305 { 00:24:34.305 "name": null, 00:24:34.305 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:34.305 "is_configured": false, 00:24:34.305 "data_offset": 2048, 00:24:34.305 "data_size": 63488 00:24:34.305 } 00:24:34.305 ] 00:24:34.305 }' 00:24:34.305 06:40:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.305 06:40:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:34.870 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:24:34.870 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:24:34.870 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:35.128 [2024-07-25 06:40:48.567055] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:35.128 [2024-07-25 06:40:48.567100] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:35.128 [2024-07-25 06:40:48.567116] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d3ea0 00:24:35.128 [2024-07-25 06:40:48.567127] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:35.128 [2024-07-25 06:40:48.567440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:35.128 [2024-07-25 06:40:48.567456] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:35.128 [2024-07-25 06:40:48.567512] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:24:35.128 [2024-07-25 06:40:48.567529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:35.128 pt3 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.128 06:40:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.128 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.386 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.386 "name": "raid_bdev1", 00:24:35.386 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:35.386 "strip_size_kb": 0, 00:24:35.386 "state": "configuring", 00:24:35.386 "raid_level": "raid1", 00:24:35.386 "superblock": true, 00:24:35.386 "num_base_bdevs": 4, 00:24:35.386 "num_base_bdevs_discovered": 2, 00:24:35.386 "num_base_bdevs_operational": 3, 00:24:35.386 "base_bdevs_list": [ 00:24:35.386 { 00:24:35.386 "name": null, 00:24:35.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.386 "is_configured": false, 00:24:35.386 "data_offset": 2048, 00:24:35.386 "data_size": 63488 00:24:35.386 }, 00:24:35.386 { 00:24:35.386 "name": "pt2", 00:24:35.386 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:35.386 "is_configured": true, 00:24:35.386 "data_offset": 2048, 00:24:35.386 "data_size": 63488 00:24:35.386 }, 00:24:35.386 { 00:24:35.386 "name": "pt3", 00:24:35.386 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:35.386 "is_configured": true, 00:24:35.386 "data_offset": 2048, 00:24:35.386 "data_size": 63488 00:24:35.386 }, 00:24:35.386 { 00:24:35.386 "name": null, 00:24:35.386 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:35.386 "is_configured": false, 00:24:35.386 "data_offset": 2048, 00:24:35.386 "data_size": 63488 00:24:35.386 } 00:24:35.386 ] 00:24:35.386 }' 00:24:35.386 06:40:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.387 06:40:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:35.953 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:24:35.953 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:24:35.953 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=3 00:24:35.953 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:36.210 [2024-07-25 06:40:49.601782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:36.210 [2024-07-25 06:40:49.601831] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:36.210 [2024-07-25 06:40:49.601849] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d5400 00:24:36.211 [2024-07-25 06:40:49.601861] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:36.211 [2024-07-25 06:40:49.602185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:36.211 [2024-07-25 06:40:49.602202] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:36.211 [2024-07-25 06:40:49.602259] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:36.211 [2024-07-25 06:40:49.602278] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:36.211 [2024-07-25 06:40:49.602383] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20d62e0 00:24:36.211 [2024-07-25 06:40:49.602392] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:36.211 [2024-07-25 06:40:49.602549] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d42f0 00:24:36.211 [2024-07-25 06:40:49.602670] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20d62e0 00:24:36.211 [2024-07-25 06:40:49.602679] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20d62e0 00:24:36.211 [2024-07-25 06:40:49.602772] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:36.211 pt4 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.211 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.469 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.469 "name": "raid_bdev1", 00:24:36.469 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:36.469 "strip_size_kb": 0, 00:24:36.469 "state": "online", 00:24:36.469 "raid_level": "raid1", 00:24:36.469 "superblock": true, 00:24:36.469 "num_base_bdevs": 4, 00:24:36.469 "num_base_bdevs_discovered": 3, 00:24:36.469 "num_base_bdevs_operational": 3, 00:24:36.469 "base_bdevs_list": [ 00:24:36.469 { 00:24:36.469 "name": null, 00:24:36.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.469 "is_configured": false, 00:24:36.469 "data_offset": 2048, 00:24:36.469 "data_size": 63488 00:24:36.469 }, 00:24:36.469 { 00:24:36.469 "name": "pt2", 00:24:36.469 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:36.469 "is_configured": true, 00:24:36.469 "data_offset": 2048, 00:24:36.469 "data_size": 63488 00:24:36.469 }, 00:24:36.469 { 00:24:36.469 "name": "pt3", 00:24:36.469 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:36.469 "is_configured": true, 00:24:36.469 "data_offset": 2048, 00:24:36.469 "data_size": 63488 00:24:36.469 
}, 00:24:36.469 { 00:24:36.469 "name": "pt4", 00:24:36.469 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:36.469 "is_configured": true, 00:24:36.469 "data_offset": 2048, 00:24:36.469 "data_size": 63488 00:24:36.469 } 00:24:36.469 ] 00:24:36.469 }' 00:24:36.469 06:40:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.469 06:40:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:37.034 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:37.292 [2024-07-25 06:40:50.640520] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:37.292 [2024-07-25 06:40:50.640546] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:37.292 [2024-07-25 06:40:50.640600] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:37.292 [2024-07-25 06:40:50.640662] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:37.292 [2024-07-25 06:40:50.640672] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d62e0 name raid_bdev1, state offline 00:24:37.292 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.292 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:24:37.549 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:24:37.549 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:24:37.549 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 4 -gt 2 ']' 00:24:37.549 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=3 00:24:37.549 06:40:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:37.808 [2024-07-25 06:40:51.334310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:37.808 [2024-07-25 06:40:51.334357] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.808 [2024-07-25 06:40:51.334373] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d4450 00:24:37.808 [2024-07-25 06:40:51.334384] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.808 [2024-07-25 06:40:51.335874] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.808 [2024-07-25 06:40:51.335904] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:37.808 [2024-07-25 06:40:51.335964] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:37.808 [2024-07-25 06:40:51.335988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:37.808 [2024-07-25 06:40:51.336079] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than 
existing raid bdev raid_bdev1 (2) 00:24:37.808 [2024-07-25 06:40:51.336091] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:37.808 [2024-07-25 06:40:51.336104] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20dca20 name raid_bdev1, state configuring 00:24:37.808 [2024-07-25 06:40:51.336124] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:37.808 [2024-07-25 06:40:51.336204] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:37.808 pt1 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 4 -gt 2 ']' 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.808 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.066 06:40:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:38.066 "name": "raid_bdev1", 00:24:38.066 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:38.066 "strip_size_kb": 0, 00:24:38.066 "state": "configuring", 00:24:38.066 "raid_level": "raid1", 00:24:38.066 "superblock": true, 00:24:38.066 "num_base_bdevs": 4, 00:24:38.066 "num_base_bdevs_discovered": 2, 00:24:38.066 "num_base_bdevs_operational": 3, 00:24:38.066 "base_bdevs_list": [ 00:24:38.066 { 00:24:38.066 "name": null, 00:24:38.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.066 "is_configured": false, 00:24:38.066 "data_offset": 2048, 00:24:38.066 "data_size": 63488 00:24:38.066 }, 00:24:38.066 { 00:24:38.066 "name": "pt2", 00:24:38.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:38.066 "is_configured": true, 00:24:38.066 "data_offset": 2048, 00:24:38.066 "data_size": 63488 00:24:38.066 }, 00:24:38.066 { 00:24:38.066 "name": "pt3", 00:24:38.066 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:38.066 "is_configured": true, 00:24:38.066 "data_offset": 2048, 00:24:38.066 "data_size": 63488 00:24:38.066 }, 00:24:38.066 { 00:24:38.066 "name": null, 00:24:38.066 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:38.066 "is_configured": false, 00:24:38.066 "data_offset": 2048, 00:24:38.066 "data_size": 63488 00:24:38.066 } 00:24:38.066 ] 00:24:38.066 }' 00:24:38.066 06:40:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:38.066 06:40:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:38.633 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:24:38.633 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:38.891 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:24:38.891 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:39.150 [2024-07-25 06:40:52.593617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:39.150 [2024-07-25 06:40:52.593667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.150 [2024-07-25 06:40:52.593684] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2282520 00:24:39.150 [2024-07-25 06:40:52.593696] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.150 [2024-07-25 06:40:52.594020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.150 [2024-07-25 06:40:52.594037] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:39.150 [2024-07-25 06:40:52.594095] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:39.150 [2024-07-25 06:40:52.594114] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:39.150 [2024-07-25 06:40:52.594232] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2282030 00:24:39.150 [2024-07-25 06:40:52.594242] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:39.150 [2024-07-25 06:40:52.594398] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20da660 00:24:39.150 [2024-07-25 06:40:52.594520] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2282030 00:24:39.150 [2024-07-25 06:40:52.594529] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2282030 00:24:39.150 [2024-07-25 06:40:52.594621] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.150 pt4 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.150 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.408 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.408 "name": "raid_bdev1", 00:24:39.408 "uuid": "79c5bd43-3208-4f02-88a5-ed603d17858f", 00:24:39.408 "strip_size_kb": 0, 00:24:39.408 "state": "online", 00:24:39.408 "raid_level": "raid1", 00:24:39.408 "superblock": true, 00:24:39.408 "num_base_bdevs": 4, 00:24:39.408 "num_base_bdevs_discovered": 3, 00:24:39.408 "num_base_bdevs_operational": 3, 00:24:39.408 "base_bdevs_list": [ 00:24:39.408 { 00:24:39.408 "name": null, 00:24:39.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.408 "is_configured": false, 00:24:39.408 "data_offset": 2048, 00:24:39.408 "data_size": 63488 00:24:39.408 }, 00:24:39.408 { 00:24:39.408 "name": "pt2", 00:24:39.408 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:39.408 "is_configured": true, 00:24:39.408 "data_offset": 2048, 00:24:39.408 "data_size": 63488 00:24:39.408 }, 00:24:39.408 { 00:24:39.408 "name": "pt3", 00:24:39.408 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:39.408 "is_configured": true, 00:24:39.408 "data_offset": 2048, 00:24:39.408 "data_size": 63488 00:24:39.408 }, 00:24:39.408 { 00:24:39.408 "name": "pt4", 00:24:39.408 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:39.408 "is_configured": true, 00:24:39.408 "data_offset": 2048, 00:24:39.408 "data_size": 63488 00:24:39.408 } 00:24:39.408 ] 00:24:39.408 }' 00:24:39.408 06:40:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.408 06:40:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:39.982 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:39.982 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:40.240 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:24:40.240 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:40.240 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:24:40.498 [2024-07-25 06:40:53.849224] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' 79c5bd43-3208-4f02-88a5-ed603d17858f '!=' 79c5bd43-3208-4f02-88a5-ed603d17858f ']' 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1214020 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1214020 ']' 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1214020 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:24:40.498 06:40:53 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1214020 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1214020' 00:24:40.498 killing process with pid 1214020 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1214020 00:24:40.498 [2024-07-25 06:40:53.930804] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:40.498 [2024-07-25 06:40:53.930857] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:40.498 [2024-07-25 06:40:53.930919] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:40.498 [2024-07-25 06:40:53.930929] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2282030 name raid_bdev1, state offline 00:24:40.498 06:40:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1214020 00:24:40.498 [2024-07-25 06:40:53.963482] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:40.757 06:40:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:24:40.757 00:24:40.757 real 0m24.063s 00:24:40.757 user 0m44.008s 00:24:40.757 sys 0m4.410s 00:24:40.757 06:40:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:40.757 06:40:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:40.757 ************************************ 00:24:40.757 END TEST raid_superblock_test 00:24:40.757 ************************************ 00:24:40.757 06:40:54 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:24:40.757 06:40:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:24:40.757 06:40:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:40.757 06:40:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:40.757 ************************************ 00:24:40.757 START TEST raid_read_error_test 00:24:40.757 ************************************ 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 read 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 
00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.t8MEm1LZrf 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1218646 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1218646 /var/tmp/spdk-raid.sock 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1218646 ']' 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:40.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:40.757 06:40:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:40.757 [2024-07-25 06:40:54.301759] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:24:40.757 [2024-07-25 06:40:54.301820] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1218646 ] 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.0 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.1 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.2 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.3 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.4 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.5 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.6 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:01.7 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.0 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.1 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.2 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.3 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.4 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.5 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.6 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3d:02.7 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.0 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.1 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.2 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.3 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.4 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.5 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.6 cannot be used 00:24:41.016 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:01.7 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.0 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.1 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.2 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.3 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.4 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.5 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.6 cannot be used 00:24:41.016 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:41.016 EAL: Requested device 0000:3f:02.7 cannot be used 00:24:41.016 [2024-07-25 06:40:54.424289] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:41.016 [2024-07-25 06:40:54.468388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:41.016 [2024-07-25 06:40:54.530435] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:41.016 [2024-07-25 06:40:54.530469] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:41.951 06:40:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:41.951 06:40:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:24:41.951 06:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:41.952 06:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:41.952 BaseBdev1_malloc 00:24:41.952 06:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:24:42.210 true 00:24:42.210 06:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:24:42.504 [2024-07-25 06:40:55.882903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:24:42.504 [2024-07-25 06:40:55.882948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:42.504 [2024-07-25 06:40:55.882966] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bfa60 00:24:42.504 [2024-07-25 06:40:55.882977] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:42.504 [2024-07-25 06:40:55.884405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:42.504 [2024-07-25 06:40:55.884434] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:42.504 BaseBdev1 00:24:42.504 06:40:55 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:42.504 06:40:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:42.784 BaseBdev2_malloc 00:24:42.784 06:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:24:43.042 true 00:24:43.042 06:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:24:43.042 [2024-07-25 06:40:56.572964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:24:43.042 [2024-07-25 06:40:56.573009] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:43.042 [2024-07-25 06:40:56.573027] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c4dc0 00:24:43.042 [2024-07-25 06:40:56.573039] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:43.042 [2024-07-25 06:40:56.574330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:43.042 [2024-07-25 06:40:56.574358] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:43.042 BaseBdev2 00:24:43.042 06:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:43.042 06:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:43.301 BaseBdev3_malloc 00:24:43.301 06:40:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:24:43.558 true 00:24:43.558 06:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:24:43.816 [2024-07-25 06:40:57.255027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:24:43.816 [2024-07-25 06:40:57.255066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:43.816 [2024-07-25 06:40:57.255081] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c5420 00:24:43.816 [2024-07-25 06:40:57.255093] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:43.816 [2024-07-25 06:40:57.256340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:43.816 [2024-07-25 06:40:57.256368] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:43.816 BaseBdev3 00:24:43.816 06:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:43.816 06:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:44.074 BaseBdev4_malloc 00:24:44.074 06:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:24:44.332 true 00:24:44.332 06:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:24:44.591 [2024-07-25 06:40:57.944983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:24:44.591 [2024-07-25 06:40:57.945022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:44.591 [2024-07-25 06:40:57.945039] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c89b0 00:24:44.591 [2024-07-25 06:40:57.945051] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:44.591 [2024-07-25 06:40:57.946293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:44.591 [2024-07-25 06:40:57.946318] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:44.591 BaseBdev4 00:24:44.591 06:40:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:24:44.849 [2024-07-25 06:40:58.169606] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:44.849 [2024-07-25 06:40:58.170649] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:44.849 [2024-07-25 06:40:58.170710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:44.849 [2024-07-25 06:40:58.170764] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:44.849 [2024-07-25 06:40:58.170970] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c8ec0 00:24:44.849 [2024-07-25 06:40:58.170981] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:44.849 [2024-07-25 06:40:58.171133] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x241b4b0 00:24:44.849 [2024-07-25 06:40:58.171277] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c8ec0 00:24:44.849 [2024-07-25 06:40:58.171286] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25c8ec0 00:24:44.849 [2024-07-25 06:40:58.171374] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
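For reference, the stack that the trace above has just assembled boils down to the following sketch. The socket path, rpc.py location, bdev names and flags are the ones shown in the trace itself; the shell loop is only an illustration and assumes a bdevperf instance is already listening on /var/tmp/spdk-raid.sock.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3 4; do
        # 32 MiB malloc bdev with a 512-byte block size
        $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        # error-injection bdev stacked on top of it (the trace shows it exposed as EE_BaseBdev<i>_malloc)
        $rpc bdev_error_create BaseBdev${i}_malloc
        # passthru bdev that gives the stack its final name, BaseBdev<i>
        $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    # assemble the four passthru bdevs into a raid1 bdev with an on-disk superblock (-s)
    $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

The bdev_raid_get_bdevs dump that follows confirms raid_bdev1 is online with all four base bdevs configured; a read failure is then injected into EE_BaseBdev1_malloc via bdev_error_inject_error, and the second dump further down shows the array still online with four operational base bdevs.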
00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.849 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.108 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.108 "name": "raid_bdev1", 00:24:45.108 "uuid": "1ea47669-9fe1-4a62-8c12-a1ca50d57817", 00:24:45.108 "strip_size_kb": 0, 00:24:45.108 "state": "online", 00:24:45.108 "raid_level": "raid1", 00:24:45.108 "superblock": true, 00:24:45.108 "num_base_bdevs": 4, 00:24:45.108 "num_base_bdevs_discovered": 4, 00:24:45.108 "num_base_bdevs_operational": 4, 00:24:45.108 "base_bdevs_list": [ 00:24:45.108 { 00:24:45.108 "name": "BaseBdev1", 00:24:45.108 "uuid": "0009f2b2-808b-5f0c-9e8a-82f49fc1f17e", 00:24:45.108 "is_configured": true, 00:24:45.108 "data_offset": 2048, 00:24:45.108 "data_size": 63488 00:24:45.108 }, 00:24:45.108 { 00:24:45.108 "name": "BaseBdev2", 00:24:45.108 "uuid": "22606d2d-8d1c-5307-8583-ea713cbad460", 00:24:45.108 "is_configured": true, 00:24:45.108 "data_offset": 2048, 00:24:45.108 "data_size": 63488 00:24:45.108 }, 00:24:45.108 { 00:24:45.108 "name": "BaseBdev3", 00:24:45.108 "uuid": "249151ed-fd2d-5eb8-8c65-b4b7782f8445", 00:24:45.108 "is_configured": true, 00:24:45.108 "data_offset": 2048, 00:24:45.108 "data_size": 63488 00:24:45.108 }, 00:24:45.108 { 00:24:45.108 "name": "BaseBdev4", 00:24:45.108 "uuid": "e85a4cb7-62af-5408-82b8-2c190316faaf", 00:24:45.108 "is_configured": true, 00:24:45.108 "data_offset": 2048, 00:24:45.108 "data_size": 63488 00:24:45.108 } 00:24:45.108 ] 00:24:45.108 }' 00:24:45.108 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.108 06:40:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:45.674 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:24:45.674 06:40:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:45.674 [2024-07-25 06:40:59.080248] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x241e3b0 00:24:46.609 06:40:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.867 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.126 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:47.126 "name": "raid_bdev1", 00:24:47.126 "uuid": "1ea47669-9fe1-4a62-8c12-a1ca50d57817", 00:24:47.126 "strip_size_kb": 0, 00:24:47.126 "state": "online", 00:24:47.126 "raid_level": "raid1", 00:24:47.126 "superblock": true, 00:24:47.126 "num_base_bdevs": 4, 00:24:47.126 "num_base_bdevs_discovered": 4, 00:24:47.126 "num_base_bdevs_operational": 4, 00:24:47.126 "base_bdevs_list": [ 00:24:47.126 { 00:24:47.126 "name": "BaseBdev1", 00:24:47.126 "uuid": "0009f2b2-808b-5f0c-9e8a-82f49fc1f17e", 00:24:47.126 "is_configured": true, 00:24:47.126 "data_offset": 2048, 00:24:47.126 "data_size": 63488 00:24:47.126 }, 00:24:47.126 { 00:24:47.126 "name": "BaseBdev2", 00:24:47.126 "uuid": "22606d2d-8d1c-5307-8583-ea713cbad460", 00:24:47.126 "is_configured": true, 00:24:47.126 "data_offset": 2048, 00:24:47.126 "data_size": 63488 00:24:47.126 }, 00:24:47.126 { 00:24:47.126 "name": "BaseBdev3", 00:24:47.126 "uuid": "249151ed-fd2d-5eb8-8c65-b4b7782f8445", 00:24:47.126 "is_configured": true, 00:24:47.126 "data_offset": 2048, 00:24:47.126 "data_size": 63488 00:24:47.126 }, 00:24:47.126 { 00:24:47.126 "name": "BaseBdev4", 00:24:47.126 "uuid": "e85a4cb7-62af-5408-82b8-2c190316faaf", 00:24:47.126 "is_configured": true, 00:24:47.126 "data_offset": 2048, 00:24:47.126 "data_size": 63488 00:24:47.126 } 00:24:47.126 ] 00:24:47.126 }' 00:24:47.126 06:41:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:47.126 06:41:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:47.692 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:47.692 [2024-07-25 06:41:01.242412] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:47.692 [2024-07-25 06:41:01.242448] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:47.692 [2024-07-25 06:41:01.245378] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:47.692 [2024-07-25 06:41:01.245417] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:47.692 [2024-07-25 06:41:01.245525] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, 
going to free all in destruct 00:24:47.692 [2024-07-25 06:41:01.245536] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c8ec0 name raid_bdev1, state offline 00:24:47.692 0 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1218646 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1218646 ']' 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1218646 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1218646 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1218646' 00:24:47.951 killing process with pid 1218646 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1218646 00:24:47.951 [2024-07-25 06:41:01.321262] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:47.951 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1218646 00:24:47.951 [2024-07-25 06:41:01.347827] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.t8MEm1LZrf 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:24:48.210 00:24:48.210 real 0m7.310s 00:24:48.210 user 0m11.694s 00:24:48.210 sys 0m1.268s 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:48.210 06:41:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:48.210 ************************************ 00:24:48.210 END TEST raid_read_error_test 00:24:48.210 ************************************ 00:24:48.210 06:41:01 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:24:48.210 06:41:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:24:48.210 06:41:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:48.210 06:41:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:48.210 ************************************ 00:24:48.210 START TEST raid_write_error_test 00:24:48.210 ************************************ 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # 
raid_io_error_test raid1 4 write 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:48.210 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.m45lobTwhv 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1219833 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1219833 /var/tmp/spdk-raid.sock 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' 
-z 1219833 ']' 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:48.211 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:48.211 06:41:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:48.211 [2024-07-25 06:41:01.683334] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:24:48.211 [2024-07-25 06:41:01.683391] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1219833 ] 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.0 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.1 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.2 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.3 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.4 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.5 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.6 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:01.7 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.0 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.1 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.2 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.3 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.4 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.5 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.6 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3d:02.7 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 
0000:3f:01.0 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.1 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.2 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.3 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.4 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.5 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.6 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:01.7 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.0 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.1 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.2 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.3 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.4 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.5 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.6 cannot be used 00:24:48.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:48.211 EAL: Requested device 0000:3f:02.7 cannot be used 00:24:48.469 [2024-07-25 06:41:01.820812] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:48.469 [2024-07-25 06:41:01.865989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:48.469 [2024-07-25 06:41:01.931839] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:48.469 [2024-07-25 06:41:01.931881] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:49.035 06:41:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:49.035 06:41:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:24:49.035 06:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:49.035 06:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:49.293 BaseBdev1_malloc 00:24:49.293 06:41:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:24:49.551 true 00:24:49.551 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:24:49.809 [2024-07-25 06:41:03.248689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:24:49.809 [2024-07-25 06:41:03.248732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:49.809 [2024-07-25 06:41:03.248749] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259fa60 00:24:49.809 [2024-07-25 06:41:03.248761] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:49.809 [2024-07-25 06:41:03.250294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:49.809 [2024-07-25 06:41:03.250322] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:49.809 BaseBdev1 00:24:49.809 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:49.809 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:50.066 BaseBdev2_malloc 00:24:50.066 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:24:50.323 true 00:24:50.323 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:24:50.581 [2024-07-25 06:41:03.894538] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:24:50.581 [2024-07-25 06:41:03.894576] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:50.581 [2024-07-25 06:41:03.894594] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a4dc0 00:24:50.581 [2024-07-25 06:41:03.894605] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:50.581 [2024-07-25 06:41:03.895961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:50.581 [2024-07-25 06:41:03.895992] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:50.581 BaseBdev2 00:24:50.581 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:50.581 06:41:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:50.581 BaseBdev3_malloc 00:24:50.838 06:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:24:50.838 true 00:24:50.838 06:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:24:51.096 [2024-07-25 06:41:04.576667] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:24:51.096 [2024-07-25 06:41:04.576706] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.096 [2024-07-25 06:41:04.576724] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a5420 00:24:51.096 [2024-07-25 06:41:04.576735] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.096 [2024-07-25 06:41:04.578121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.096 [2024-07-25 06:41:04.578154] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:51.096 BaseBdev3 00:24:51.096 06:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:24:51.096 06:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:51.353 BaseBdev4_malloc 00:24:51.353 06:41:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:24:51.611 true 00:24:51.611 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:24:51.869 [2024-07-25 06:41:05.250624] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:24:51.869 [2024-07-25 06:41:05.250665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.869 [2024-07-25 06:41:05.250684] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a89b0 00:24:51.869 [2024-07-25 06:41:05.250695] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.869 [2024-07-25 06:41:05.252110] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.869 [2024-07-25 06:41:05.252151] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:51.869 BaseBdev4 00:24:51.869 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:24:52.127 [2024-07-25 06:41:05.467221] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:52.127 [2024-07-25 06:41:05.468334] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:52.127 [2024-07-25 06:41:05.468398] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:52.127 [2024-07-25 06:41:05.468453] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:52.127 [2024-07-25 06:41:05.468665] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25a8ec0 00:24:52.127 [2024-07-25 06:41:05.468675] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:52.127 [2024-07-25 06:41:05.468843] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23fb4b0 00:24:52.127 [2024-07-25 06:41:05.468983] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25a8ec0 00:24:52.127 [2024-07-25 06:41:05.468997] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25a8ec0 00:24:52.127 [2024-07-25 06:41:05.469089] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:52.127 "name": "raid_bdev1", 00:24:52.127 "uuid": "6be9c07e-7a56-4114-828a-9c0f803a5ba0", 00:24:52.127 "strip_size_kb": 0, 00:24:52.127 "state": "online", 00:24:52.127 "raid_level": "raid1", 00:24:52.127 "superblock": true, 00:24:52.127 "num_base_bdevs": 4, 00:24:52.127 "num_base_bdevs_discovered": 4, 00:24:52.127 "num_base_bdevs_operational": 4, 00:24:52.127 "base_bdevs_list": [ 00:24:52.127 { 00:24:52.127 "name": "BaseBdev1", 00:24:52.127 "uuid": "a95f1921-af12-570e-8abd-f61213a9770e", 00:24:52.127 "is_configured": true, 00:24:52.127 "data_offset": 2048, 00:24:52.127 "data_size": 63488 00:24:52.127 }, 00:24:52.127 { 00:24:52.127 "name": "BaseBdev2", 00:24:52.127 "uuid": "7aa9c427-d303-5fb4-8d29-4dfe98e8fe93", 00:24:52.127 "is_configured": true, 00:24:52.127 "data_offset": 2048, 00:24:52.127 "data_size": 63488 00:24:52.127 }, 00:24:52.127 { 00:24:52.127 "name": "BaseBdev3", 00:24:52.127 "uuid": "9f561f52-e38a-51f6-be84-92351da20c7b", 00:24:52.127 "is_configured": true, 00:24:52.127 "data_offset": 2048, 00:24:52.127 "data_size": 63488 00:24:52.127 }, 00:24:52.127 { 00:24:52.127 "name": "BaseBdev4", 00:24:52.127 "uuid": "5d686e4b-218d-5452-9c24-cb66df64b9e4", 00:24:52.127 "is_configured": true, 00:24:52.127 "data_offset": 2048, 00:24:52.127 "data_size": 63488 00:24:52.127 } 00:24:52.127 ] 00:24:52.127 }' 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:52.127 06:41:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:53.060 06:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:24:53.060 06:41:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:53.060 [2024-07-25 06:41:06.614485] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23fe3b0 00:24:53.992 06:41:07 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:24:54.250 [2024-07-25 06:41:07.729664] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:24:54.250 [2024-07-25 06:41:07.729722] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:54.250 [2024-07-25 06:41:07.729928] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x23fe3b0 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=3 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.250 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.508 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.508 "name": "raid_bdev1", 00:24:54.508 "uuid": "6be9c07e-7a56-4114-828a-9c0f803a5ba0", 00:24:54.508 "strip_size_kb": 0, 00:24:54.508 "state": "online", 00:24:54.508 "raid_level": "raid1", 00:24:54.508 "superblock": true, 00:24:54.508 "num_base_bdevs": 4, 00:24:54.508 "num_base_bdevs_discovered": 3, 00:24:54.508 "num_base_bdevs_operational": 3, 00:24:54.508 "base_bdevs_list": [ 00:24:54.508 { 00:24:54.508 "name": null, 00:24:54.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.508 "is_configured": false, 00:24:54.508 "data_offset": 2048, 00:24:54.508 "data_size": 63488 00:24:54.508 }, 00:24:54.508 { 00:24:54.508 "name": "BaseBdev2", 00:24:54.508 "uuid": "7aa9c427-d303-5fb4-8d29-4dfe98e8fe93", 00:24:54.508 "is_configured": true, 00:24:54.508 "data_offset": 2048, 00:24:54.508 "data_size": 63488 00:24:54.508 }, 00:24:54.508 { 00:24:54.508 "name": "BaseBdev3", 00:24:54.508 "uuid": "9f561f52-e38a-51f6-be84-92351da20c7b", 
00:24:54.508 "is_configured": true, 00:24:54.508 "data_offset": 2048, 00:24:54.508 "data_size": 63488 00:24:54.508 }, 00:24:54.508 { 00:24:54.508 "name": "BaseBdev4", 00:24:54.508 "uuid": "5d686e4b-218d-5452-9c24-cb66df64b9e4", 00:24:54.508 "is_configured": true, 00:24:54.508 "data_offset": 2048, 00:24:54.508 "data_size": 63488 00:24:54.508 } 00:24:54.508 ] 00:24:54.508 }' 00:24:54.508 06:41:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.508 06:41:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:55.073 06:41:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:55.331 [2024-07-25 06:41:08.766849] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:55.331 [2024-07-25 06:41:08.766882] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:55.331 [2024-07-25 06:41:08.769816] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:55.331 [2024-07-25 06:41:08.769852] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:55.331 [2024-07-25 06:41:08.769939] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:55.331 [2024-07-25 06:41:08.769949] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25a8ec0 name raid_bdev1, state offline 00:24:55.331 0 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1219833 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1219833 ']' 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1219833 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1219833 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1219833' 00:24:55.331 killing process with pid 1219833 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1219833 00:24:55.331 [2024-07-25 06:41:08.826905] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:55.331 06:41:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1219833 00:24:55.331 [2024-07-25 06:41:08.853601] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.m45lobTwhv 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:24:55.591 00:24:55.591 real 0m7.442s 00:24:55.591 user 0m11.935s 00:24:55.591 sys 0m1.266s 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:55.591 06:41:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:55.591 ************************************ 00:24:55.591 END TEST raid_write_error_test 00:24:55.591 ************************************ 00:24:55.591 06:41:09 bdev_raid -- bdev/bdev_raid.sh@955 -- # '[' true = true ']' 00:24:55.591 06:41:09 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:24:55.591 06:41:09 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:24:55.591 06:41:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:55.591 06:41:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:55.591 06:41:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:55.591 ************************************ 00:24:55.591 START TEST raid_rebuild_test 00:24:55.591 ************************************ 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false false true 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:55.591 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:55.868 06:41:09 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:55.868 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1221258 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1221258 /var/tmp/spdk-raid.sock 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1221258 ']' 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:55.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:55.869 06:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:55.869 [2024-07-25 06:41:09.209961] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:24:55.869 [2024-07-25 06:41:09.210022] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1221258 ] 00:24:55.869 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:55.869 Zero copy mechanism will not be used. 
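The rebuild test that has just started builds a different stack, shown in the trace that follows; the sketch below only groups those same RPC calls together, and the comments are an interpretation of the flags rather than something the log states.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # two plain base bdevs this time: malloc + passthru, with no error bdev in between
    $rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    $rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
    $rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
    # a "spare" built on a delay bdev: no added read latency (-r 0 -t 0),
    # writes delayed by 100000 us (-w/-n), presumably so a rebuild onto it takes observable time
    $rpc bdev_malloc_create 32 512 -b spare_malloc
    $rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc bdev_passthru_create -b spare_delay -p spare
    # two-way raid1 without a superblock (no -s flag)
    $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

bdevperf itself was started above with -t 60 -w randrw -M 50 -o 3M -q 2 -U, i.e. 60 seconds of mixed random reads and writes with 3 MiB requests at a queue depth of 2 (the 'I/O size of 3145728' notice corresponds to the 3M request size); the spare bdev is presumably the device a base bdev will later be rebuilt onto.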
00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices / EAL: Requested device ... cannot be used (this message pair repeats at 00:24:55.869 for each QAT virtual function from 0000:3d:01.0 through 0000:3f:02.0; the remaining functions follow)
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.1 cannot be used 00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.2 cannot be used 00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.3 cannot be used 00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.4 cannot be used 00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.5 cannot be used 00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.6 cannot be used 00:24:55.869 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:24:55.869 EAL: Requested device 0000:3f:02.7 cannot be used 00:24:55.869 [2024-07-25 06:41:09.346660] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:55.869 [2024-07-25 06:41:09.390698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:56.139 [2024-07-25 06:41:09.450798] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:56.139 [2024-07-25 06:41:09.450842] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:56.707 06:41:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:56.707 06:41:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:24:56.707 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:56.707 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:56.965 BaseBdev1_malloc 00:24:56.965 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:56.965 [2024-07-25 06:41:10.452807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:56.965 [2024-07-25 06:41:10.452854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:56.965 [2024-07-25 06:41:10.452875] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b47b0 00:24:56.965 [2024-07-25 06:41:10.452886] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:56.965 [2024-07-25 06:41:10.454302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:56.965 [2024-07-25 06:41:10.454330] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:56.965 BaseBdev1 00:24:56.965 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:56.965 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:57.224 BaseBdev2_malloc 00:24:57.224 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:57.483 
[2024-07-25 06:41:10.910252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:57.483 [2024-07-25 06:41:10.910295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:57.483 [2024-07-25 06:41:10.910315] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24028f0 00:24:57.483 [2024-07-25 06:41:10.910326] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:57.483 [2024-07-25 06:41:10.911553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:57.483 [2024-07-25 06:41:10.911579] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:57.483 BaseBdev2 00:24:57.483 06:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:57.741 spare_malloc 00:24:57.741 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:58.000 spare_delay 00:24:58.000 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:58.259 [2024-07-25 06:41:11.592072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:58.259 [2024-07-25 06:41:11.592107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:58.259 [2024-07-25 06:41:11.592124] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23f9c10 00:24:58.259 [2024-07-25 06:41:11.592135] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:58.259 [2024-07-25 06:41:11.593386] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:58.259 [2024-07-25 06:41:11.593411] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:58.259 spare 00:24:58.259 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:58.518 [2024-07-25 06:41:11.820691] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:58.518 [2024-07-25 06:41:11.821768] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:58.518 [2024-07-25 06:41:11.821843] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23fad20 00:24:58.518 [2024-07-25 06:41:11.821853] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:58.518 [2024-07-25 06:41:11.822025] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23fd0e0 00:24:58.518 [2024-07-25 06:41:11.822159] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23fad20 00:24:58.518 [2024-07-25 06:41:11.822170] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23fad20 00:24:58.518 [2024-07-25 06:41:11.822264] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.518 06:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.777 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.777 "name": "raid_bdev1", 00:24:58.777 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:24:58.777 "strip_size_kb": 0, 00:24:58.777 "state": "online", 00:24:58.777 "raid_level": "raid1", 00:24:58.777 "superblock": false, 00:24:58.777 "num_base_bdevs": 2, 00:24:58.777 "num_base_bdevs_discovered": 2, 00:24:58.777 "num_base_bdevs_operational": 2, 00:24:58.777 "base_bdevs_list": [ 00:24:58.777 { 00:24:58.777 "name": "BaseBdev1", 00:24:58.777 "uuid": "40aec9b4-537a-556e-aa2d-55a8856f1ab6", 00:24:58.777 "is_configured": true, 00:24:58.777 "data_offset": 0, 00:24:58.777 "data_size": 65536 00:24:58.777 }, 00:24:58.777 { 00:24:58.777 "name": "BaseBdev2", 00:24:58.777 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:24:58.777 "is_configured": true, 00:24:58.777 "data_offset": 0, 00:24:58.777 "data_size": 65536 00:24:58.777 } 00:24:58.777 ] 00:24:58.777 }' 00:24:58.777 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.777 06:41:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:59.343 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:59.343 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:59.343 [2024-07-25 06:41:12.815521] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:59.343 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:24:59.343 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:59.343 06:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:59.910 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:00.171 [2024-07-25 06:41:13.561454] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23f8c00 00:25:00.171 /dev/nbd0 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:00.171 1+0 records in 00:25:00.171 1+0 records out 00:25:00.171 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245372 s, 16.7 MB/s 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test 
-- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:25:00.171 06:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:04.360 65536+0 records in 00:25:04.360 65536+0 records out 00:25:04.360 33554432 bytes (34 MB, 32 MiB) copied, 4.11736 s, 8.1 MB/s 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.361 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:04.619 [2024-07-25 06:41:17.988800] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:04.619 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:04.619 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:04.619 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:04.620 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.620 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.620 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:04.620 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:04.620 06:41:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.620 06:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:04.879 [2024-07-25 06:41:18.209422] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.879 
06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.879 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.138 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:05.138 "name": "raid_bdev1", 00:25:05.138 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:05.138 "strip_size_kb": 0, 00:25:05.138 "state": "online", 00:25:05.138 "raid_level": "raid1", 00:25:05.138 "superblock": false, 00:25:05.138 "num_base_bdevs": 2, 00:25:05.138 "num_base_bdevs_discovered": 1, 00:25:05.138 "num_base_bdevs_operational": 1, 00:25:05.138 "base_bdevs_list": [ 00:25:05.138 { 00:25:05.138 "name": null, 00:25:05.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.138 "is_configured": false, 00:25:05.138 "data_offset": 0, 00:25:05.138 "data_size": 65536 00:25:05.138 }, 00:25:05.138 { 00:25:05.138 "name": "BaseBdev2", 00:25:05.138 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:05.138 "is_configured": true, 00:25:05.138 "data_offset": 0, 00:25:05.138 "data_size": 65536 00:25:05.138 } 00:25:05.138 ] 00:25:05.138 }' 00:25:05.138 06:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:05.139 06:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:05.708 06:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:05.708 [2024-07-25 06:41:19.256196] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.708 [2024-07-25 06:41:19.260842] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a9bf0 00:25:05.708 [2024-07-25 06:41:19.262862] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:05.967 06:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.902 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.161 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.161 "name": "raid_bdev1", 00:25:07.161 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:07.161 "strip_size_kb": 0, 00:25:07.161 "state": "online", 00:25:07.161 "raid_level": "raid1", 00:25:07.161 "superblock": false, 00:25:07.161 "num_base_bdevs": 2, 00:25:07.161 "num_base_bdevs_discovered": 2, 00:25:07.161 "num_base_bdevs_operational": 2, 
00:25:07.161 "process": { 00:25:07.161 "type": "rebuild", 00:25:07.161 "target": "spare", 00:25:07.161 "progress": { 00:25:07.161 "blocks": 24576, 00:25:07.161 "percent": 37 00:25:07.161 } 00:25:07.161 }, 00:25:07.161 "base_bdevs_list": [ 00:25:07.161 { 00:25:07.161 "name": "spare", 00:25:07.161 "uuid": "52c73813-0e80-5def-ab99-b564952a0d35", 00:25:07.161 "is_configured": true, 00:25:07.161 "data_offset": 0, 00:25:07.161 "data_size": 65536 00:25:07.161 }, 00:25:07.161 { 00:25:07.161 "name": "BaseBdev2", 00:25:07.161 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:07.161 "is_configured": true, 00:25:07.161 "data_offset": 0, 00:25:07.161 "data_size": 65536 00:25:07.161 } 00:25:07.161 ] 00:25:07.161 }' 00:25:07.161 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.161 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:07.161 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.161 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:07.161 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:07.421 [2024-07-25 06:41:20.809594] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.421 [2024-07-25 06:41:20.874626] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:07.421 [2024-07-25 06:41:20.874670] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.421 [2024-07-25 06:41:20.874684] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.421 [2024-07-25 06:41:20.874692] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.421 06:41:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.680 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.680 "name": "raid_bdev1", 00:25:07.680 "uuid": 
"260bab70-748f-45df-afee-84213b8896bc", 00:25:07.680 "strip_size_kb": 0, 00:25:07.680 "state": "online", 00:25:07.680 "raid_level": "raid1", 00:25:07.680 "superblock": false, 00:25:07.680 "num_base_bdevs": 2, 00:25:07.680 "num_base_bdevs_discovered": 1, 00:25:07.680 "num_base_bdevs_operational": 1, 00:25:07.680 "base_bdevs_list": [ 00:25:07.680 { 00:25:07.680 "name": null, 00:25:07.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.680 "is_configured": false, 00:25:07.680 "data_offset": 0, 00:25:07.680 "data_size": 65536 00:25:07.680 }, 00:25:07.680 { 00:25:07.680 "name": "BaseBdev2", 00:25:07.680 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:07.680 "is_configured": true, 00:25:07.680 "data_offset": 0, 00:25:07.680 "data_size": 65536 00:25:07.680 } 00:25:07.680 ] 00:25:07.680 }' 00:25:07.680 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.680 06:41:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.248 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.507 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:08.507 "name": "raid_bdev1", 00:25:08.507 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:08.507 "strip_size_kb": 0, 00:25:08.507 "state": "online", 00:25:08.507 "raid_level": "raid1", 00:25:08.507 "superblock": false, 00:25:08.507 "num_base_bdevs": 2, 00:25:08.507 "num_base_bdevs_discovered": 1, 00:25:08.507 "num_base_bdevs_operational": 1, 00:25:08.507 "base_bdevs_list": [ 00:25:08.507 { 00:25:08.507 "name": null, 00:25:08.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:08.507 "is_configured": false, 00:25:08.507 "data_offset": 0, 00:25:08.508 "data_size": 65536 00:25:08.508 }, 00:25:08.508 { 00:25:08.508 "name": "BaseBdev2", 00:25:08.508 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:08.508 "is_configured": true, 00:25:08.508 "data_offset": 0, 00:25:08.508 "data_size": 65536 00:25:08.508 } 00:25:08.508 ] 00:25:08.508 }' 00:25:08.508 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:08.508 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:08.508 06:41:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.508 06:41:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:08.508 06:41:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.767 [2024-07-25 06:41:22.230524] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev spare is claimed 00:25:08.767 [2024-07-25 06:41:22.235245] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a9bf0 00:25:08.767 [2024-07-25 06:41:22.236584] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.767 06:41:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:09.703 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.703 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.703 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.703 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.703 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.961 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.962 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.962 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.962 "name": "raid_bdev1", 00:25:09.962 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:09.962 "strip_size_kb": 0, 00:25:09.962 "state": "online", 00:25:09.962 "raid_level": "raid1", 00:25:09.962 "superblock": false, 00:25:09.962 "num_base_bdevs": 2, 00:25:09.962 "num_base_bdevs_discovered": 2, 00:25:09.962 "num_base_bdevs_operational": 2, 00:25:09.962 "process": { 00:25:09.962 "type": "rebuild", 00:25:09.962 "target": "spare", 00:25:09.962 "progress": { 00:25:09.962 "blocks": 24576, 00:25:09.962 "percent": 37 00:25:09.962 } 00:25:09.962 }, 00:25:09.962 "base_bdevs_list": [ 00:25:09.962 { 00:25:09.962 "name": "spare", 00:25:09.962 "uuid": "52c73813-0e80-5def-ab99-b564952a0d35", 00:25:09.962 "is_configured": true, 00:25:09.962 "data_offset": 0, 00:25:09.962 "data_size": 65536 00:25:09.962 }, 00:25:09.962 { 00:25:09.962 "name": "BaseBdev2", 00:25:09.962 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:09.962 "is_configured": true, 00:25:09.962 "data_offset": 0, 00:25:09.962 "data_size": 65536 00:25:09.962 } 00:25:09.962 ] 00:25:09.962 }' 00:25:09.962 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=732 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.220 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.478 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.478 "name": "raid_bdev1", 00:25:10.478 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:10.478 "strip_size_kb": 0, 00:25:10.478 "state": "online", 00:25:10.478 "raid_level": "raid1", 00:25:10.478 "superblock": false, 00:25:10.478 "num_base_bdevs": 2, 00:25:10.478 "num_base_bdevs_discovered": 2, 00:25:10.478 "num_base_bdevs_operational": 2, 00:25:10.478 "process": { 00:25:10.478 "type": "rebuild", 00:25:10.478 "target": "spare", 00:25:10.478 "progress": { 00:25:10.478 "blocks": 30720, 00:25:10.478 "percent": 46 00:25:10.478 } 00:25:10.478 }, 00:25:10.478 "base_bdevs_list": [ 00:25:10.478 { 00:25:10.478 "name": "spare", 00:25:10.478 "uuid": "52c73813-0e80-5def-ab99-b564952a0d35", 00:25:10.478 "is_configured": true, 00:25:10.478 "data_offset": 0, 00:25:10.478 "data_size": 65536 00:25:10.478 }, 00:25:10.478 { 00:25:10.478 "name": "BaseBdev2", 00:25:10.478 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:10.478 "is_configured": true, 00:25:10.478 "data_offset": 0, 00:25:10.478 "data_size": 65536 00:25:10.478 } 00:25:10.478 ] 00:25:10.478 }' 00:25:10.478 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.478 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.478 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.478 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.478 06:41:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.445 06:41:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.704 06:41:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:25:11.704 "name": "raid_bdev1", 00:25:11.704 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:11.704 "strip_size_kb": 0, 00:25:11.704 "state": "online", 00:25:11.704 "raid_level": "raid1", 00:25:11.704 "superblock": false, 00:25:11.704 "num_base_bdevs": 2, 00:25:11.704 "num_base_bdevs_discovered": 2, 00:25:11.704 "num_base_bdevs_operational": 2, 00:25:11.704 "process": { 00:25:11.704 "type": "rebuild", 00:25:11.704 "target": "spare", 00:25:11.704 "progress": { 00:25:11.704 "blocks": 57344, 00:25:11.704 "percent": 87 00:25:11.704 } 00:25:11.704 }, 00:25:11.704 "base_bdevs_list": [ 00:25:11.704 { 00:25:11.704 "name": "spare", 00:25:11.704 "uuid": "52c73813-0e80-5def-ab99-b564952a0d35", 00:25:11.704 "is_configured": true, 00:25:11.704 "data_offset": 0, 00:25:11.704 "data_size": 65536 00:25:11.704 }, 00:25:11.704 { 00:25:11.704 "name": "BaseBdev2", 00:25:11.704 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:11.704 "is_configured": true, 00:25:11.704 "data_offset": 0, 00:25:11.704 "data_size": 65536 00:25:11.704 } 00:25:11.704 ] 00:25:11.704 }' 00:25:11.704 06:41:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:11.704 06:41:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:11.704 06:41:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:11.704 06:41:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:11.704 06:41:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:11.963 [2024-07-25 06:41:25.459747] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:11.963 [2024-07-25 06:41:25.459807] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:11.963 [2024-07-25 06:41:25.459843] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.898 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.156 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.156 "name": "raid_bdev1", 00:25:13.156 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:13.156 "strip_size_kb": 0, 00:25:13.156 "state": "online", 00:25:13.156 "raid_level": "raid1", 00:25:13.156 "superblock": false, 00:25:13.156 "num_base_bdevs": 2, 00:25:13.156 "num_base_bdevs_discovered": 2, 00:25:13.156 "num_base_bdevs_operational": 2, 00:25:13.156 "base_bdevs_list": [ 00:25:13.156 { 00:25:13.156 "name": "spare", 00:25:13.156 "uuid": 
"52c73813-0e80-5def-ab99-b564952a0d35", 00:25:13.156 "is_configured": true, 00:25:13.156 "data_offset": 0, 00:25:13.156 "data_size": 65536 00:25:13.156 }, 00:25:13.156 { 00:25:13.156 "name": "BaseBdev2", 00:25:13.156 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:13.156 "is_configured": true, 00:25:13.156 "data_offset": 0, 00:25:13.156 "data_size": 65536 00:25:13.156 } 00:25:13.156 ] 00:25:13.156 }' 00:25:13.156 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.157 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.415 "name": "raid_bdev1", 00:25:13.415 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:13.415 "strip_size_kb": 0, 00:25:13.415 "state": "online", 00:25:13.415 "raid_level": "raid1", 00:25:13.415 "superblock": false, 00:25:13.415 "num_base_bdevs": 2, 00:25:13.415 "num_base_bdevs_discovered": 2, 00:25:13.415 "num_base_bdevs_operational": 2, 00:25:13.415 "base_bdevs_list": [ 00:25:13.415 { 00:25:13.415 "name": "spare", 00:25:13.415 "uuid": "52c73813-0e80-5def-ab99-b564952a0d35", 00:25:13.415 "is_configured": true, 00:25:13.415 "data_offset": 0, 00:25:13.415 "data_size": 65536 00:25:13.415 }, 00:25:13.415 { 00:25:13.415 "name": "BaseBdev2", 00:25:13.415 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:13.415 "is_configured": true, 00:25:13.415 "data_offset": 0, 00:25:13.415 "data_size": 65536 00:25:13.415 } 00:25:13.415 ] 00:25:13.415 }' 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:13.415 
06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.415 06:41:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.674 06:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.674 "name": "raid_bdev1", 00:25:13.674 "uuid": "260bab70-748f-45df-afee-84213b8896bc", 00:25:13.674 "strip_size_kb": 0, 00:25:13.674 "state": "online", 00:25:13.674 "raid_level": "raid1", 00:25:13.674 "superblock": false, 00:25:13.674 "num_base_bdevs": 2, 00:25:13.674 "num_base_bdevs_discovered": 2, 00:25:13.674 "num_base_bdevs_operational": 2, 00:25:13.674 "base_bdevs_list": [ 00:25:13.674 { 00:25:13.674 "name": "spare", 00:25:13.674 "uuid": "52c73813-0e80-5def-ab99-b564952a0d35", 00:25:13.674 "is_configured": true, 00:25:13.674 "data_offset": 0, 00:25:13.674 "data_size": 65536 00:25:13.674 }, 00:25:13.674 { 00:25:13.674 "name": "BaseBdev2", 00:25:13.674 "uuid": "57f5b719-cb90-5593-983d-5f6334a2be4f", 00:25:13.674 "is_configured": true, 00:25:13.674 "data_offset": 0, 00:25:13.674 "data_size": 65536 00:25:13.674 } 00:25:13.674 ] 00:25:13.674 }' 00:25:13.674 06:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.674 06:41:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:14.241 06:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:14.499 [2024-07-25 06:41:27.894174] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:14.499 [2024-07-25 06:41:27.894200] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:14.499 [2024-07-25 06:41:27.894257] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:14.499 [2024-07-25 06:41:27.894310] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:14.499 [2024-07-25 06:41:27.894321] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23fad20 name raid_bdev1, state offline 00:25:14.499 06:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.499 06:41:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:14.758 06:41:28 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:14.758 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:15.016 /dev/nbd0 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.016 1+0 records in 00:25:15.016 1+0 records out 00:25:15.016 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241238 s, 17.0 MB/s 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.016 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:15.275 /dev/nbd1 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.275 1+0 records in 00:25:15.275 1+0 records out 00:25:15.275 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331519 s, 12.4 MB/s 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:15.275 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:15.533 06:41:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd0 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:15.533 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1221258 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 1221258 ']' 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1221258 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1221258 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1221258' 00:25:15.792 killing process with pid 1221258 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1221258 00:25:15.792 Received shutdown signal, test time was about 60.000000 seconds 00:25:15.792 00:25:15.792 Latency(us) 00:25:15.792 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:15.792 =================================================================================================================== 00:25:15.792 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:15.792 [2024-07-25 06:41:29.319399] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:15.792 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1221258 00:25:15.792 [2024-07-25 06:41:29.343012] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:16.050 06:41:29 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:25:16.050 00:25:16.050 real 0m20.386s 00:25:16.050 user 0m28.088s 00:25:16.050 sys 0m4.389s 00:25:16.050 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:16.050 06:41:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:16.051 ************************************ 00:25:16.051 END TEST raid_rebuild_test 00:25:16.051 ************************************ 00:25:16.051 06:41:29 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:25:16.051 06:41:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:16.051 06:41:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:16.051 06:41:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:16.310 ************************************ 00:25:16.310 START TEST raid_rebuild_test_sb 00:25:16.310 ************************************ 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:25:16.310 
06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1224931 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1224931 /var/tmp/spdk-raid.sock 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1224931 ']' 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:16.310 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:16.310 06:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:16.310 [2024-07-25 06:41:29.684389] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:25:16.310 [2024-07-25 06:41:29.684451] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1224931 ] 00:25:16.310 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:16.310 Zero copy mechanism will not be used. 
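The trace above shows the superblock variant starting the same bdevperf binary as an RPC-driven target: it listens on /var/tmp/spdk-raid.sock, is pointed at raid_bdev1 with a 60 s random read/write workload (3 MiB I/Os at queue depth 2), and -z keeps that workload idle until the test drives it over RPC. A minimal sketch of the launch-and-wait pattern follows, reusing the flags from the trace; the backgrounding, the $! capture and the waitforlisten helper (from autotest_common.sh, as seen in the trace) are an approximation of what the harness does, not its verbatim code:

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc_sock=/var/tmp/spdk-raid.sock
    # bdevperf as an RPC server: 60 s randrw against raid_bdev1, 3 MiB I/O, queue depth 2;
    # -z defers the actual I/O until the bdevs have been configured over the socket
    "$rootdir/build/examples/bdevperf" -r "$rpc_sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!
    waitforlisten "$raid_pid" "$rpc_sock"   # returns once the UNIX socket accepts RPCs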
00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.0 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.1 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.2 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.3 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.4 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.5 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.6 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:01.7 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.0 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.1 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.2 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.3 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.4 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.5 cannot be used 00:25:16.310 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.310 EAL: Requested device 0000:3d:02.6 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3d:02.7 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.0 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.1 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.2 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.3 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.4 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.5 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.6 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:01.7 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.0 cannot be used 00:25:16.311 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.1 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.2 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.3 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.4 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.5 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.6 cannot be used 00:25:16.311 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:16.311 EAL: Requested device 0000:3f:02.7 cannot be used 00:25:16.311 [2024-07-25 06:41:29.823294] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:16.569 [2024-07-25 06:41:29.866351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:16.569 [2024-07-25 06:41:29.926708] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:16.569 [2024-07-25 06:41:29.926745] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:17.136 06:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:17.136 06:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:25:17.136 06:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:17.136 06:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:17.394 BaseBdev1_malloc 00:25:17.394 06:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:17.653 [2024-07-25 06:41:30.962446] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:17.653 [2024-07-25 06:41:30.962491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.653 [2024-07-25 06:41:30.962512] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259d7b0 00:25:17.653 [2024-07-25 06:41:30.962523] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.653 [2024-07-25 06:41:30.964019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.653 [2024-07-25 06:41:30.964048] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:17.653 BaseBdev1 00:25:17.653 06:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:17.653 06:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:17.653 BaseBdev2_malloc 00:25:17.912 06:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 
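Each array member here is a malloc disk wrapped in a passthru vbdev, so the RAID code claims a stable name (BaseBdev1, BaseBdev2) that can later be released or re-examined independently of the backing memory disk; the spare built a few lines further down follows the same pattern with a delay bdev inserted between its malloc disk and the passthru. A compact restatement of the two command pairs above, assuming the rpc.py path and socket shown in the trace (the 32 and 512 arguments are the malloc size in MiB and its block size):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for bdev in BaseBdev1 BaseBdev2; do
        # 32 MiB malloc disk with 512-byte blocks, then a passthru vbdev on top of it
        $rpc bdev_malloc_create 32 512 -b "${bdev}_malloc"
        $rpc bdev_passthru_create -b "${bdev}_malloc" -p "$bdev"
    done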
00:25:17.912 [2024-07-25 06:41:31.419824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:17.912 [2024-07-25 06:41:31.419863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.912 [2024-07-25 06:41:31.419883] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23eb8f0 00:25:17.912 [2024-07-25 06:41:31.419894] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.912 [2024-07-25 06:41:31.421125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.912 [2024-07-25 06:41:31.421160] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:17.912 BaseBdev2 00:25:17.912 06:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:18.170 spare_malloc 00:25:18.170 06:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:18.429 spare_delay 00:25:18.429 06:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:18.687 [2024-07-25 06:41:32.101879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:18.687 [2024-07-25 06:41:32.101916] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.687 [2024-07-25 06:41:32.101933] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e2c10 00:25:18.687 [2024-07-25 06:41:32.101944] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.687 [2024-07-25 06:41:32.103192] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.687 [2024-07-25 06:41:32.103218] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:18.687 spare 00:25:18.687 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:18.946 [2024-07-25 06:41:32.334512] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:18.946 [2024-07-25 06:41:32.335529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:18.946 [2024-07-25 06:41:32.335668] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e3d20 00:25:18.946 [2024-07-25 06:41:32.335680] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:18.946 [2024-07-25 06:41:32.335831] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258fe20 00:25:18.946 [2024-07-25 06:41:32.335950] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e3d20 00:25:18.946 [2024-07-25 06:41:32.335960] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23e3d20 00:25:18.946 [2024-07-25 06:41:32.336040] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.946 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.204 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:19.205 "name": "raid_bdev1", 00:25:19.205 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:19.205 "strip_size_kb": 0, 00:25:19.205 "state": "online", 00:25:19.205 "raid_level": "raid1", 00:25:19.205 "superblock": true, 00:25:19.205 "num_base_bdevs": 2, 00:25:19.205 "num_base_bdevs_discovered": 2, 00:25:19.205 "num_base_bdevs_operational": 2, 00:25:19.205 "base_bdevs_list": [ 00:25:19.205 { 00:25:19.205 "name": "BaseBdev1", 00:25:19.205 "uuid": "377c727e-e6b9-50b6-8e1c-f31d24aa8da2", 00:25:19.205 "is_configured": true, 00:25:19.205 "data_offset": 2048, 00:25:19.205 "data_size": 63488 00:25:19.205 }, 00:25:19.205 { 00:25:19.205 "name": "BaseBdev2", 00:25:19.205 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:19.205 "is_configured": true, 00:25:19.205 "data_offset": 2048, 00:25:19.205 "data_size": 63488 00:25:19.205 } 00:25:19.205 ] 00:25:19.205 }' 00:25:19.205 06:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:19.205 06:41:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:19.771 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:19.771 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:20.028 [2024-07-25 06:41:33.369459] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:20.028 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:25:20.028 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.028 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:20.286 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:20.286 [2024-07-25 06:41:33.822467] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258fab0 00:25:20.286 /dev/nbd0 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:20.543 1+0 records in 00:25:20.543 1+0 records out 00:25:20.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228836 s, 17.9 MB/s 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@889 -- # return 0 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:25:20.543 06:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:25.807 63488+0 records in 00:25:25.807 63488+0 records out 00:25:25.807 32505856 bytes (33 MB, 31 MiB) copied, 4.89387 s, 6.6 MB/s 00:25:25.807 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:25.807 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:25.807 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:25.807 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:25.807 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:25.807 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:25.808 06:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:25.808 [2024-07-25 06:41:39.029907] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:25.808 [2024-07-25 06:41:39.254537] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.808 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.066 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.066 "name": "raid_bdev1", 00:25:26.066 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:26.066 "strip_size_kb": 0, 00:25:26.066 "state": "online", 00:25:26.066 "raid_level": "raid1", 00:25:26.066 "superblock": true, 00:25:26.066 "num_base_bdevs": 2, 00:25:26.066 "num_base_bdevs_discovered": 1, 00:25:26.066 "num_base_bdevs_operational": 1, 00:25:26.066 "base_bdevs_list": [ 00:25:26.066 { 00:25:26.066 "name": null, 00:25:26.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.066 "is_configured": false, 00:25:26.066 "data_offset": 2048, 00:25:26.066 "data_size": 63488 00:25:26.066 }, 00:25:26.066 { 00:25:26.066 "name": "BaseBdev2", 00:25:26.066 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:26.066 "is_configured": true, 00:25:26.066 "data_offset": 2048, 00:25:26.066 "data_size": 63488 00:25:26.066 } 00:25:26.066 ] 00:25:26.067 }' 00:25:26.067 06:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.067 06:41:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:26.633 06:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:26.892 [2024-07-25 06:41:40.293295] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:26.892 [2024-07-25 06:41:40.298017] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2592bf0 00:25:26.892 [2024-07-25 06:41:40.300085] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:26.892 06:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.888 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.146 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.146 "name": 
"raid_bdev1", 00:25:28.146 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:28.146 "strip_size_kb": 0, 00:25:28.146 "state": "online", 00:25:28.146 "raid_level": "raid1", 00:25:28.146 "superblock": true, 00:25:28.146 "num_base_bdevs": 2, 00:25:28.146 "num_base_bdevs_discovered": 2, 00:25:28.146 "num_base_bdevs_operational": 2, 00:25:28.146 "process": { 00:25:28.146 "type": "rebuild", 00:25:28.146 "target": "spare", 00:25:28.146 "progress": { 00:25:28.146 "blocks": 24576, 00:25:28.146 "percent": 38 00:25:28.146 } 00:25:28.146 }, 00:25:28.146 "base_bdevs_list": [ 00:25:28.146 { 00:25:28.146 "name": "spare", 00:25:28.146 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:28.146 "is_configured": true, 00:25:28.146 "data_offset": 2048, 00:25:28.146 "data_size": 63488 00:25:28.146 }, 00:25:28.146 { 00:25:28.146 "name": "BaseBdev2", 00:25:28.146 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:28.146 "is_configured": true, 00:25:28.146 "data_offset": 2048, 00:25:28.146 "data_size": 63488 00:25:28.146 } 00:25:28.146 ] 00:25:28.146 }' 00:25:28.146 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.146 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:28.146 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.146 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:28.146 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:28.404 [2024-07-25 06:41:41.851250] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:28.404 [2024-07-25 06:41:41.911856] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:28.404 [2024-07-25 06:41:41.911905] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:28.404 [2024-07-25 06:41:41.911919] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:28.404 [2024-07-25 06:41:41.911926] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.404 06:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.662 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:28.662 "name": "raid_bdev1", 00:25:28.662 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:28.662 "strip_size_kb": 0, 00:25:28.662 "state": "online", 00:25:28.662 "raid_level": "raid1", 00:25:28.662 "superblock": true, 00:25:28.662 "num_base_bdevs": 2, 00:25:28.662 "num_base_bdevs_discovered": 1, 00:25:28.662 "num_base_bdevs_operational": 1, 00:25:28.662 "base_bdevs_list": [ 00:25:28.662 { 00:25:28.662 "name": null, 00:25:28.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.662 "is_configured": false, 00:25:28.662 "data_offset": 2048, 00:25:28.662 "data_size": 63488 00:25:28.662 }, 00:25:28.662 { 00:25:28.662 "name": "BaseBdev2", 00:25:28.662 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:28.662 "is_configured": true, 00:25:28.662 "data_offset": 2048, 00:25:28.662 "data_size": 63488 00:25:28.662 } 00:25:28.662 ] 00:25:28.662 }' 00:25:28.662 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:28.662 06:41:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.228 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.486 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.486 "name": "raid_bdev1", 00:25:29.486 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:29.486 "strip_size_kb": 0, 00:25:29.486 "state": "online", 00:25:29.486 "raid_level": "raid1", 00:25:29.486 "superblock": true, 00:25:29.486 "num_base_bdevs": 2, 00:25:29.486 "num_base_bdevs_discovered": 1, 00:25:29.486 "num_base_bdevs_operational": 1, 00:25:29.486 "base_bdevs_list": [ 00:25:29.486 { 00:25:29.486 "name": null, 00:25:29.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.486 "is_configured": false, 00:25:29.486 "data_offset": 2048, 00:25:29.486 "data_size": 63488 00:25:29.486 }, 00:25:29.486 { 00:25:29.486 "name": "BaseBdev2", 00:25:29.486 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:29.486 "is_configured": true, 00:25:29.486 "data_offset": 2048, 00:25:29.486 "data_size": 63488 00:25:29.486 } 00:25:29.486 ] 00:25:29.486 }' 00:25:29.486 06:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.486 06:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:29.486 06:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 
-- # jq -r '.process.target // "none"' 00:25:29.744 06:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:29.745 06:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:29.745 [2024-07-25 06:41:43.259714] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:29.745 [2024-07-25 06:41:43.264453] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2592bf0 00:25:29.745 [2024-07-25 06:41:43.265844] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:29.745 06:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.114 "name": "raid_bdev1", 00:25:31.114 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:31.114 "strip_size_kb": 0, 00:25:31.114 "state": "online", 00:25:31.114 "raid_level": "raid1", 00:25:31.114 "superblock": true, 00:25:31.114 "num_base_bdevs": 2, 00:25:31.114 "num_base_bdevs_discovered": 2, 00:25:31.114 "num_base_bdevs_operational": 2, 00:25:31.114 "process": { 00:25:31.114 "type": "rebuild", 00:25:31.114 "target": "spare", 00:25:31.114 "progress": { 00:25:31.114 "blocks": 24576, 00:25:31.114 "percent": 38 00:25:31.114 } 00:25:31.114 }, 00:25:31.114 "base_bdevs_list": [ 00:25:31.114 { 00:25:31.114 "name": "spare", 00:25:31.114 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:31.114 "is_configured": true, 00:25:31.114 "data_offset": 2048, 00:25:31.114 "data_size": 63488 00:25:31.114 }, 00:25:31.114 { 00:25:31.114 "name": "BaseBdev2", 00:25:31.114 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:31.114 "is_configured": true, 00:25:31.114 "data_offset": 2048, 00:25:31.114 "data_size": 63488 00:25:31.114 } 00:25:31.114 ] 00:25:31.114 }' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 
00:25:31.114 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=753 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.114 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.371 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.371 "name": "raid_bdev1", 00:25:31.371 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:31.371 "strip_size_kb": 0, 00:25:31.371 "state": "online", 00:25:31.371 "raid_level": "raid1", 00:25:31.371 "superblock": true, 00:25:31.371 "num_base_bdevs": 2, 00:25:31.371 "num_base_bdevs_discovered": 2, 00:25:31.371 "num_base_bdevs_operational": 2, 00:25:31.371 "process": { 00:25:31.371 "type": "rebuild", 00:25:31.371 "target": "spare", 00:25:31.371 "progress": { 00:25:31.371 "blocks": 30720, 00:25:31.371 "percent": 48 00:25:31.371 } 00:25:31.371 }, 00:25:31.371 "base_bdevs_list": [ 00:25:31.371 { 00:25:31.371 "name": "spare", 00:25:31.371 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:31.371 "is_configured": true, 00:25:31.371 "data_offset": 2048, 00:25:31.371 "data_size": 63488 00:25:31.371 }, 00:25:31.371 { 00:25:31.371 "name": "BaseBdev2", 00:25:31.371 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:31.371 "is_configured": true, 00:25:31.371 "data_offset": 2048, 00:25:31.371 "data_size": 63488 00:25:31.371 } 00:25:31.371 ] 00:25:31.371 }' 00:25:31.371 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.371 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.371 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.371 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.371 06:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.743 06:41:45 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.743 06:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.743 06:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.743 "name": "raid_bdev1", 00:25:32.743 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:32.743 "strip_size_kb": 0, 00:25:32.743 "state": "online", 00:25:32.743 "raid_level": "raid1", 00:25:32.743 "superblock": true, 00:25:32.743 "num_base_bdevs": 2, 00:25:32.743 "num_base_bdevs_discovered": 2, 00:25:32.743 "num_base_bdevs_operational": 2, 00:25:32.743 "process": { 00:25:32.743 "type": "rebuild", 00:25:32.743 "target": "spare", 00:25:32.743 "progress": { 00:25:32.743 "blocks": 57344, 00:25:32.743 "percent": 90 00:25:32.743 } 00:25:32.743 }, 00:25:32.743 "base_bdevs_list": [ 00:25:32.743 { 00:25:32.743 "name": "spare", 00:25:32.743 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:32.743 "is_configured": true, 00:25:32.743 "data_offset": 2048, 00:25:32.743 "data_size": 63488 00:25:32.743 }, 00:25:32.743 { 00:25:32.743 "name": "BaseBdev2", 00:25:32.743 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:32.743 "is_configured": true, 00:25:32.743 "data_offset": 2048, 00:25:32.743 "data_size": 63488 00:25:32.743 } 00:25:32.743 ] 00:25:32.743 }' 00:25:32.743 06:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.743 06:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:32.743 06:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.743 06:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:32.743 06:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:33.000 [2024-07-25 06:41:46.388410] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:33.000 [2024-07-25 06:41:46.388463] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:33.000 [2024-07-25 06:41:46.388541] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:33.933 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:33.933 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:33.933 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.933 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:33.933 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:33.933 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:33.933 06:41:47 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.934 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.934 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.934 "name": "raid_bdev1", 00:25:33.934 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:33.934 "strip_size_kb": 0, 00:25:33.934 "state": "online", 00:25:33.934 "raid_level": "raid1", 00:25:33.934 "superblock": true, 00:25:33.934 "num_base_bdevs": 2, 00:25:33.934 "num_base_bdevs_discovered": 2, 00:25:33.934 "num_base_bdevs_operational": 2, 00:25:33.934 "base_bdevs_list": [ 00:25:33.934 { 00:25:33.934 "name": "spare", 00:25:33.934 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:33.934 "is_configured": true, 00:25:33.934 "data_offset": 2048, 00:25:33.934 "data_size": 63488 00:25:33.934 }, 00:25:33.934 { 00:25:33.934 "name": "BaseBdev2", 00:25:33.934 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:33.934 "is_configured": true, 00:25:33.934 "data_offset": 2048, 00:25:33.934 "data_size": 63488 00:25:33.934 } 00:25:33.934 ] 00:25:33.934 }' 00:25:33.934 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.192 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.451 "name": "raid_bdev1", 00:25:34.451 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:34.451 "strip_size_kb": 0, 00:25:34.451 "state": "online", 00:25:34.451 "raid_level": "raid1", 00:25:34.451 "superblock": true, 00:25:34.451 "num_base_bdevs": 2, 00:25:34.451 "num_base_bdevs_discovered": 2, 00:25:34.451 "num_base_bdevs_operational": 2, 00:25:34.451 "base_bdevs_list": [ 00:25:34.451 { 00:25:34.451 "name": "spare", 00:25:34.451 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:34.451 "is_configured": true, 00:25:34.451 "data_offset": 2048, 00:25:34.451 "data_size": 63488 00:25:34.451 }, 00:25:34.451 { 00:25:34.451 "name": "BaseBdev2", 00:25:34.451 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:34.451 "is_configured": true, 
00:25:34.451 "data_offset": 2048, 00:25:34.451 "data_size": 63488 00:25:34.451 } 00:25:34.451 ] 00:25:34.451 }' 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.451 06:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.709 06:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.709 "name": "raid_bdev1", 00:25:34.709 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:34.709 "strip_size_kb": 0, 00:25:34.709 "state": "online", 00:25:34.709 "raid_level": "raid1", 00:25:34.709 "superblock": true, 00:25:34.709 "num_base_bdevs": 2, 00:25:34.709 "num_base_bdevs_discovered": 2, 00:25:34.709 "num_base_bdevs_operational": 2, 00:25:34.709 "base_bdevs_list": [ 00:25:34.709 { 00:25:34.709 "name": "spare", 00:25:34.709 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:34.709 "is_configured": true, 00:25:34.709 "data_offset": 2048, 00:25:34.709 "data_size": 63488 00:25:34.709 }, 00:25:34.709 { 00:25:34.710 "name": "BaseBdev2", 00:25:34.710 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:34.710 "is_configured": true, 00:25:34.710 "data_offset": 2048, 00:25:34.710 "data_size": 63488 00:25:34.710 } 00:25:34.710 ] 00:25:34.710 }' 00:25:34.710 06:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.710 06:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:35.276 06:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:35.535 [2024-07-25 06:41:48.871537] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:35.535 [2024-07-25 06:41:48.871564] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:25:35.535 [2024-07-25 06:41:48.871622] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:35.535 [2024-07-25 06:41:48.871673] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:35.535 [2024-07-25 06:41:48.871683] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e3d20 name raid_bdev1, state offline 00:25:35.535 06:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.535 06:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:25:35.793 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:35.793 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:35.793 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:25:35.793 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:35.793 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:35.794 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:36.052 /dev/nbd0 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
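With the RAID bdev deleted, BaseBdev1 and the spare are exported over NBD so their payloads can be compared byte for byte; before any I/O the harness polls until the new /dev/nbdX node is actually usable, which is the grep/dd probe visible above. A simplified reconstruction of that readiness check (not the verbatim autotest_common.sh helper; the retry sleep and the /tmp scratch path are assumptions):

    waitfornbd() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            # wait until the kernel publishes the nbd device in /proc/partitions
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1
        done
        # prove the device answers I/O: a single 4 KiB O_DIRECT read must produce data
        dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
        [ "$(stat -c %s /tmp/nbdtest)" != 0 ]
    }

The data check a few lines below is then a single cmp -i 1048576 /dev/nbd0 /dev/nbd1: both devices are compared while skipping their first 1 MiB, which matches the data_offset of 2048 512-byte blocks reported for the array earlier.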
00:25:36.052 1+0 records in 00:25:36.052 1+0 records out 00:25:36.052 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235733 s, 17.4 MB/s 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:36.052 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:36.311 /dev/nbd1 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.311 1+0 records in 00:25:36.311 1+0 records out 00:25:36.311 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318109 s, 12.9 MB/s 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:36.311 06:41:49 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:36.311 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:36.569 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:36.569 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:36.569 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:36.569 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:36.570 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:36.570 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:36.570 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:36.570 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:36.570 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:36.570 06:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:25:36.828 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:37.086 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:37.344 [2024-07-25 06:41:50.691648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:37.345 [2024-07-25 06:41:50.691692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.345 [2024-07-25 
06:41:50.691715] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e2130 00:25:37.345 [2024-07-25 06:41:50.691727] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.345 [2024-07-25 06:41:50.693227] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.345 [2024-07-25 06:41:50.693254] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:37.345 [2024-07-25 06:41:50.693330] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:37.345 [2024-07-25 06:41:50.693355] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.345 [2024-07-25 06:41:50.693448] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:37.345 spare 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.345 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.345 [2024-07-25 06:41:50.793754] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x258fdf0 00:25:37.345 [2024-07-25 06:41:50.793768] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:37.345 [2024-07-25 06:41:50.793930] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ec5a0 00:25:37.345 [2024-07-25 06:41:50.794058] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x258fdf0 00:25:37.345 [2024-07-25 06:41:50.794067] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x258fdf0 00:25:37.345 [2024-07-25 06:41:50.794165] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:37.603 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:37.603 "name": "raid_bdev1", 00:25:37.603 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:37.603 "strip_size_kb": 0, 00:25:37.603 "state": "online", 00:25:37.603 "raid_level": "raid1", 00:25:37.603 "superblock": true, 00:25:37.603 "num_base_bdevs": 2, 00:25:37.603 "num_base_bdevs_discovered": 2, 00:25:37.603 "num_base_bdevs_operational": 2, 00:25:37.603 "base_bdevs_list": [ 00:25:37.603 { 
00:25:37.603 "name": "spare", 00:25:37.603 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:37.603 "is_configured": true, 00:25:37.603 "data_offset": 2048, 00:25:37.603 "data_size": 63488 00:25:37.603 }, 00:25:37.603 { 00:25:37.603 "name": "BaseBdev2", 00:25:37.603 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:37.603 "is_configured": true, 00:25:37.603 "data_offset": 2048, 00:25:37.603 "data_size": 63488 00:25:37.603 } 00:25:37.603 ] 00:25:37.603 }' 00:25:37.603 06:41:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:37.603 06:41:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.168 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.427 "name": "raid_bdev1", 00:25:38.427 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:38.427 "strip_size_kb": 0, 00:25:38.427 "state": "online", 00:25:38.427 "raid_level": "raid1", 00:25:38.427 "superblock": true, 00:25:38.427 "num_base_bdevs": 2, 00:25:38.427 "num_base_bdevs_discovered": 2, 00:25:38.427 "num_base_bdevs_operational": 2, 00:25:38.427 "base_bdevs_list": [ 00:25:38.427 { 00:25:38.427 "name": "spare", 00:25:38.427 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:38.427 "is_configured": true, 00:25:38.427 "data_offset": 2048, 00:25:38.427 "data_size": 63488 00:25:38.427 }, 00:25:38.427 { 00:25:38.427 "name": "BaseBdev2", 00:25:38.427 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:38.427 "is_configured": true, 00:25:38.427 "data_offset": 2048, 00:25:38.427 "data_size": 63488 00:25:38.427 } 00:25:38.427 ] 00:25:38.427 }' 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.427 06:41:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:38.685 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:25:38.685 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:38.944 [2024-07-25 06:41:52.287942] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.944 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.203 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.203 "name": "raid_bdev1", 00:25:39.203 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:39.203 "strip_size_kb": 0, 00:25:39.203 "state": "online", 00:25:39.203 "raid_level": "raid1", 00:25:39.203 "superblock": true, 00:25:39.203 "num_base_bdevs": 2, 00:25:39.203 "num_base_bdevs_discovered": 1, 00:25:39.203 "num_base_bdevs_operational": 1, 00:25:39.203 "base_bdevs_list": [ 00:25:39.203 { 00:25:39.203 "name": null, 00:25:39.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.203 "is_configured": false, 00:25:39.203 "data_offset": 2048, 00:25:39.203 "data_size": 63488 00:25:39.203 }, 00:25:39.203 { 00:25:39.203 "name": "BaseBdev2", 00:25:39.203 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:39.203 "is_configured": true, 00:25:39.203 "data_offset": 2048, 00:25:39.203 "data_size": 63488 00:25:39.203 } 00:25:39.203 ] 00:25:39.203 }' 00:25:39.203 06:41:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.203 06:41:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:39.770 06:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:39.770 [2024-07-25 06:41:53.318672] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:39.770 [2024-07-25 06:41:53.318815] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:39.770 [2024-07-25 06:41:53.318829] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
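Read together, the trace above is the remove-and-re-add cycle this test exercises: drop the 'spare' base bdev, confirm raid_bdev1 stays online with a single discovered base bdev, re-add the same bdev (its superblock sequence number 4 is older than the raid bdev's 5, so it is re-added rather than rejected), and then poll until a rebuild process targeting 'spare' shows up in bdev_raid_get_bdevs. Condensed into plain shell below — a hedged sketch only: the socket path, RPC names, bdev names and jq filters are copied from the log, while the variable names and the polling loop are illustrative and not part of the original test script.

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Degrade the array: remove the 'spare' base bdev from raid_bdev1.
    $rpc bdev_raid_remove_base_bdev spare

    # The raid1 bdev should stay online with only one base bdev discovered.
    $rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .state, .num_base_bdevs_discovered'

    # Re-add the same bdev; its superblock seq_number (4) is smaller than the
    # raid bdev's (5), so it is re-added and a rebuild is started.
    $rpc bdev_raid_add_base_bdev raid_bdev1 spare

    # Poll until the rebuild process appears and targets 'spare'.
    until $rpc bdev_raid_get_bdevs all \
      | jq -e '.[] | select(.name == "raid_bdev1") | .process.target == "spare"' >/dev/null; do
      sleep 1
    done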
00:25:39.770 [2024-07-25 06:41:53.318857] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:39.770 [2024-07-25 06:41:53.323423] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ec5a0 00:25:39.770 [2024-07-25 06:41:53.324692] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:40.030 06:41:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.964 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.258 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.258 "name": "raid_bdev1", 00:25:41.258 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:41.258 "strip_size_kb": 0, 00:25:41.258 "state": "online", 00:25:41.258 "raid_level": "raid1", 00:25:41.258 "superblock": true, 00:25:41.258 "num_base_bdevs": 2, 00:25:41.258 "num_base_bdevs_discovered": 2, 00:25:41.258 "num_base_bdevs_operational": 2, 00:25:41.258 "process": { 00:25:41.258 "type": "rebuild", 00:25:41.258 "target": "spare", 00:25:41.258 "progress": { 00:25:41.258 "blocks": 22528, 00:25:41.258 "percent": 35 00:25:41.258 } 00:25:41.258 }, 00:25:41.258 "base_bdevs_list": [ 00:25:41.258 { 00:25:41.258 "name": "spare", 00:25:41.258 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:41.258 "is_configured": true, 00:25:41.258 "data_offset": 2048, 00:25:41.258 "data_size": 63488 00:25:41.258 }, 00:25:41.258 { 00:25:41.258 "name": "BaseBdev2", 00:25:41.258 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:41.258 "is_configured": true, 00:25:41.258 "data_offset": 2048, 00:25:41.258 "data_size": 63488 00:25:41.258 } 00:25:41.258 ] 00:25:41.258 }' 00:25:41.258 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.258 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:41.258 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.258 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:41.258 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:41.546 [2024-07-25 06:41:54.819838] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:41.546 [2024-07-25 06:41:54.835664] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:41.546 [2024-07-25 06:41:54.835706] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:25:41.546 [2024-07-25 06:41:54.835721] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:41.546 [2024-07-25 06:41:54.835728] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.546 06:41:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.546 06:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.546 "name": "raid_bdev1", 00:25:41.546 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:41.546 "strip_size_kb": 0, 00:25:41.546 "state": "online", 00:25:41.546 "raid_level": "raid1", 00:25:41.546 "superblock": true, 00:25:41.546 "num_base_bdevs": 2, 00:25:41.546 "num_base_bdevs_discovered": 1, 00:25:41.547 "num_base_bdevs_operational": 1, 00:25:41.547 "base_bdevs_list": [ 00:25:41.547 { 00:25:41.547 "name": null, 00:25:41.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.547 "is_configured": false, 00:25:41.547 "data_offset": 2048, 00:25:41.547 "data_size": 63488 00:25:41.547 }, 00:25:41.547 { 00:25:41.547 "name": "BaseBdev2", 00:25:41.547 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:41.547 "is_configured": true, 00:25:41.547 "data_offset": 2048, 00:25:41.547 "data_size": 63488 00:25:41.547 } 00:25:41.547 ] 00:25:41.547 }' 00:25:41.547 06:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.547 06:41:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:42.481 06:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:42.481 [2024-07-25 06:41:55.894798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:42.481 [2024-07-25 06:41:55.894854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.481 [2024-07-25 06:41:55.894876] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259c480 00:25:42.481 [2024-07-25 06:41:55.894887] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:25:42.481 [2024-07-25 06:41:55.895242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.481 [2024-07-25 06:41:55.895260] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:42.481 [2024-07-25 06:41:55.895338] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:42.481 [2024-07-25 06:41:55.895349] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:42.481 [2024-07-25 06:41:55.895358] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:42.481 [2024-07-25 06:41:55.895375] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:42.481 [2024-07-25 06:41:55.899999] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259c750 00:25:42.481 spare 00:25:42.481 [2024-07-25 06:41:55.901285] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:42.481 06:41:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:25:43.415 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:43.416 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.416 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:43.416 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:43.416 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.416 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.416 06:41:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.674 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.674 "name": "raid_bdev1", 00:25:43.674 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:43.674 "strip_size_kb": 0, 00:25:43.674 "state": "online", 00:25:43.674 "raid_level": "raid1", 00:25:43.674 "superblock": true, 00:25:43.674 "num_base_bdevs": 2, 00:25:43.674 "num_base_bdevs_discovered": 2, 00:25:43.674 "num_base_bdevs_operational": 2, 00:25:43.674 "process": { 00:25:43.674 "type": "rebuild", 00:25:43.674 "target": "spare", 00:25:43.674 "progress": { 00:25:43.674 "blocks": 24576, 00:25:43.674 "percent": 38 00:25:43.674 } 00:25:43.674 }, 00:25:43.674 "base_bdevs_list": [ 00:25:43.674 { 00:25:43.674 "name": "spare", 00:25:43.674 "uuid": "d0949695-74dd-5e01-ad68-2615d0566835", 00:25:43.674 "is_configured": true, 00:25:43.674 "data_offset": 2048, 00:25:43.674 "data_size": 63488 00:25:43.674 }, 00:25:43.674 { 00:25:43.674 "name": "BaseBdev2", 00:25:43.674 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:43.674 "is_configured": true, 00:25:43.674 "data_offset": 2048, 00:25:43.674 "data_size": 63488 00:25:43.674 } 00:25:43.674 ] 00:25:43.674 }' 00:25:43.674 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.674 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:43.674 06:41:57 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:43.932 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:43.932 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:43.932 [2024-07-25 06:41:57.456865] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:44.190 [2024-07-25 06:41:57.512976] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:44.190 [2024-07-25 06:41:57.513019] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:44.190 [2024-07-25 06:41:57.513033] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:44.190 [2024-07-25 06:41:57.513041] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.190 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.448 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.448 "name": "raid_bdev1", 00:25:44.448 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:44.448 "strip_size_kb": 0, 00:25:44.448 "state": "online", 00:25:44.448 "raid_level": "raid1", 00:25:44.448 "superblock": true, 00:25:44.448 "num_base_bdevs": 2, 00:25:44.448 "num_base_bdevs_discovered": 1, 00:25:44.448 "num_base_bdevs_operational": 1, 00:25:44.448 "base_bdevs_list": [ 00:25:44.448 { 00:25:44.448 "name": null, 00:25:44.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.448 "is_configured": false, 00:25:44.448 "data_offset": 2048, 00:25:44.448 "data_size": 63488 00:25:44.448 }, 00:25:44.448 { 00:25:44.448 "name": "BaseBdev2", 00:25:44.448 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:44.448 "is_configured": true, 00:25:44.448 "data_offset": 2048, 00:25:44.448 "data_size": 63488 00:25:44.448 } 00:25:44.448 ] 00:25:44.448 }' 00:25:44.448 06:41:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.448 06:41:57 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.013 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.271 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.271 "name": "raid_bdev1", 00:25:45.271 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:45.271 "strip_size_kb": 0, 00:25:45.271 "state": "online", 00:25:45.271 "raid_level": "raid1", 00:25:45.271 "superblock": true, 00:25:45.271 "num_base_bdevs": 2, 00:25:45.271 "num_base_bdevs_discovered": 1, 00:25:45.271 "num_base_bdevs_operational": 1, 00:25:45.271 "base_bdevs_list": [ 00:25:45.271 { 00:25:45.271 "name": null, 00:25:45.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.271 "is_configured": false, 00:25:45.271 "data_offset": 2048, 00:25:45.271 "data_size": 63488 00:25:45.271 }, 00:25:45.271 { 00:25:45.271 "name": "BaseBdev2", 00:25:45.271 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:45.271 "is_configured": true, 00:25:45.271 "data_offset": 2048, 00:25:45.271 "data_size": 63488 00:25:45.271 } 00:25:45.271 ] 00:25:45.271 }' 00:25:45.271 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.271 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:45.271 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:45.271 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:45.271 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:45.529 06:41:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:45.787 [2024-07-25 06:41:59.105571] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:45.787 [2024-07-25 06:41:59.105617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:45.787 [2024-07-25 06:41:59.105636] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e48d0 00:25:45.787 [2024-07-25 06:41:59.105647] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:45.787 [2024-07-25 06:41:59.105978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:45.787 [2024-07-25 06:41:59.105995] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:45.787 [2024-07-25 06:41:59.106057] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:45.787 [2024-07-25 06:41:59.106068] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:45.787 [2024-07-25 06:41:59.106077] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:45.787 BaseBdev1 00:25:45.787 06:41:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.721 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.979 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:46.979 "name": "raid_bdev1", 00:25:46.979 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:46.979 "strip_size_kb": 0, 00:25:46.979 "state": "online", 00:25:46.979 "raid_level": "raid1", 00:25:46.979 "superblock": true, 00:25:46.979 "num_base_bdevs": 2, 00:25:46.979 "num_base_bdevs_discovered": 1, 00:25:46.979 "num_base_bdevs_operational": 1, 00:25:46.979 "base_bdevs_list": [ 00:25:46.979 { 00:25:46.979 "name": null, 00:25:46.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.979 "is_configured": false, 00:25:46.979 "data_offset": 2048, 00:25:46.979 "data_size": 63488 00:25:46.979 }, 00:25:46.979 { 00:25:46.979 "name": "BaseBdev2", 00:25:46.979 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:46.979 "is_configured": true, 00:25:46.979 "data_offset": 2048, 00:25:46.979 "data_size": 63488 00:25:46.979 } 00:25:46.979 ] 00:25:46.979 }' 00:25:46.979 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:46.979 06:42:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.545 06:42:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:47.803 "name": "raid_bdev1", 00:25:47.803 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:47.803 "strip_size_kb": 0, 00:25:47.803 "state": "online", 00:25:47.803 "raid_level": "raid1", 00:25:47.803 "superblock": true, 00:25:47.803 "num_base_bdevs": 2, 00:25:47.803 "num_base_bdevs_discovered": 1, 00:25:47.803 "num_base_bdevs_operational": 1, 00:25:47.803 "base_bdevs_list": [ 00:25:47.803 { 00:25:47.803 "name": null, 00:25:47.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:47.803 "is_configured": false, 00:25:47.803 "data_offset": 2048, 00:25:47.803 "data_size": 63488 00:25:47.803 }, 00:25:47.803 { 00:25:47.803 "name": "BaseBdev2", 00:25:47.803 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:47.803 "is_configured": true, 00:25:47.803 "data_offset": 2048, 00:25:47.803 "data_size": 63488 00:25:47.803 } 00:25:47.803 ] 00:25:47.803 }' 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:47.803 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:48.061 [2024-07-25 06:42:01.459790] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:48.061 [2024-07-25 06:42:01.459905] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:48.061 [2024-07-25 06:42:01.459921] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:48.061 request: 00:25:48.061 { 00:25:48.061 "base_bdev": "BaseBdev1", 00:25:48.061 "raid_bdev": "raid_bdev1", 00:25:48.061 "method": "bdev_raid_add_base_bdev", 00:25:48.061 "req_id": 1 00:25:48.061 } 00:25:48.061 Got JSON-RPC error response 00:25:48.061 response: 00:25:48.061 { 00:25:48.061 "code": -22, 00:25:48.061 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:48.061 } 00:25:48.061 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:25:48.061 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:25:48.061 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:25:48.061 06:42:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:25:48.061 06:42:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.995 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.253 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:49.253 "name": "raid_bdev1", 00:25:49.253 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:49.253 "strip_size_kb": 0, 00:25:49.253 "state": "online", 00:25:49.253 "raid_level": "raid1", 00:25:49.253 "superblock": true, 00:25:49.253 "num_base_bdevs": 2, 00:25:49.253 "num_base_bdevs_discovered": 1, 00:25:49.253 "num_base_bdevs_operational": 1, 00:25:49.253 
"base_bdevs_list": [ 00:25:49.253 { 00:25:49.253 "name": null, 00:25:49.253 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.253 "is_configured": false, 00:25:49.253 "data_offset": 2048, 00:25:49.253 "data_size": 63488 00:25:49.253 }, 00:25:49.253 { 00:25:49.253 "name": "BaseBdev2", 00:25:49.253 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:49.253 "is_configured": true, 00:25:49.253 "data_offset": 2048, 00:25:49.253 "data_size": 63488 00:25:49.253 } 00:25:49.253 ] 00:25:49.253 }' 00:25:49.253 06:42:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:49.253 06:42:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.820 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.078 "name": "raid_bdev1", 00:25:50.078 "uuid": "b32f1f8f-3170-45ea-8553-05019b318421", 00:25:50.078 "strip_size_kb": 0, 00:25:50.078 "state": "online", 00:25:50.078 "raid_level": "raid1", 00:25:50.078 "superblock": true, 00:25:50.078 "num_base_bdevs": 2, 00:25:50.078 "num_base_bdevs_discovered": 1, 00:25:50.078 "num_base_bdevs_operational": 1, 00:25:50.078 "base_bdevs_list": [ 00:25:50.078 { 00:25:50.078 "name": null, 00:25:50.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.078 "is_configured": false, 00:25:50.078 "data_offset": 2048, 00:25:50.078 "data_size": 63488 00:25:50.078 }, 00:25:50.078 { 00:25:50.078 "name": "BaseBdev2", 00:25:50.078 "uuid": "f7c4dc5d-c7ab-5953-875e-2852eddb9c9f", 00:25:50.078 "is_configured": true, 00:25:50.078 "data_offset": 2048, 00:25:50.078 "data_size": 63488 00:25:50.078 } 00:25:50.078 ] 00:25:50.078 }' 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1224931 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1224931 ']' 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1224931 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:25:50.078 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:50.078 06:42:03 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1224931 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1224931' 00:25:50.335 killing process with pid 1224931 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1224931 00:25:50.335 Received shutdown signal, test time was about 60.000000 seconds 00:25:50.335 00:25:50.335 Latency(us) 00:25:50.335 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:50.335 =================================================================================================================== 00:25:50.335 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:50.335 [2024-07-25 06:42:03.648367] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:50.335 [2024-07-25 06:42:03.648453] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:50.335 [2024-07-25 06:42:03.648494] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:50.335 [2024-07-25 06:42:03.648505] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258fdf0 name raid_bdev1, state offline 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1224931 00:25:50.335 [2024-07-25 06:42:03.672258] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:25:50.335 00:25:50.335 real 0m34.234s 00:25:50.335 user 0m49.371s 00:25:50.335 sys 0m6.452s 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:50.335 06:42:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:50.335 ************************************ 00:25:50.335 END TEST raid_rebuild_test_sb 00:25:50.335 ************************************ 00:25:50.594 06:42:03 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:25:50.594 06:42:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:50.594 06:42:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:50.594 06:42:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:50.594 ************************************ 00:25:50.594 START TEST raid_rebuild_test_io 00:25:50.594 ************************************ 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false true true 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( 
i = 1 )) 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1231251 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1231251 /var/tmp/spdk-raid.sock 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1231251 ']' 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:50.594 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:50.594 06:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:50.594 [2024-07-25 06:42:04.002685] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
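At this point the raid_rebuild_test_io variant starts its own bdevperf instance: background I/O (randrw at a 50% mix, 3 MiB I/Os at queue depth 2, 60 s runtime) is driven against raid_bdev1 over the private RPC socket while the rebuild path is exercised. The launch-and-wait pattern the trace corresponds to is sketched below — the bdevperf flags and socket path are copied from the log, while the readiness loop is an illustrative stand-in for the waitforlisten helper in common/autotest_common.sh, not its actual implementation.

    bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
    sock=/var/tmp/spdk-raid.sock

    # Start bdevperf in the background: random read/write, 50/50 mix, 3M I/O size,
    # queue depth 2, 60 s runtime, -z so it waits for bdevs to be created over RPC.
    $bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # Illustrative readiness check: block until the RPC socket answers.
    until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
    done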
00:25:50.594 [2024-07-25 06:42:04.002740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1231251 ] 00:25:50.594 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:50.594 Zero copy mechanism will not be used. 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.0 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.1 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.2 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.3 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.4 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.5 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.6 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:01.7 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.0 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.1 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.2 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.3 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.4 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.5 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.6 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3d:02.7 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.594 EAL: Requested device 0000:3f:01.0 cannot be used 00:25:50.594 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.1 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.2 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.3 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.4 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.5 cannot be used 00:25:50.595 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.6 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:01.7 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.0 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.1 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.2 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.3 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.4 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.5 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.6 cannot be used 00:25:50.595 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:25:50.595 EAL: Requested device 0000:3f:02.7 cannot be used 00:25:50.595 [2024-07-25 06:42:04.127137] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:50.853 [2024-07-25 06:42:04.172621] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:50.853 [2024-07-25 06:42:04.232820] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:50.853 [2024-07-25 06:42:04.232846] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:51.418 06:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:51.418 06:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:25:51.418 06:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:51.418 06:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:51.685 BaseBdev1_malloc 00:25:51.685 06:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:51.946 [2024-07-25 06:42:05.346530] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:51.946 [2024-07-25 06:42:05.346573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:51.946 [2024-07-25 06:42:05.346596] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21677b0 00:25:51.946 [2024-07-25 06:42:05.346608] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:51.946 [2024-07-25 06:42:05.348163] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:51.946 [2024-07-25 06:42:05.348189] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:51.946 BaseBdev1 00:25:51.946 06:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:51.946 06:42:05 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:52.204 BaseBdev2_malloc 00:25:52.204 06:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:52.471 [2024-07-25 06:42:05.792017] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:52.471 [2024-07-25 06:42:05.792057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:52.471 [2024-07-25 06:42:05.792078] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fb58f0 00:25:52.471 [2024-07-25 06:42:05.792090] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:52.471 [2024-07-25 06:42:05.793424] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:52.471 [2024-07-25 06:42:05.793454] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:52.471 BaseBdev2 00:25:52.471 06:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:52.471 spare_malloc 00:25:52.729 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:52.729 spare_delay 00:25:52.729 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:52.987 [2024-07-25 06:42:06.486024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:52.987 [2024-07-25 06:42:06.486062] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:52.988 [2024-07-25 06:42:06.486079] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1facc10 00:25:52.988 [2024-07-25 06:42:06.486091] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:52.988 [2024-07-25 06:42:06.487426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:52.988 [2024-07-25 06:42:06.487453] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:52.988 spare 00:25:52.988 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:53.246 [2024-07-25 06:42:06.710624] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:53.246 [2024-07-25 06:42:06.711753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:53.246 [2024-07-25 06:42:06.711827] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fadd20 00:25:53.246 [2024-07-25 06:42:06.711837] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:53.246 [2024-07-25 06:42:06.712027] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb00e0 00:25:53.246 [2024-07-25 
06:42:06.712164] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fadd20 00:25:53.246 [2024-07-25 06:42:06.712174] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fadd20 00:25:53.246 [2024-07-25 06:42:06.712272] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.246 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.505 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.505 "name": "raid_bdev1", 00:25:53.505 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:25:53.505 "strip_size_kb": 0, 00:25:53.505 "state": "online", 00:25:53.505 "raid_level": "raid1", 00:25:53.505 "superblock": false, 00:25:53.505 "num_base_bdevs": 2, 00:25:53.505 "num_base_bdevs_discovered": 2, 00:25:53.505 "num_base_bdevs_operational": 2, 00:25:53.505 "base_bdevs_list": [ 00:25:53.505 { 00:25:53.505 "name": "BaseBdev1", 00:25:53.505 "uuid": "d29a7c89-89f8-5f57-bb8a-1b855881f805", 00:25:53.505 "is_configured": true, 00:25:53.505 "data_offset": 0, 00:25:53.505 "data_size": 65536 00:25:53.505 }, 00:25:53.505 { 00:25:53.505 "name": "BaseBdev2", 00:25:53.505 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:25:53.505 "is_configured": true, 00:25:53.505 "data_offset": 0, 00:25:53.505 "data_size": 65536 00:25:53.505 } 00:25:53.505 ] 00:25:53.505 }' 00:25:53.505 06:42:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.505 06:42:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:54.072 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:54.072 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:54.331 [2024-07-25 06:42:07.745568] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:54.331 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:25:54.331 06:42:07 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.331 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:54.589 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:25:54.589 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:25:54.589 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:54.589 06:42:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:54.589 [2024-07-25 06:42:08.104248] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb0570 00:25:54.589 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:54.589 Zero copy mechanism will not be used. 00:25:54.589 Running I/O for 60 seconds... 00:25:54.910 [2024-07-25 06:42:08.208996] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:54.910 [2024-07-25 06:42:08.216492] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fb0570 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.910 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.170 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.170 "name": "raid_bdev1", 00:25:55.170 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:25:55.170 "strip_size_kb": 0, 00:25:55.170 "state": "online", 00:25:55.170 "raid_level": "raid1", 00:25:55.170 "superblock": false, 00:25:55.170 "num_base_bdevs": 2, 00:25:55.170 "num_base_bdevs_discovered": 1, 00:25:55.170 "num_base_bdevs_operational": 1, 00:25:55.170 "base_bdevs_list": [ 00:25:55.170 { 00:25:55.170 "name": null, 00:25:55.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.170 "is_configured": false, 00:25:55.170 "data_offset": 0, 00:25:55.170 "data_size": 65536 00:25:55.170 }, 
00:25:55.170 { 00:25:55.170 "name": "BaseBdev2", 00:25:55.170 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:25:55.170 "is_configured": true, 00:25:55.170 "data_offset": 0, 00:25:55.170 "data_size": 65536 00:25:55.170 } 00:25:55.170 ] 00:25:55.170 }' 00:25:55.170 06:42:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.170 06:42:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:55.737 06:42:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:55.995 [2024-07-25 06:42:09.308330] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:55.995 [2024-07-25 06:42:09.354569] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215cbf0 00:25:55.995 [2024-07-25 06:42:09.356890] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:55.995 06:42:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:55.996 [2024-07-25 06:42:09.473252] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:55.996 [2024-07-25 06:42:09.473532] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:56.254 [2024-07-25 06:42:09.598611] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:56.254 [2024-07-25 06:42:09.598729] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:56.514 [2024-07-25 06:42:09.943176] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:56.514 [2024-07-25 06:42:09.943389] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:56.773 [2024-07-25 06:42:10.161074] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:56.773 [2024-07-25 06:42:10.161315] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.032 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.032 [2024-07-25 06:42:10.482009] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:57.290 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:25:57.290 "name": "raid_bdev1", 00:25:57.290 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:25:57.290 "strip_size_kb": 0, 00:25:57.290 "state": "online", 00:25:57.290 "raid_level": "raid1", 00:25:57.290 "superblock": false, 00:25:57.290 "num_base_bdevs": 2, 00:25:57.290 "num_base_bdevs_discovered": 2, 00:25:57.290 "num_base_bdevs_operational": 2, 00:25:57.290 "process": { 00:25:57.290 "type": "rebuild", 00:25:57.290 "target": "spare", 00:25:57.290 "progress": { 00:25:57.290 "blocks": 14336, 00:25:57.290 "percent": 21 00:25:57.290 } 00:25:57.290 }, 00:25:57.290 "base_bdevs_list": [ 00:25:57.290 { 00:25:57.290 "name": "spare", 00:25:57.290 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:25:57.290 "is_configured": true, 00:25:57.290 "data_offset": 0, 00:25:57.290 "data_size": 65536 00:25:57.290 }, 00:25:57.290 { 00:25:57.290 "name": "BaseBdev2", 00:25:57.290 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:25:57.290 "is_configured": true, 00:25:57.290 "data_offset": 0, 00:25:57.290 "data_size": 65536 00:25:57.290 } 00:25:57.290 ] 00:25:57.290 }' 00:25:57.290 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.290 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:57.290 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.290 [2024-07-25 06:42:10.699024] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:57.290 [2024-07-25 06:42:10.699227] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:57.290 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:57.290 06:42:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:57.549 [2024-07-25 06:42:10.914605] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:57.549 [2024-07-25 06:42:11.011250] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:57.549 [2024-07-25 06:42:11.011498] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:57.807 [2024-07-25 06:42:11.119878] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:57.807 [2024-07-25 06:42:11.128980] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:57.807 [2024-07-25 06:42:11.129004] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:57.807 [2024-07-25 06:42:11.129013] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:57.807 [2024-07-25 06:42:11.149144] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fb0570 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:57.807 06:42:11 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.807 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.066 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.066 "name": "raid_bdev1", 00:25:58.066 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:25:58.066 "strip_size_kb": 0, 00:25:58.066 "state": "online", 00:25:58.066 "raid_level": "raid1", 00:25:58.066 "superblock": false, 00:25:58.066 "num_base_bdevs": 2, 00:25:58.066 "num_base_bdevs_discovered": 1, 00:25:58.066 "num_base_bdevs_operational": 1, 00:25:58.066 "base_bdevs_list": [ 00:25:58.066 { 00:25:58.066 "name": null, 00:25:58.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.066 "is_configured": false, 00:25:58.066 "data_offset": 0, 00:25:58.066 "data_size": 65536 00:25:58.066 }, 00:25:58.066 { 00:25:58.066 "name": "BaseBdev2", 00:25:58.066 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:25:58.066 "is_configured": true, 00:25:58.066 "data_offset": 0, 00:25:58.066 "data_size": 65536 00:25:58.066 } 00:25:58.066 ] 00:25:58.066 }' 00:25:58.066 06:42:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.066 06:42:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.633 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.892 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.892 "name": "raid_bdev1", 00:25:58.892 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:25:58.892 "strip_size_kb": 0, 00:25:58.892 "state": "online", 00:25:58.892 "raid_level": "raid1", 00:25:58.892 "superblock": false, 00:25:58.892 "num_base_bdevs": 2, 00:25:58.892 "num_base_bdevs_discovered": 1, 00:25:58.892 "num_base_bdevs_operational": 1, 00:25:58.892 
"base_bdevs_list": [ 00:25:58.892 { 00:25:58.892 "name": null, 00:25:58.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.892 "is_configured": false, 00:25:58.892 "data_offset": 0, 00:25:58.892 "data_size": 65536 00:25:58.892 }, 00:25:58.892 { 00:25:58.892 "name": "BaseBdev2", 00:25:58.892 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:25:58.892 "is_configured": true, 00:25:58.892 "data_offset": 0, 00:25:58.892 "data_size": 65536 00:25:58.892 } 00:25:58.892 ] 00:25:58.892 }' 00:25:58.892 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.892 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:58.892 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.892 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:58.892 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:59.151 [2024-07-25 06:42:12.580256] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:59.151 [2024-07-25 06:42:12.641027] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215cbf0 00:25:59.151 [2024-07-25 06:42:12.642381] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:59.151 06:42:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:59.409 [2024-07-25 06:42:12.794064] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:59.409 [2024-07-25 06:42:12.925874] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:59.409 [2024-07-25 06:42:12.926076] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:59.976 [2024-07-25 06:42:13.246619] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:59.976 [2024-07-25 06:42:13.246849] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:59.976 [2024-07-25 06:42:13.449642] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:59.976 [2024-07-25 06:42:13.449793] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:00.234 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.234 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.234 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.234 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.234 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.234 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.234 06:42:13 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.492 "name": "raid_bdev1", 00:26:00.492 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:00.492 "strip_size_kb": 0, 00:26:00.492 "state": "online", 00:26:00.492 "raid_level": "raid1", 00:26:00.492 "superblock": false, 00:26:00.492 "num_base_bdevs": 2, 00:26:00.492 "num_base_bdevs_discovered": 2, 00:26:00.492 "num_base_bdevs_operational": 2, 00:26:00.492 "process": { 00:26:00.492 "type": "rebuild", 00:26:00.492 "target": "spare", 00:26:00.492 "progress": { 00:26:00.492 "blocks": 14336, 00:26:00.492 "percent": 21 00:26:00.492 } 00:26:00.492 }, 00:26:00.492 "base_bdevs_list": [ 00:26:00.492 { 00:26:00.492 "name": "spare", 00:26:00.492 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:00.492 "is_configured": true, 00:26:00.492 "data_offset": 0, 00:26:00.492 "data_size": 65536 00:26:00.492 }, 00:26:00.492 { 00:26:00.492 "name": "BaseBdev2", 00:26:00.492 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:00.492 "is_configured": true, 00:26:00.492 "data_offset": 0, 00:26:00.492 "data_size": 65536 00:26:00.492 } 00:26:00.492 ] 00:26:00.492 }' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=782 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.492 06:42:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.750 06:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.750 "name": "raid_bdev1", 00:26:00.750 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:00.750 "strip_size_kb": 0, 00:26:00.750 "state": "online", 00:26:00.750 "raid_level": "raid1", 
00:26:00.750 "superblock": false, 00:26:00.750 "num_base_bdevs": 2, 00:26:00.750 "num_base_bdevs_discovered": 2, 00:26:00.750 "num_base_bdevs_operational": 2, 00:26:00.750 "process": { 00:26:00.750 "type": "rebuild", 00:26:00.750 "target": "spare", 00:26:00.750 "progress": { 00:26:00.750 "blocks": 20480, 00:26:00.750 "percent": 31 00:26:00.750 } 00:26:00.750 }, 00:26:00.750 "base_bdevs_list": [ 00:26:00.750 { 00:26:00.750 "name": "spare", 00:26:00.750 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:00.750 "is_configured": true, 00:26:00.750 "data_offset": 0, 00:26:00.750 "data_size": 65536 00:26:00.750 }, 00:26:00.750 { 00:26:00.750 "name": "BaseBdev2", 00:26:00.750 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:00.750 "is_configured": true, 00:26:00.750 "data_offset": 0, 00:26:00.750 "data_size": 65536 00:26:00.750 } 00:26:00.750 ] 00:26:00.750 }' 00:26:00.750 06:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.750 06:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.750 06:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.750 [2024-07-25 06:42:14.262829] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:00.750 06:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.750 06:42:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.125 "name": "raid_bdev1", 00:26:02.125 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:02.125 "strip_size_kb": 0, 00:26:02.125 "state": "online", 00:26:02.125 "raid_level": "raid1", 00:26:02.125 "superblock": false, 00:26:02.125 "num_base_bdevs": 2, 00:26:02.125 "num_base_bdevs_discovered": 2, 00:26:02.125 "num_base_bdevs_operational": 2, 00:26:02.125 "process": { 00:26:02.125 "type": "rebuild", 00:26:02.125 "target": "spare", 00:26:02.125 "progress": { 00:26:02.125 "blocks": 43008, 00:26:02.125 "percent": 65 00:26:02.125 } 00:26:02.125 }, 00:26:02.125 "base_bdevs_list": [ 00:26:02.125 { 00:26:02.125 "name": "spare", 00:26:02.125 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:02.125 "is_configured": true, 00:26:02.125 "data_offset": 0, 00:26:02.125 "data_size": 65536 00:26:02.125 }, 00:26:02.125 { 00:26:02.125 "name": "BaseBdev2", 
00:26:02.125 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:02.125 "is_configured": true, 00:26:02.125 "data_offset": 0, 00:26:02.125 "data_size": 65536 00:26:02.125 } 00:26:02.125 ] 00:26:02.125 }' 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:02.125 06:42:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:02.125 [2024-07-25 06:42:15.676445] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:02.383 [2024-07-25 06:42:15.903426] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:02.950 [2024-07-25 06:42:16.364697] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.208 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.467 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:03.467 "name": "raid_bdev1", 00:26:03.467 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:03.467 "strip_size_kb": 0, 00:26:03.467 "state": "online", 00:26:03.467 "raid_level": "raid1", 00:26:03.467 "superblock": false, 00:26:03.467 "num_base_bdevs": 2, 00:26:03.467 "num_base_bdevs_discovered": 2, 00:26:03.467 "num_base_bdevs_operational": 2, 00:26:03.467 "process": { 00:26:03.467 "type": "rebuild", 00:26:03.467 "target": "spare", 00:26:03.467 "progress": { 00:26:03.467 "blocks": 63488, 00:26:03.467 "percent": 96 00:26:03.467 } 00:26:03.467 }, 00:26:03.467 "base_bdevs_list": [ 00:26:03.467 { 00:26:03.467 "name": "spare", 00:26:03.467 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:03.467 "is_configured": true, 00:26:03.467 "data_offset": 0, 00:26:03.467 "data_size": 65536 00:26:03.467 }, 00:26:03.467 { 00:26:03.467 "name": "BaseBdev2", 00:26:03.467 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:03.467 "is_configured": true, 00:26:03.467 "data_offset": 0, 00:26:03.467 "data_size": 65536 00:26:03.467 } 00:26:03.467 ] 00:26:03.467 }' 00:26:03.467 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:03.467 06:42:16 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:03.467 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:03.467 [2024-07-25 06:42:16.908476] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:03.467 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:03.467 06:42:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:03.467 [2024-07-25 06:42:17.008710] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:03.467 [2024-07-25 06:42:17.017819] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.402 06:42:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.660 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.660 "name": "raid_bdev1", 00:26:04.660 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:04.660 "strip_size_kb": 0, 00:26:04.660 "state": "online", 00:26:04.660 "raid_level": "raid1", 00:26:04.660 "superblock": false, 00:26:04.660 "num_base_bdevs": 2, 00:26:04.660 "num_base_bdevs_discovered": 2, 00:26:04.660 "num_base_bdevs_operational": 2, 00:26:04.660 "base_bdevs_list": [ 00:26:04.661 { 00:26:04.661 "name": "spare", 00:26:04.661 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:04.661 "is_configured": true, 00:26:04.661 "data_offset": 0, 00:26:04.661 "data_size": 65536 00:26:04.661 }, 00:26:04.661 { 00:26:04.661 "name": "BaseBdev2", 00:26:04.661 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:04.661 "is_configured": true, 00:26:04.661 "data_offset": 0, 00:26:04.661 "data_size": 65536 00:26:04.661 } 00:26:04.661 ] 00:26:04.661 }' 00:26:04.661 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.919 
06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.919 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.920 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:05.178 "name": "raid_bdev1", 00:26:05.178 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:05.178 "strip_size_kb": 0, 00:26:05.178 "state": "online", 00:26:05.178 "raid_level": "raid1", 00:26:05.178 "superblock": false, 00:26:05.178 "num_base_bdevs": 2, 00:26:05.178 "num_base_bdevs_discovered": 2, 00:26:05.178 "num_base_bdevs_operational": 2, 00:26:05.178 "base_bdevs_list": [ 00:26:05.178 { 00:26:05.178 "name": "spare", 00:26:05.178 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:05.178 "is_configured": true, 00:26:05.178 "data_offset": 0, 00:26:05.178 "data_size": 65536 00:26:05.178 }, 00:26:05.178 { 00:26:05.178 "name": "BaseBdev2", 00:26:05.178 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:05.178 "is_configured": true, 00:26:05.178 "data_offset": 0, 00:26:05.178 "data_size": 65536 00:26:05.178 } 00:26:05.178 ] 00:26:05.178 }' 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.178 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.437 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:26:05.437 "name": "raid_bdev1", 00:26:05.437 "uuid": "3a59553f-c3b3-4294-9214-afd4bd2dcfa0", 00:26:05.437 "strip_size_kb": 0, 00:26:05.437 "state": "online", 00:26:05.437 "raid_level": "raid1", 00:26:05.437 "superblock": false, 00:26:05.437 "num_base_bdevs": 2, 00:26:05.437 "num_base_bdevs_discovered": 2, 00:26:05.437 "num_base_bdevs_operational": 2, 00:26:05.437 "base_bdevs_list": [ 00:26:05.437 { 00:26:05.437 "name": "spare", 00:26:05.437 "uuid": "0f963509-7bd4-52c6-bf64-4b4f86a85b5f", 00:26:05.437 "is_configured": true, 00:26:05.437 "data_offset": 0, 00:26:05.437 "data_size": 65536 00:26:05.437 }, 00:26:05.437 { 00:26:05.437 "name": "BaseBdev2", 00:26:05.437 "uuid": "abb9816f-e89c-5587-b5c1-a5ca21526513", 00:26:05.437 "is_configured": true, 00:26:05.437 "data_offset": 0, 00:26:05.437 "data_size": 65536 00:26:05.437 } 00:26:05.437 ] 00:26:05.437 }' 00:26:05.437 06:42:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:05.437 06:42:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:06.005 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:06.264 [2024-07-25 06:42:19.576700] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:06.264 [2024-07-25 06:42:19.576728] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:06.264 00:26:06.264 Latency(us) 00:26:06.264 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:06.264 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:06.264 raid_bdev1 : 11.51 97.77 293.31 0.00 0.00 14007.07 271.97 109890.76 00:26:06.264 =================================================================================================================== 00:26:06.264 Total : 97.77 293.31 0.00 0.00 14007.07 271.97 109890.76 00:26:06.264 [2024-07-25 06:42:19.644525] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:06.264 [2024-07-25 06:42:19.644550] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:06.264 [2024-07-25 06:42:19.644614] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:06.264 [2024-07-25 06:42:19.644625] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fadd20 name raid_bdev1, state offline 00:26:06.264 0 00:26:06.264 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.264 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:06.522 
06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:06.522 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:06.523 06:42:19 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:06.781 /dev/nbd0 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:06.781 1+0 records in 00:26:06.781 1+0 records out 00:26:06.781 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024898 s, 16.5 MB/s 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:06.781 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:26:07.040 /dev/nbd1 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:07.040 1+0 records in 00:26:07.040 1+0 records out 00:26:07.040 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260589 s, 15.7 MB/s 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd1') 00:26:07.040 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:07.041 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:07.041 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:07.041 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:07.300 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:07.561 06:42:20 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1231251 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1231251 ']' 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1231251 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1231251 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1231251' 00:26:07.561 killing process with pid 1231251 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1231251 00:26:07.561 Received shutdown signal, test time was about 12.925044 seconds 00:26:07.561 00:26:07.561 Latency(us) 00:26:07.561 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:07.561 =================================================================================================================== 00:26:07.561 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:07.561 [2024-07-25 06:42:21.062888] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:07.561 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1231251 00:26:07.561 [2024-07-25 06:42:21.081282] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:26:07.852 00:26:07.852 real 0m17.324s 00:26:07.852 user 0m26.149s 00:26:07.852 sys 0m2.707s 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:07.852 ************************************ 00:26:07.852 END TEST raid_rebuild_test_io 00:26:07.852 ************************************ 00:26:07.852 06:42:21 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:26:07.852 06:42:21 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:07.852 06:42:21 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:07.852 06:42:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:07.852 ************************************ 00:26:07.852 START TEST raid_rebuild_test_sb_io 00:26:07.852 ************************************ 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true true true 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:07.852 
06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1234690 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1234690 /var/tmp/spdk-raid.sock 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1234690 ']' 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:07.852 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:07.852 06:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:08.112 [2024-07-25 06:42:21.418698] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:26:08.112 [2024-07-25 06:42:21.418756] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1234690 ] 00:26:08.112 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:08.112 Zero copy mechanism will not be used. 
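(Aside: the bdevperf launch traced above reduces to roughly the following shell sketch. It is illustrative only: the variable names and the socket-wait loop are assumptions, while the binary path and flags are the ones already visible in the trace.)

rpc_sock=/var/tmp/spdk-raid.sock
bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
# Start bdevperf against the raid bdev under test with the background random
# read/write workload used here (-t 60 -w randrw -M 50 -o 3M -q 2), bdev_raid
# debug logging enabled, and -z so the workload only starts once a later RPC
# (perform_tests via bdevperf.py) kicks it off.
"$bdevperf" -r "$rpc_sock" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
# Block until the application has created its RPC UNIX socket before issuing
# any rpc.py calls (the real harness does this via waitforlisten).
while ! [ -S "$rpc_sock" ]; do sleep 0.1; done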
00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.0 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.1 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.2 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.3 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.4 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.5 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.6 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:01.7 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.0 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.1 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.2 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.3 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.4 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.5 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.6 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3d:02.7 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.0 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.1 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.2 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.3 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.4 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.5 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.6 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:01.7 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.0 cannot be used 00:26:08.112 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.1 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.2 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.3 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.4 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.5 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.6 cannot be used 00:26:08.112 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:08.112 EAL: Requested device 0000:3f:02.7 cannot be used 00:26:08.112 [2024-07-25 06:42:21.554314] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:08.112 [2024-07-25 06:42:21.599841] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:08.112 [2024-07-25 06:42:21.661435] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:08.112 [2024-07-25 06:42:21.661472] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:09.047 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:09.047 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:26:09.047 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:09.048 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:09.048 BaseBdev1_malloc 00:26:09.048 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:09.306 [2024-07-25 06:42:22.752516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:09.306 [2024-07-25 06:42:22.752558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:09.306 [2024-07-25 06:42:22.752579] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29477b0 00:26:09.306 [2024-07-25 06:42:22.752592] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:09.306 [2024-07-25 06:42:22.754075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:09.306 [2024-07-25 06:42:22.754103] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:09.306 BaseBdev1 00:26:09.306 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:09.306 06:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:09.564 BaseBdev2_malloc 00:26:09.564 06:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev2_malloc -p BaseBdev2 00:26:09.822 [2024-07-25 06:42:23.218051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:09.822 [2024-07-25 06:42:23.218092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:09.822 [2024-07-25 06:42:23.218114] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27958f0 00:26:09.822 [2024-07-25 06:42:23.218125] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:09.822 [2024-07-25 06:42:23.219468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:09.822 [2024-07-25 06:42:23.219495] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:09.822 BaseBdev2 00:26:09.822 06:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:10.080 spare_malloc 00:26:10.080 06:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:10.338 spare_delay 00:26:10.338 06:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:10.338 [2024-07-25 06:42:23.887976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:10.338 [2024-07-25 06:42:23.888014] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:10.339 [2024-07-25 06:42:23.888032] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278cc10 00:26:10.339 [2024-07-25 06:42:23.888043] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:10.339 [2024-07-25 06:42:23.889421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:10.339 [2024-07-25 06:42:23.889449] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:10.339 spare 00:26:10.597 06:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:10.597 [2024-07-25 06:42:24.112589] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:10.597 [2024-07-25 06:42:24.113713] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:10.597 [2024-07-25 06:42:24.113858] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x278dd20 00:26:10.597 [2024-07-25 06:42:24.113871] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:10.597 [2024-07-25 06:42:24.114047] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2939e20 00:26:10.597 [2024-07-25 06:42:24.114181] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x278dd20 00:26:10.597 [2024-07-25 06:42:24.114191] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x278dd20 00:26:10.597 [2024-07-25 06:42:24.114278] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:10.597 06:42:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.597 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.856 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:10.856 "name": "raid_bdev1", 00:26:10.856 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:10.856 "strip_size_kb": 0, 00:26:10.856 "state": "online", 00:26:10.856 "raid_level": "raid1", 00:26:10.856 "superblock": true, 00:26:10.856 "num_base_bdevs": 2, 00:26:10.856 "num_base_bdevs_discovered": 2, 00:26:10.856 "num_base_bdevs_operational": 2, 00:26:10.856 "base_bdevs_list": [ 00:26:10.856 { 00:26:10.856 "name": "BaseBdev1", 00:26:10.856 "uuid": "290f923d-866d-5cd0-9ab8-90f017d76959", 00:26:10.856 "is_configured": true, 00:26:10.856 "data_offset": 2048, 00:26:10.856 "data_size": 63488 00:26:10.856 }, 00:26:10.856 { 00:26:10.856 "name": "BaseBdev2", 00:26:10.856 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:10.856 "is_configured": true, 00:26:10.856 "data_offset": 2048, 00:26:10.856 "data_size": 63488 00:26:10.856 } 00:26:10.856 ] 00:26:10.856 }' 00:26:10.856 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:10.856 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:11.422 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:11.422 06:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:26:11.681 [2024-07-25 06:42:25.131472] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:11.681 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:26:11.681 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.681 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:11.939 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@634 -- # data_offset=2048 00:26:11.939 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:26:11.939 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:11.939 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:11.939 [2024-07-25 06:42:25.482114] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278bc00 00:26:11.939 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:11.939 Zero copy mechanism will not be used. 00:26:11.939 Running I/O for 60 seconds... 00:26:12.198 [2024-07-25 06:42:25.587196] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:12.198 [2024-07-25 06:42:25.594704] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x278bc00 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.198 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.457 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:12.457 "name": "raid_bdev1", 00:26:12.457 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:12.457 "strip_size_kb": 0, 00:26:12.457 "state": "online", 00:26:12.457 "raid_level": "raid1", 00:26:12.457 "superblock": true, 00:26:12.457 "num_base_bdevs": 2, 00:26:12.457 "num_base_bdevs_discovered": 1, 00:26:12.457 "num_base_bdevs_operational": 1, 00:26:12.457 "base_bdevs_list": [ 00:26:12.457 { 00:26:12.457 "name": null, 00:26:12.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.457 "is_configured": false, 00:26:12.457 "data_offset": 2048, 00:26:12.457 "data_size": 63488 00:26:12.457 }, 00:26:12.457 { 00:26:12.457 "name": "BaseBdev2", 00:26:12.457 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:12.457 "is_configured": true, 00:26:12.457 "data_offset": 2048, 00:26:12.457 "data_size": 63488 00:26:12.457 } 00:26:12.457 ] 00:26:12.457 }' 00:26:12.457 
06:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:12.457 06:42:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:13.025 06:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:13.284 [2024-07-25 06:42:26.660030] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:13.284 [2024-07-25 06:42:26.713627] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x293cbf0 00:26:13.284 [2024-07-25 06:42:26.715885] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:13.284 06:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:13.284 [2024-07-25 06:42:26.832835] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:13.284 [2024-07-25 06:42:26.833128] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:13.543 [2024-07-25 06:42:27.057618] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:13.543 [2024-07-25 06:42:27.057734] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:14.110 [2024-07-25 06:42:27.403783] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:14.110 [2024-07-25 06:42:27.621236] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.368 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.368 [2024-07-25 06:42:27.847662] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:14.627 [2024-07-25 06:42:27.956233] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:14.627 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.627 "name": "raid_bdev1", 00:26:14.627 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:14.627 "strip_size_kb": 0, 00:26:14.627 "state": "online", 00:26:14.627 "raid_level": "raid1", 00:26:14.627 "superblock": true, 00:26:14.627 "num_base_bdevs": 2, 00:26:14.627 "num_base_bdevs_discovered": 2, 00:26:14.627 "num_base_bdevs_operational": 2, 00:26:14.627 
"process": { 00:26:14.627 "type": "rebuild", 00:26:14.627 "target": "spare", 00:26:14.627 "progress": { 00:26:14.627 "blocks": 14336, 00:26:14.627 "percent": 22 00:26:14.627 } 00:26:14.628 }, 00:26:14.628 "base_bdevs_list": [ 00:26:14.628 { 00:26:14.628 "name": "spare", 00:26:14.628 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:14.628 "is_configured": true, 00:26:14.628 "data_offset": 2048, 00:26:14.628 "data_size": 63488 00:26:14.628 }, 00:26:14.628 { 00:26:14.628 "name": "BaseBdev2", 00:26:14.628 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:14.628 "is_configured": true, 00:26:14.628 "data_offset": 2048, 00:26:14.628 "data_size": 63488 00:26:14.628 } 00:26:14.628 ] 00:26:14.628 }' 00:26:14.628 06:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.628 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:14.628 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.628 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:14.628 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:14.887 [2024-07-25 06:42:28.259507] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:14.887 [2024-07-25 06:42:28.291747] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:14.887 [2024-07-25 06:42:28.292106] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:14.887 [2024-07-25 06:42:28.393132] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:14.887 [2024-07-25 06:42:28.394530] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:14.887 [2024-07-25 06:42:28.394553] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:14.887 [2024-07-25 06:42:28.394561] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:14.887 [2024-07-25 06:42:28.415113] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x278bc00 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.146 06:42:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:15.146 "name": "raid_bdev1", 00:26:15.146 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:15.146 "strip_size_kb": 0, 00:26:15.146 "state": "online", 00:26:15.146 "raid_level": "raid1", 00:26:15.146 "superblock": true, 00:26:15.146 "num_base_bdevs": 2, 00:26:15.146 "num_base_bdevs_discovered": 1, 00:26:15.146 "num_base_bdevs_operational": 1, 00:26:15.146 "base_bdevs_list": [ 00:26:15.146 { 00:26:15.146 "name": null, 00:26:15.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.146 "is_configured": false, 00:26:15.146 "data_offset": 2048, 00:26:15.146 "data_size": 63488 00:26:15.146 }, 00:26:15.146 { 00:26:15.146 "name": "BaseBdev2", 00:26:15.146 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:15.146 "is_configured": true, 00:26:15.146 "data_offset": 2048, 00:26:15.146 "data_size": 63488 00:26:15.146 } 00:26:15.146 ] 00:26:15.146 }' 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:15.146 06:42:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:16.083 "name": "raid_bdev1", 00:26:16.083 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:16.083 "strip_size_kb": 0, 00:26:16.083 "state": "online", 00:26:16.083 "raid_level": "raid1", 00:26:16.083 "superblock": true, 00:26:16.083 "num_base_bdevs": 2, 00:26:16.083 "num_base_bdevs_discovered": 1, 00:26:16.083 "num_base_bdevs_operational": 1, 00:26:16.083 "base_bdevs_list": [ 00:26:16.083 { 00:26:16.083 "name": null, 00:26:16.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.083 "is_configured": false, 00:26:16.083 "data_offset": 2048, 00:26:16.083 "data_size": 63488 00:26:16.083 }, 00:26:16.083 { 00:26:16.083 "name": "BaseBdev2", 00:26:16.083 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:16.083 "is_configured": true, 00:26:16.083 "data_offset": 2048, 00:26:16.083 "data_size": 63488 00:26:16.083 } 00:26:16.083 ] 00:26:16.083 }' 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:16.083 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:16.342 [2024-07-25 06:42:29.822689] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:16.342 06:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:26:16.342 [2024-07-25 06:42:29.875935] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x293cbf0 00:26:16.342 [2024-07-25 06:42:29.877271] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:16.600 [2024-07-25 06:42:30.013413] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:16.600 [2024-07-25 06:42:30.139190] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:16.600 [2024-07-25 06:42:30.139376] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:17.168 [2024-07-25 06:42:30.524911] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:17.426 [2024-07-25 06:42:30.760605] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:17.426 [2024-07-25 06:42:30.870133] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.426 06:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:17.685 "name": "raid_bdev1", 00:26:17.685 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:17.685 "strip_size_kb": 0, 00:26:17.685 "state": "online", 00:26:17.685 "raid_level": "raid1", 00:26:17.685 "superblock": true, 00:26:17.685 "num_base_bdevs": 2, 00:26:17.685 "num_base_bdevs_discovered": 2, 00:26:17.685 "num_base_bdevs_operational": 2, 00:26:17.685 "process": { 00:26:17.685 "type": "rebuild", 00:26:17.685 "target": "spare", 00:26:17.685 "progress": { 00:26:17.685 "blocks": 18432, 00:26:17.685 
"percent": 29 00:26:17.685 } 00:26:17.685 }, 00:26:17.685 "base_bdevs_list": [ 00:26:17.685 { 00:26:17.685 "name": "spare", 00:26:17.685 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:17.685 "is_configured": true, 00:26:17.685 "data_offset": 2048, 00:26:17.685 "data_size": 63488 00:26:17.685 }, 00:26:17.685 { 00:26:17.685 "name": "BaseBdev2", 00:26:17.685 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:17.685 "is_configured": true, 00:26:17.685 "data_offset": 2048, 00:26:17.685 "data_size": 63488 00:26:17.685 } 00:26:17.685 ] 00:26:17.685 }' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:26:17.685 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=800 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.685 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.944 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:17.944 "name": "raid_bdev1", 00:26:17.944 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:17.944 "strip_size_kb": 0, 00:26:17.944 "state": "online", 00:26:17.944 "raid_level": "raid1", 00:26:17.944 "superblock": true, 00:26:17.944 "num_base_bdevs": 2, 00:26:17.944 "num_base_bdevs_discovered": 2, 00:26:17.944 "num_base_bdevs_operational": 2, 00:26:17.944 "process": { 00:26:17.944 "type": "rebuild", 00:26:17.944 "target": "spare", 00:26:17.944 "progress": { 00:26:17.944 "blocks": 24576, 00:26:17.944 "percent": 38 00:26:17.944 } 00:26:17.944 }, 00:26:17.944 "base_bdevs_list": [ 00:26:17.944 { 00:26:17.944 "name": 
"spare", 00:26:17.944 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:17.944 "is_configured": true, 00:26:17.944 "data_offset": 2048, 00:26:17.944 "data_size": 63488 00:26:17.944 }, 00:26:17.944 { 00:26:17.944 "name": "BaseBdev2", 00:26:17.944 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:17.944 "is_configured": true, 00:26:17.944 "data_offset": 2048, 00:26:17.944 "data_size": 63488 00:26:17.944 } 00:26:17.944 ] 00:26:17.944 }' 00:26:17.944 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:17.944 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:17.944 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.202 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:18.202 06:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:18.202 [2024-07-25 06:42:31.580180] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:18.770 [2024-07-25 06:42:32.247564] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.029 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.029 [2024-07-25 06:42:32.583911] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:26:19.288 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:19.288 "name": "raid_bdev1", 00:26:19.288 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:19.288 "strip_size_kb": 0, 00:26:19.288 "state": "online", 00:26:19.288 "raid_level": "raid1", 00:26:19.288 "superblock": true, 00:26:19.288 "num_base_bdevs": 2, 00:26:19.288 "num_base_bdevs_discovered": 2, 00:26:19.288 "num_base_bdevs_operational": 2, 00:26:19.288 "process": { 00:26:19.288 "type": "rebuild", 00:26:19.288 "target": "spare", 00:26:19.288 "progress": { 00:26:19.289 "blocks": 45056, 00:26:19.289 "percent": 70 00:26:19.289 } 00:26:19.289 }, 00:26:19.289 "base_bdevs_list": [ 00:26:19.289 { 00:26:19.289 "name": "spare", 00:26:19.289 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:19.289 "is_configured": true, 00:26:19.289 "data_offset": 2048, 00:26:19.289 "data_size": 63488 00:26:19.289 }, 00:26:19.289 { 00:26:19.289 "name": "BaseBdev2", 00:26:19.289 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 
00:26:19.289 "is_configured": true, 00:26:19.289 "data_offset": 2048, 00:26:19.289 "data_size": 63488 00:26:19.289 } 00:26:19.289 ] 00:26:19.289 }' 00:26:19.289 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:19.289 [2024-07-25 06:42:32.785942] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:19.289 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:19.289 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:19.289 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:19.289 06:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:19.548 [2024-07-25 06:42:33.020704] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:19.807 [2024-07-25 06:42:33.229549] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:19.807 [2024-07-25 06:42:33.229666] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:20.065 [2024-07-25 06:42:33.466175] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:20.322 [2024-07-25 06:42:33.691036] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:20.322 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.323 06:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.581 [2024-07-25 06:42:34.026020] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:20.581 06:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:20.581 "name": "raid_bdev1", 00:26:20.581 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:20.581 "strip_size_kb": 0, 00:26:20.581 "state": "online", 00:26:20.581 "raid_level": "raid1", 00:26:20.581 "superblock": true, 00:26:20.581 "num_base_bdevs": 2, 00:26:20.581 "num_base_bdevs_discovered": 2, 00:26:20.581 "num_base_bdevs_operational": 2, 00:26:20.581 "process": { 00:26:20.581 "type": "rebuild", 00:26:20.581 "target": "spare", 00:26:20.581 "progress": { 00:26:20.581 "blocks": 63488, 00:26:20.581 "percent": 100 00:26:20.581 } 00:26:20.581 }, 00:26:20.581 "base_bdevs_list": [ 00:26:20.581 
{ 00:26:20.581 "name": "spare", 00:26:20.581 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:20.581 "is_configured": true, 00:26:20.581 "data_offset": 2048, 00:26:20.581 "data_size": 63488 00:26:20.581 }, 00:26:20.581 { 00:26:20.581 "name": "BaseBdev2", 00:26:20.581 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:20.581 "is_configured": true, 00:26:20.581 "data_offset": 2048, 00:26:20.581 "data_size": 63488 00:26:20.581 } 00:26:20.581 ] 00:26:20.581 }' 00:26:20.581 06:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:20.581 [2024-07-25 06:42:34.126356] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:20.581 [2024-07-25 06:42:34.127567] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:20.581 06:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:20.581 06:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:20.847 06:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:20.847 06:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.796 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:22.055 "name": "raid_bdev1", 00:26:22.055 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:22.055 "strip_size_kb": 0, 00:26:22.055 "state": "online", 00:26:22.055 "raid_level": "raid1", 00:26:22.055 "superblock": true, 00:26:22.055 "num_base_bdevs": 2, 00:26:22.055 "num_base_bdevs_discovered": 2, 00:26:22.055 "num_base_bdevs_operational": 2, 00:26:22.055 "base_bdevs_list": [ 00:26:22.055 { 00:26:22.055 "name": "spare", 00:26:22.055 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:22.055 "is_configured": true, 00:26:22.055 "data_offset": 2048, 00:26:22.055 "data_size": 63488 00:26:22.055 }, 00:26:22.055 { 00:26:22.055 "name": "BaseBdev2", 00:26:22.055 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:22.055 "is_configured": true, 00:26:22.055 "data_offset": 2048, 00:26:22.055 "data_size": 63488 00:26:22.055 } 00:26:22.055 ] 00:26:22.055 }' 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:22.055 06:42:35 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.055 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:22.315 "name": "raid_bdev1", 00:26:22.315 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:22.315 "strip_size_kb": 0, 00:26:22.315 "state": "online", 00:26:22.315 "raid_level": "raid1", 00:26:22.315 "superblock": true, 00:26:22.315 "num_base_bdevs": 2, 00:26:22.315 "num_base_bdevs_discovered": 2, 00:26:22.315 "num_base_bdevs_operational": 2, 00:26:22.315 "base_bdevs_list": [ 00:26:22.315 { 00:26:22.315 "name": "spare", 00:26:22.315 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:22.315 "is_configured": true, 00:26:22.315 "data_offset": 2048, 00:26:22.315 "data_size": 63488 00:26:22.315 }, 00:26:22.315 { 00:26:22.315 "name": "BaseBdev2", 00:26:22.315 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:22.315 "is_configured": true, 00:26:22.315 "data_offset": 2048, 00:26:22.315 "data_size": 63488 00:26:22.315 } 00:26:22.315 ] 00:26:22.315 }' 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
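(Aside: the local declarations above and the rpc.py/jq calls that follow amount to roughly this check; a condensed, illustrative sketch rather than the actual verify_raid_bdev_state body. The variable names are assumptions, while the RPC method, jq filter, and expected field values are taken from the trace.)

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
# Dump all raid bdevs and keep only the one under test.
info=$("$rpc" -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# Compare the fields the test asserts on against the expected values
# (state online, level raid1, 2 base bdevs discovered and operational).
[ "$(echo "$info" | jq -r '.state')" = online ]
[ "$(echo "$info" | jq -r '.raid_level')" = raid1 ]
[ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 2 ]
[ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq 2 ]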
00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.315 06:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.574 06:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.574 "name": "raid_bdev1", 00:26:22.574 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:22.574 "strip_size_kb": 0, 00:26:22.574 "state": "online", 00:26:22.574 "raid_level": "raid1", 00:26:22.574 "superblock": true, 00:26:22.574 "num_base_bdevs": 2, 00:26:22.574 "num_base_bdevs_discovered": 2, 00:26:22.574 "num_base_bdevs_operational": 2, 00:26:22.574 "base_bdevs_list": [ 00:26:22.574 { 00:26:22.574 "name": "spare", 00:26:22.574 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:22.574 "is_configured": true, 00:26:22.574 "data_offset": 2048, 00:26:22.574 "data_size": 63488 00:26:22.574 }, 00:26:22.574 { 00:26:22.574 "name": "BaseBdev2", 00:26:22.574 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:22.574 "is_configured": true, 00:26:22.574 "data_offset": 2048, 00:26:22.574 "data_size": 63488 00:26:22.574 } 00:26:22.574 ] 00:26:22.574 }' 00:26:22.574 06:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.574 06:42:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:23.142 06:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:23.402 [2024-07-25 06:42:36.807271] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:23.402 [2024-07-25 06:42:36.807300] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:23.402 00:26:23.402 Latency(us) 00:26:23.402 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:23.402 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:23.402 raid_bdev1 : 11.36 109.11 327.33 0.00 0.00 12592.72 268.70 116601.65 00:26:23.402 =================================================================================================================== 00:26:23.402 Total : 109.11 327.33 0.00 0.00 12592.72 268.70 116601.65 00:26:23.402 [2024-07-25 06:42:36.871063] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:23.402 [2024-07-25 06:42:36.871087] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:23.402 [2024-07-25 06:42:36.871160] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:23.402 [2024-07-25 06:42:36.871171] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x278dd20 name raid_bdev1, state offline 00:26:23.402 0 00:26:23.402 06:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.402 06:42:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:26:23.661 06:42:37 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:23.661 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:23.920 /dev/nbd0 00:26:23.920 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:23.920 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:23.920 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:23.920 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:23.921 1+0 records in 00:26:23.921 1+0 records out 00:26:23.921 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257594 s, 15.9 MB/s 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@889 -- # return 0 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:23.921 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:26:24.180 /dev/nbd1 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:24.180 1+0 records in 00:26:24.180 1+0 records out 00:26:24.180 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233772 s, 17.5 MB/s 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.180 
06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:24.180 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:24.440 06:42:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:24.699 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:24.699 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:24.699 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:24.699 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:24.699 06:42:38 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:24.699 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:24.958 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:24.958 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:24.958 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:26:24.958 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:24.958 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:25.217 [2024-07-25 06:42:38.694510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:25.217 [2024-07-25 06:42:38.694549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:25.217 [2024-07-25 06:42:38.694568] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x293c9f0 00:26:25.217 [2024-07-25 06:42:38.694580] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:25.217 [2024-07-25 06:42:38.696132] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:25.217 [2024-07-25 06:42:38.696165] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:25.217 [2024-07-25 06:42:38.696230] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:25.217 [2024-07-25 06:42:38.696253] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:25.217 [2024-07-25 06:42:38.696340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:25.217 spare 00:26:25.217 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:25.217 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.218 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.477 [2024-07-25 06:42:38.796644] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x278f220 00:26:25.477 [2024-07-25 06:42:38.796657] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:25.477 [2024-07-25 06:42:38.796824] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x293bcf0 00:26:25.477 [2024-07-25 06:42:38.796953] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x278f220 00:26:25.477 [2024-07-25 06:42:38.796963] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x278f220 00:26:25.477 [2024-07-25 06:42:38.797054] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:25.477 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.477 "name": "raid_bdev1", 00:26:25.477 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:25.477 "strip_size_kb": 0, 00:26:25.477 "state": "online", 00:26:25.477 "raid_level": "raid1", 00:26:25.477 "superblock": true, 00:26:25.477 "num_base_bdevs": 2, 00:26:25.477 "num_base_bdevs_discovered": 2, 00:26:25.477 "num_base_bdevs_operational": 2, 00:26:25.477 "base_bdevs_list": [ 00:26:25.477 { 00:26:25.477 "name": "spare", 00:26:25.477 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:25.477 "is_configured": true, 00:26:25.477 "data_offset": 2048, 00:26:25.477 "data_size": 63488 00:26:25.477 }, 00:26:25.477 { 00:26:25.477 "name": "BaseBdev2", 00:26:25.477 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:25.477 "is_configured": true, 00:26:25.477 "data_offset": 2048, 00:26:25.477 "data_size": 63488 00:26:25.477 } 00:26:25.477 ] 00:26:25.477 }' 00:26:25.477 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.477 06:42:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.045 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:26.305 "name": "raid_bdev1", 00:26:26.305 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:26.305 "strip_size_kb": 0, 00:26:26.305 "state": "online", 00:26:26.305 "raid_level": "raid1", 00:26:26.305 "superblock": true, 00:26:26.305 "num_base_bdevs": 2, 00:26:26.305 "num_base_bdevs_discovered": 2, 00:26:26.305 "num_base_bdevs_operational": 2, 00:26:26.305 "base_bdevs_list": [ 00:26:26.305 { 00:26:26.305 "name": "spare", 00:26:26.305 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:26.305 "is_configured": true, 00:26:26.305 "data_offset": 2048, 00:26:26.305 "data_size": 63488 00:26:26.305 }, 00:26:26.305 { 00:26:26.305 "name": 
"BaseBdev2", 00:26:26.305 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:26.305 "is_configured": true, 00:26:26.305 "data_offset": 2048, 00:26:26.305 "data_size": 63488 00:26:26.305 } 00:26:26.305 ] 00:26:26.305 }' 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:26.305 06:42:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.564 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:26:26.564 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:26.823 [2024-07-25 06:42:40.258893] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.823 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.082 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:27.082 "name": "raid_bdev1", 00:26:27.082 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:27.082 "strip_size_kb": 0, 00:26:27.082 "state": "online", 00:26:27.082 "raid_level": "raid1", 00:26:27.082 "superblock": true, 00:26:27.082 "num_base_bdevs": 2, 00:26:27.082 "num_base_bdevs_discovered": 1, 00:26:27.082 "num_base_bdevs_operational": 1, 00:26:27.082 "base_bdevs_list": [ 00:26:27.082 { 00:26:27.082 "name": null, 00:26:27.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.082 "is_configured": false, 00:26:27.082 "data_offset": 2048, 
00:26:27.082 "data_size": 63488 00:26:27.082 }, 00:26:27.082 { 00:26:27.082 "name": "BaseBdev2", 00:26:27.082 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:27.082 "is_configured": true, 00:26:27.082 "data_offset": 2048, 00:26:27.082 "data_size": 63488 00:26:27.082 } 00:26:27.082 ] 00:26:27.082 }' 00:26:27.082 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.082 06:42:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:27.649 06:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:27.907 [2024-07-25 06:42:41.297905] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:27.907 [2024-07-25 06:42:41.298032] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:27.907 [2024-07-25 06:42:41.298047] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:27.907 [2024-07-25 06:42:41.298075] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:27.907 [2024-07-25 06:42:41.303037] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278c0e0 00:26:27.907 [2024-07-25 06:42:41.305050] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:27.907 06:42:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.843 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.102 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.102 "name": "raid_bdev1", 00:26:29.102 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:29.102 "strip_size_kb": 0, 00:26:29.102 "state": "online", 00:26:29.102 "raid_level": "raid1", 00:26:29.102 "superblock": true, 00:26:29.102 "num_base_bdevs": 2, 00:26:29.102 "num_base_bdevs_discovered": 2, 00:26:29.102 "num_base_bdevs_operational": 2, 00:26:29.102 "process": { 00:26:29.102 "type": "rebuild", 00:26:29.102 "target": "spare", 00:26:29.102 "progress": { 00:26:29.102 "blocks": 24576, 00:26:29.102 "percent": 38 00:26:29.102 } 00:26:29.102 }, 00:26:29.102 "base_bdevs_list": [ 00:26:29.102 { 00:26:29.102 "name": "spare", 00:26:29.102 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:29.103 "is_configured": true, 00:26:29.103 "data_offset": 2048, 00:26:29.103 "data_size": 63488 00:26:29.103 }, 00:26:29.103 { 00:26:29.103 "name": "BaseBdev2", 00:26:29.103 "uuid": 
"5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:29.103 "is_configured": true, 00:26:29.103 "data_offset": 2048, 00:26:29.103 "data_size": 63488 00:26:29.103 } 00:26:29.103 ] 00:26:29.103 }' 00:26:29.103 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:29.103 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:29.103 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:29.103 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:29.103 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:29.362 [2024-07-25 06:42:42.860295] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:29.362 [2024-07-25 06:42:42.916835] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:29.362 [2024-07-25 06:42:42.916880] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:29.362 [2024-07-25 06:42:42.916895] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:29.362 [2024-07-25 06:42:42.916903] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.621 06:42:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.881 06:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.881 "name": "raid_bdev1", 00:26:29.881 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:29.881 "strip_size_kb": 0, 00:26:29.881 "state": "online", 00:26:29.881 "raid_level": "raid1", 00:26:29.881 "superblock": true, 00:26:29.881 "num_base_bdevs": 2, 00:26:29.881 "num_base_bdevs_discovered": 1, 00:26:29.881 "num_base_bdevs_operational": 1, 00:26:29.881 "base_bdevs_list": [ 00:26:29.881 { 00:26:29.881 "name": null, 00:26:29.881 "uuid": "00000000-0000-0000-0000-000000000000", 
00:26:29.881 "is_configured": false, 00:26:29.881 "data_offset": 2048, 00:26:29.881 "data_size": 63488 00:26:29.881 }, 00:26:29.881 { 00:26:29.881 "name": "BaseBdev2", 00:26:29.881 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:29.881 "is_configured": true, 00:26:29.881 "data_offset": 2048, 00:26:29.881 "data_size": 63488 00:26:29.881 } 00:26:29.881 ] 00:26:29.881 }' 00:26:29.881 06:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.881 06:42:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:30.449 06:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:30.449 [2024-07-25 06:42:43.971900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:30.449 [2024-07-25 06:42:43.971945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:30.449 [2024-07-25 06:42:43.971964] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27bc0b0 00:26:30.449 [2024-07-25 06:42:43.971977] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:30.449 [2024-07-25 06:42:43.972320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:30.449 [2024-07-25 06:42:43.972337] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:30.449 [2024-07-25 06:42:43.972410] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:30.449 [2024-07-25 06:42:43.972421] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:30.449 [2024-07-25 06:42:43.972431] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
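For reference, the sequence traced here is: bdev_passthru_create rebuilds the delayed "spare" passthru bdev, raid examine finds its older superblock (seq_number 4 vs. the array's 5) and re-adds it to raid_bdev1, and a rebuild is kicked off in the messages that follow. A rough, illustrative way to drive and watch the same step by hand, not part of the captured output, would be the sketch below; the rpc/sock shorthand variables and the polling loop are assumptions, while the RPC names, socket path and jq filters are the ones this test itself uses.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Recreate the delayed passthru bdev; superblock examine should re-add it and start a rebuild.
$rpc -s $sock bdev_passthru_create -b spare_delay -p spare

# Poll the raid bdev until the rebuild process is gone.
while :; do
    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [ "$(echo "$info" | jq -r '.process.type // "none"')" = "none" ] && break
    echo "rebuild of $(echo "$info" | jq -r '.process.target // "none"') at $(echo "$info" | jq -r '.process.progress.percent')%"
    sleep 1
done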
00:26:30.449 [2024-07-25 06:42:43.972449] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:30.449 [2024-07-25 06:42:43.977397] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278c0e0 00:26:30.449 spare 00:26:30.449 [2024-07-25 06:42:43.978724] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:30.449 06:42:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:26:31.828 06:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:31.828 06:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:31.828 06:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:31.828 06:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:31.828 06:42:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:31.828 "name": "raid_bdev1", 00:26:31.828 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:31.828 "strip_size_kb": 0, 00:26:31.828 "state": "online", 00:26:31.828 "raid_level": "raid1", 00:26:31.828 "superblock": true, 00:26:31.828 "num_base_bdevs": 2, 00:26:31.828 "num_base_bdevs_discovered": 2, 00:26:31.828 "num_base_bdevs_operational": 2, 00:26:31.828 "process": { 00:26:31.828 "type": "rebuild", 00:26:31.828 "target": "spare", 00:26:31.828 "progress": { 00:26:31.828 "blocks": 24576, 00:26:31.828 "percent": 38 00:26:31.828 } 00:26:31.828 }, 00:26:31.828 "base_bdevs_list": [ 00:26:31.828 { 00:26:31.828 "name": "spare", 00:26:31.828 "uuid": "2968ef5c-2ccc-528b-aa17-7dfb030b8d7c", 00:26:31.828 "is_configured": true, 00:26:31.828 "data_offset": 2048, 00:26:31.828 "data_size": 63488 00:26:31.828 }, 00:26:31.828 { 00:26:31.828 "name": "BaseBdev2", 00:26:31.828 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:31.828 "is_configured": true, 00:26:31.828 "data_offset": 2048, 00:26:31.828 "data_size": 63488 00:26:31.828 } 00:26:31.828 ] 00:26:31.828 }' 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:31.828 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:32.088 [2024-07-25 06:42:45.526322] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:32.088 [2024-07-25 06:42:45.590453] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:32.088 [2024-07-25 
06:42:45.590497] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:32.088 [2024-07-25 06:42:45.590511] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:32.088 [2024-07-25 06:42:45.590518] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.088 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.347 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:32.347 "name": "raid_bdev1", 00:26:32.347 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:32.347 "strip_size_kb": 0, 00:26:32.347 "state": "online", 00:26:32.347 "raid_level": "raid1", 00:26:32.347 "superblock": true, 00:26:32.347 "num_base_bdevs": 2, 00:26:32.347 "num_base_bdevs_discovered": 1, 00:26:32.347 "num_base_bdevs_operational": 1, 00:26:32.347 "base_bdevs_list": [ 00:26:32.347 { 00:26:32.347 "name": null, 00:26:32.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.347 "is_configured": false, 00:26:32.347 "data_offset": 2048, 00:26:32.347 "data_size": 63488 00:26:32.347 }, 00:26:32.347 { 00:26:32.347 "name": "BaseBdev2", 00:26:32.347 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:32.347 "is_configured": true, 00:26:32.347 "data_offset": 2048, 00:26:32.347 "data_size": 63488 00:26:32.347 } 00:26:32.347 ] 00:26:32.347 }' 00:26:32.347 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:32.347 06:42:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
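The verify_raid_bdev_state call traced above (bdev_raid.sh@783, expecting online / raid1 / strip size 0 / one operational base bdev after the spare was torn down mid-rebuild) boils down to fetching the raid bdev JSON once and comparing a handful of fields. A condensed stand-in for that check, using the same RPC socket and jq filter as the trace, could look like the following; the rpc/sock/info variables are shorthand introduced here, and the authoritative version is the helper in bdev_raid.sh referenced in the trace.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

# Field-by-field comparison, mirroring what the bdev_raid.sh helper asserts.
[ "$(echo "$info" | jq -r '.state')" = online ] || exit 1
[ "$(echo "$info" | jq -r '.raid_level')" = raid1 ] || exit 1
[ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 1 ] || exit 1
[ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq 1 ] || exit 1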
00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.915 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.173 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:33.173 "name": "raid_bdev1", 00:26:33.173 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:33.173 "strip_size_kb": 0, 00:26:33.174 "state": "online", 00:26:33.174 "raid_level": "raid1", 00:26:33.174 "superblock": true, 00:26:33.174 "num_base_bdevs": 2, 00:26:33.174 "num_base_bdevs_discovered": 1, 00:26:33.174 "num_base_bdevs_operational": 1, 00:26:33.174 "base_bdevs_list": [ 00:26:33.174 { 00:26:33.174 "name": null, 00:26:33.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.174 "is_configured": false, 00:26:33.174 "data_offset": 2048, 00:26:33.174 "data_size": 63488 00:26:33.174 }, 00:26:33.174 { 00:26:33.174 "name": "BaseBdev2", 00:26:33.174 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:33.174 "is_configured": true, 00:26:33.174 "data_offset": 2048, 00:26:33.174 "data_size": 63488 00:26:33.174 } 00:26:33.174 ] 00:26:33.174 }' 00:26:33.174 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:33.174 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:33.174 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:33.174 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:33.174 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:33.431 06:42:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:33.689 [2024-07-25 06:42:47.155363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:33.689 [2024-07-25 06:42:47.155405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.689 [2024-07-25 06:42:47.155424] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29470f0 00:26:33.689 [2024-07-25 06:42:47.155436] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.689 [2024-07-25 06:42:47.155742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.689 [2024-07-25 06:42:47.155764] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:33.689 [2024-07-25 06:42:47.155822] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:33.689 [2024-07-25 06:42:47.155832] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:33.689 [2024-07-25 06:42:47.155842] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:33.689 BaseBdev1 00:26:33.689 06:42:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:26:34.679 06:42:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.679 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.945 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.945 "name": "raid_bdev1", 00:26:34.945 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:34.945 "strip_size_kb": 0, 00:26:34.945 "state": "online", 00:26:34.945 "raid_level": "raid1", 00:26:34.945 "superblock": true, 00:26:34.945 "num_base_bdevs": 2, 00:26:34.945 "num_base_bdevs_discovered": 1, 00:26:34.945 "num_base_bdevs_operational": 1, 00:26:34.945 "base_bdevs_list": [ 00:26:34.945 { 00:26:34.945 "name": null, 00:26:34.945 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.945 "is_configured": false, 00:26:34.945 "data_offset": 2048, 00:26:34.945 "data_size": 63488 00:26:34.945 }, 00:26:34.945 { 00:26:34.945 "name": "BaseBdev2", 00:26:34.945 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:34.945 "is_configured": true, 00:26:34.945 "data_offset": 2048, 00:26:34.945 "data_size": 63488 00:26:34.945 } 00:26:34.945 ] 00:26:34.945 }' 00:26:34.945 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.945 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.514 06:42:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.774 06:42:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:35.774 "name": "raid_bdev1", 00:26:35.774 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:35.774 "strip_size_kb": 0, 00:26:35.774 "state": "online", 00:26:35.774 "raid_level": "raid1", 00:26:35.774 "superblock": true, 00:26:35.774 "num_base_bdevs": 2, 00:26:35.774 "num_base_bdevs_discovered": 1, 00:26:35.774 "num_base_bdevs_operational": 1, 00:26:35.774 "base_bdevs_list": [ 00:26:35.774 { 00:26:35.774 "name": null, 00:26:35.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.774 "is_configured": false, 00:26:35.774 "data_offset": 2048, 00:26:35.774 "data_size": 63488 00:26:35.774 }, 00:26:35.774 { 00:26:35.774 "name": "BaseBdev2", 00:26:35.774 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:35.774 "is_configured": true, 00:26:35.774 "data_offset": 2048, 00:26:35.774 "data_size": 63488 00:26:35.774 } 00:26:35.774 ] 00:26:35.774 }' 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:35.774 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:36.033 [2024-07-25 06:42:49.505868] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:36.033 [2024-07-25 
06:42:49.505972] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:36.033 [2024-07-25 06:42:49.505987] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:36.033 request: 00:26:36.033 { 00:26:36.033 "base_bdev": "BaseBdev1", 00:26:36.033 "raid_bdev": "raid_bdev1", 00:26:36.033 "method": "bdev_raid_add_base_bdev", 00:26:36.033 "req_id": 1 00:26:36.033 } 00:26:36.033 Got JSON-RPC error response 00:26:36.033 response: 00:26:36.033 { 00:26:36.033 "code": -22, 00:26:36.033 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:36.033 } 00:26:36.033 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:26:36.033 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:26:36.033 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:26:36.033 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:26:36.033 06:42:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.412 "name": "raid_bdev1", 00:26:37.412 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:37.412 "strip_size_kb": 0, 00:26:37.412 "state": "online", 00:26:37.412 "raid_level": "raid1", 00:26:37.412 "superblock": true, 00:26:37.412 "num_base_bdevs": 2, 00:26:37.412 "num_base_bdevs_discovered": 1, 00:26:37.412 "num_base_bdevs_operational": 1, 00:26:37.412 "base_bdevs_list": [ 00:26:37.412 { 00:26:37.412 "name": null, 00:26:37.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.412 "is_configured": false, 00:26:37.412 "data_offset": 2048, 00:26:37.412 "data_size": 63488 00:26:37.412 }, 00:26:37.412 { 00:26:37.412 "name": "BaseBdev2", 00:26:37.412 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:37.412 "is_configured": true, 
00:26:37.412 "data_offset": 2048, 00:26:37.412 "data_size": 63488 00:26:37.412 } 00:26:37.412 ] 00:26:37.412 }' 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.412 06:42:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.981 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.240 "name": "raid_bdev1", 00:26:38.240 "uuid": "0d002bfd-a143-45e3-8a08-97944b9766b6", 00:26:38.240 "strip_size_kb": 0, 00:26:38.240 "state": "online", 00:26:38.240 "raid_level": "raid1", 00:26:38.240 "superblock": true, 00:26:38.240 "num_base_bdevs": 2, 00:26:38.240 "num_base_bdevs_discovered": 1, 00:26:38.240 "num_base_bdevs_operational": 1, 00:26:38.240 "base_bdevs_list": [ 00:26:38.240 { 00:26:38.240 "name": null, 00:26:38.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.240 "is_configured": false, 00:26:38.240 "data_offset": 2048, 00:26:38.240 "data_size": 63488 00:26:38.240 }, 00:26:38.240 { 00:26:38.240 "name": "BaseBdev2", 00:26:38.240 "uuid": "5fa42158-e629-5fdd-b1b8-aa4fd2618917", 00:26:38.240 "is_configured": true, 00:26:38.240 "data_offset": 2048, 00:26:38.240 "data_size": 63488 00:26:38.240 } 00:26:38.240 ] 00:26:38.240 }' 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1234690 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1234690 ']' 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1234690 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1234690 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 
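The NOT-wrapped bdev_raid_add_base_bdev call a little earlier in this trace is the expected-failure leg of the test: BaseBdev1 was re-created on top of BaseBdev1_malloc, so its superblock (seq_number 1, older than the array's 5) no longer contains this bdev's uuid, and the RPC is rejected with JSON-RPC error -22 ("Failed to add base bdev to RAID bdev: Invalid argument"). Outside the test harness, the same assertion could be sketched as below; rpc/sock are shorthand introduced here, while the RPC name and arguments are the ones from the trace.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# The add is expected to fail; an unexpected success is itself an error.
if $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1; then
    echo "unexpected success: stale BaseBdev1 was added to raid_bdev1" >&2
    exit 1
fi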
00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1234690' 00:26:38.240 killing process with pid 1234690 00:26:38.240 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1234690 00:26:38.240 Received shutdown signal, test time was about 26.153537 seconds 00:26:38.240 00:26:38.240 Latency(us) 00:26:38.240 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:38.240 =================================================================================================================== 00:26:38.240 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:38.241 [2024-07-25 06:42:51.701460] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:38.241 [2024-07-25 06:42:51.701546] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:38.241 [2024-07-25 06:42:51.701591] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:38.241 [2024-07-25 06:42:51.701602] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x278f220 name raid_bdev1, state offline 00:26:38.241 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1234690 00:26:38.241 [2024-07-25 06:42:51.720577] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:26:38.500 00:26:38.500 real 0m30.559s 00:26:38.500 user 0m47.376s 00:26:38.500 sys 0m4.497s 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:38.500 ************************************ 00:26:38.500 END TEST raid_rebuild_test_sb_io 00:26:38.500 ************************************ 00:26:38.500 06:42:51 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:26:38.500 06:42:51 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:26:38.500 06:42:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:38.500 06:42:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:38.500 06:42:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:38.500 ************************************ 00:26:38.500 START TEST raid_rebuild_test 00:26:38.500 ************************************ 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false false true 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:38.500 06:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1240317 00:26:38.500 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1240317 /var/tmp/spdk-raid.sock 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1240317 ']' 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:38.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:38.501 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:38.761 [2024-07-25 06:42:52.064550] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:26:38.761 [2024-07-25 06:42:52.064607] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1240317 ] 00:26:38.761 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:38.761 Zero copy mechanism will not be used. 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.0 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.1 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.2 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.3 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.4 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.5 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.6 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:01.7 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.0 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.1 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.2 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.3 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.4 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.5 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.6 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3d:02.7 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.0 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.1 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.2 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.3 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.4 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.5 cannot be used 00:26:38.761 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.6 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:01.7 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.0 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.1 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.2 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.3 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.4 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.5 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.6 cannot be used 00:26:38.761 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:26:38.761 EAL: Requested device 0000:3f:02.7 cannot be used 00:26:38.761 [2024-07-25 06:42:52.200743] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:38.761 [2024-07-25 06:42:52.245319] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:38.761 [2024-07-25 06:42:52.307769] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:38.761 [2024-07-25 06:42:52.307826] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:39.699 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:39.699 06:42:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:26:39.699 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:39.699 06:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:39.699 BaseBdev1_malloc 00:26:39.699 06:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:39.958 [2024-07-25 06:42:53.407376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:39.958 [2024-07-25 06:42:53.407420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:39.958 [2024-07-25 06:42:53.407444] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29327b0 00:26:39.958 [2024-07-25 06:42:53.407461] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:39.958 [2024-07-25 06:42:53.408970] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:39.958 [2024-07-25 06:42:53.408997] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:39.958 BaseBdev1 00:26:39.958 06:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:39.958 06:42:53 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:40.216 BaseBdev2_malloc 00:26:40.216 06:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:40.474 [2024-07-25 06:42:53.856822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:40.474 [2024-07-25 06:42:53.856864] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:40.474 [2024-07-25 06:42:53.856888] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27808f0 00:26:40.474 [2024-07-25 06:42:53.856900] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:40.474 [2024-07-25 06:42:53.858196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:40.474 [2024-07-25 06:42:53.858222] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:40.474 BaseBdev2 00:26:40.474 06:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:40.474 06:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:40.733 BaseBdev3_malloc 00:26:40.733 06:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:40.991 [2024-07-25 06:42:54.314274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:40.991 [2024-07-25 06:42:54.314317] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:40.992 [2024-07-25 06:42:54.314336] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2927330 00:26:40.992 [2024-07-25 06:42:54.314348] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:40.992 [2024-07-25 06:42:54.315683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:40.992 [2024-07-25 06:42:54.315710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:40.992 BaseBdev3 00:26:40.992 06:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:40.992 06:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:41.250 BaseBdev4_malloc 00:26:41.250 06:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:41.250 [2024-07-25 06:42:54.771766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:41.250 [2024-07-25 06:42:54.771808] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:41.250 [2024-07-25 06:42:54.771826] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2778240 00:26:41.250 [2024-07-25 06:42:54.771838] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:41.250 [2024-07-25 06:42:54.773191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:41.250 [2024-07-25 06:42:54.773217] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:41.250 BaseBdev4 00:26:41.250 06:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:41.509 spare_malloc 00:26:41.509 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:41.768 spare_delay 00:26:41.768 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:42.027 [2024-07-25 06:42:55.457950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:42.027 [2024-07-25 06:42:55.457993] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.027 [2024-07-25 06:42:55.458016] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2778e90 00:26:42.027 [2024-07-25 06:42:55.458028] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.027 [2024-07-25 06:42:55.459467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.027 [2024-07-25 06:42:55.459494] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:42.027 spare 00:26:42.027 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:42.286 [2024-07-25 06:42:55.682573] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:42.286 [2024-07-25 06:42:55.683702] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:42.286 [2024-07-25 06:42:55.683753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:42.286 [2024-07-25 06:42:55.683793] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:42.286 [2024-07-25 06:42:55.683871] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x277a190 00:26:42.286 [2024-07-25 06:42:55.683881] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:42.286 [2024-07-25 06:42:55.684071] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2777a70 00:26:42.286 [2024-07-25 06:42:55.684214] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x277a190 00:26:42.286 [2024-07-25 06:42:55.684224] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x277a190 00:26:42.286 [2024-07-25 06:42:55.684326] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.286 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.544 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.544 "name": "raid_bdev1", 00:26:42.544 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:42.544 "strip_size_kb": 0, 00:26:42.544 "state": "online", 00:26:42.544 "raid_level": "raid1", 00:26:42.544 "superblock": false, 00:26:42.544 "num_base_bdevs": 4, 00:26:42.544 "num_base_bdevs_discovered": 4, 00:26:42.544 "num_base_bdevs_operational": 4, 00:26:42.544 "base_bdevs_list": [ 00:26:42.544 { 00:26:42.544 "name": "BaseBdev1", 00:26:42.544 "uuid": "6d496a3b-1d47-519c-8591-ce6017ab0dfb", 00:26:42.544 "is_configured": true, 00:26:42.544 "data_offset": 0, 00:26:42.544 "data_size": 65536 00:26:42.544 }, 00:26:42.544 { 00:26:42.544 "name": "BaseBdev2", 00:26:42.544 "uuid": "60c95487-8f97-5633-9257-bc035d116de9", 00:26:42.544 "is_configured": true, 00:26:42.544 "data_offset": 0, 00:26:42.544 "data_size": 65536 00:26:42.544 }, 00:26:42.544 { 00:26:42.544 "name": "BaseBdev3", 00:26:42.544 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:42.544 "is_configured": true, 00:26:42.544 "data_offset": 0, 00:26:42.544 "data_size": 65536 00:26:42.544 }, 00:26:42.544 { 00:26:42.544 "name": "BaseBdev4", 00:26:42.544 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:42.544 "is_configured": true, 00:26:42.544 "data_offset": 0, 00:26:42.544 "data_size": 65536 00:26:42.544 } 00:26:42.544 ] 00:26:42.544 }' 00:26:42.544 06:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.544 06:42:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:43.110 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:43.110 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:26:43.369 [2024-07-25 06:42:56.717542] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:43.369 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:26:43.369 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:43.369 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:43.628 06:42:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:43.628 [2024-07-25 06:42:57.178529] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2777a70 00:26:43.887 /dev/nbd0 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:43.887 1+0 records in 00:26:43.887 1+0 records out 00:26:43.887 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254225 s, 16.1 MB/s 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:26:43.887 06:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:26:50.456 65536+0 records in 00:26:50.456 65536+0 records out 00:26:50.456 33554432 bytes (34 MB, 32 MiB) copied, 5.86351 s, 5.7 MB/s 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:50.456 [2024-07-25 06:43:03.360925] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:50.456 [2024-07-25 06:43:03.577531] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:50.456 06:43:03 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.456 "name": "raid_bdev1", 00:26:50.456 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:50.456 "strip_size_kb": 0, 00:26:50.456 "state": "online", 00:26:50.456 "raid_level": "raid1", 00:26:50.456 "superblock": false, 00:26:50.456 "num_base_bdevs": 4, 00:26:50.456 "num_base_bdevs_discovered": 3, 00:26:50.456 "num_base_bdevs_operational": 3, 00:26:50.456 "base_bdevs_list": [ 00:26:50.456 { 00:26:50.456 "name": null, 00:26:50.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.456 "is_configured": false, 00:26:50.456 "data_offset": 0, 00:26:50.456 "data_size": 65536 00:26:50.456 }, 00:26:50.456 { 00:26:50.456 "name": "BaseBdev2", 00:26:50.456 "uuid": "60c95487-8f97-5633-9257-bc035d116de9", 00:26:50.456 "is_configured": true, 00:26:50.456 "data_offset": 0, 00:26:50.456 "data_size": 65536 00:26:50.456 }, 00:26:50.456 { 00:26:50.456 "name": "BaseBdev3", 00:26:50.456 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:50.456 "is_configured": true, 00:26:50.456 "data_offset": 0, 00:26:50.456 "data_size": 65536 00:26:50.456 }, 00:26:50.456 { 00:26:50.456 "name": "BaseBdev4", 00:26:50.456 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:50.456 "is_configured": true, 00:26:50.456 "data_offset": 0, 00:26:50.456 "data_size": 65536 00:26:50.456 } 00:26:50.456 ] 00:26:50.456 }' 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.456 06:43:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:51.023 06:43:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:51.283 [2024-07-25 06:43:04.604251] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:51.283 [2024-07-25 06:43:04.608076] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2777a70 00:26:51.283 [2024-07-25 06:43:04.610127] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:51.283 06:43:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:52.249 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:52.249 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:52.249 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:52.249 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:52.249 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:52.249 06:43:05 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.249 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.508 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:52.508 "name": "raid_bdev1", 00:26:52.508 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:52.508 "strip_size_kb": 0, 00:26:52.508 "state": "online", 00:26:52.508 "raid_level": "raid1", 00:26:52.508 "superblock": false, 00:26:52.508 "num_base_bdevs": 4, 00:26:52.508 "num_base_bdevs_discovered": 4, 00:26:52.509 "num_base_bdevs_operational": 4, 00:26:52.509 "process": { 00:26:52.509 "type": "rebuild", 00:26:52.509 "target": "spare", 00:26:52.509 "progress": { 00:26:52.509 "blocks": 24576, 00:26:52.509 "percent": 37 00:26:52.509 } 00:26:52.509 }, 00:26:52.509 "base_bdevs_list": [ 00:26:52.509 { 00:26:52.509 "name": "spare", 00:26:52.509 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:52.509 "is_configured": true, 00:26:52.509 "data_offset": 0, 00:26:52.509 "data_size": 65536 00:26:52.509 }, 00:26:52.509 { 00:26:52.509 "name": "BaseBdev2", 00:26:52.509 "uuid": "60c95487-8f97-5633-9257-bc035d116de9", 00:26:52.509 "is_configured": true, 00:26:52.509 "data_offset": 0, 00:26:52.509 "data_size": 65536 00:26:52.509 }, 00:26:52.509 { 00:26:52.509 "name": "BaseBdev3", 00:26:52.509 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:52.509 "is_configured": true, 00:26:52.509 "data_offset": 0, 00:26:52.509 "data_size": 65536 00:26:52.509 }, 00:26:52.509 { 00:26:52.509 "name": "BaseBdev4", 00:26:52.509 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:52.509 "is_configured": true, 00:26:52.509 "data_offset": 0, 00:26:52.509 "data_size": 65536 00:26:52.509 } 00:26:52.509 ] 00:26:52.509 }' 00:26:52.509 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:52.509 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:52.509 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:52.509 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:52.509 06:43:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:52.768 [2024-07-25 06:43:06.159122] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:52.768 [2024-07-25 06:43:06.221861] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:52.768 [2024-07-25 06:43:06.221907] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:52.768 [2024-07-25 06:43:06.221923] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:52.768 [2024-07-25 06:43:06.221931] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:52.768 
06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.768 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.026 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.026 "name": "raid_bdev1", 00:26:53.026 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:53.026 "strip_size_kb": 0, 00:26:53.026 "state": "online", 00:26:53.026 "raid_level": "raid1", 00:26:53.026 "superblock": false, 00:26:53.026 "num_base_bdevs": 4, 00:26:53.026 "num_base_bdevs_discovered": 3, 00:26:53.026 "num_base_bdevs_operational": 3, 00:26:53.026 "base_bdevs_list": [ 00:26:53.026 { 00:26:53.026 "name": null, 00:26:53.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.026 "is_configured": false, 00:26:53.026 "data_offset": 0, 00:26:53.026 "data_size": 65536 00:26:53.026 }, 00:26:53.026 { 00:26:53.026 "name": "BaseBdev2", 00:26:53.026 "uuid": "60c95487-8f97-5633-9257-bc035d116de9", 00:26:53.026 "is_configured": true, 00:26:53.026 "data_offset": 0, 00:26:53.026 "data_size": 65536 00:26:53.026 }, 00:26:53.026 { 00:26:53.026 "name": "BaseBdev3", 00:26:53.026 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:53.026 "is_configured": true, 00:26:53.026 "data_offset": 0, 00:26:53.026 "data_size": 65536 00:26:53.026 }, 00:26:53.026 { 00:26:53.026 "name": "BaseBdev4", 00:26:53.026 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:53.026 "is_configured": true, 00:26:53.026 "data_offset": 0, 00:26:53.026 "data_size": 65536 00:26:53.026 } 00:26:53.026 ] 00:26:53.026 }' 00:26:53.026 06:43:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.026 06:43:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.592 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.851 06:43:07 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.851 "name": "raid_bdev1", 00:26:53.851 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:53.851 "strip_size_kb": 0, 00:26:53.851 "state": "online", 00:26:53.851 "raid_level": "raid1", 00:26:53.851 "superblock": false, 00:26:53.851 "num_base_bdevs": 4, 00:26:53.851 "num_base_bdevs_discovered": 3, 00:26:53.851 "num_base_bdevs_operational": 3, 00:26:53.851 "base_bdevs_list": [ 00:26:53.851 { 00:26:53.851 "name": null, 00:26:53.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.851 "is_configured": false, 00:26:53.851 "data_offset": 0, 00:26:53.851 "data_size": 65536 00:26:53.851 }, 00:26:53.851 { 00:26:53.851 "name": "BaseBdev2", 00:26:53.851 "uuid": "60c95487-8f97-5633-9257-bc035d116de9", 00:26:53.851 "is_configured": true, 00:26:53.851 "data_offset": 0, 00:26:53.851 "data_size": 65536 00:26:53.851 }, 00:26:53.851 { 00:26:53.851 "name": "BaseBdev3", 00:26:53.851 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:53.851 "is_configured": true, 00:26:53.851 "data_offset": 0, 00:26:53.851 "data_size": 65536 00:26:53.851 }, 00:26:53.851 { 00:26:53.851 "name": "BaseBdev4", 00:26:53.851 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:53.851 "is_configured": true, 00:26:53.851 "data_offset": 0, 00:26:53.851 "data_size": 65536 00:26:53.851 } 00:26:53.851 ] 00:26:53.851 }' 00:26:53.851 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.851 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:53.851 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.851 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:53.851 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:54.110 [2024-07-25 06:43:07.553217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:54.110 [2024-07-25 06:43:07.556991] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2777a70 00:26:54.110 [2024-07-25 06:43:07.558366] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:54.110 06:43:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.048 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.307 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:55.307 "name": "raid_bdev1", 00:26:55.307 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:55.307 
"strip_size_kb": 0, 00:26:55.307 "state": "online", 00:26:55.307 "raid_level": "raid1", 00:26:55.307 "superblock": false, 00:26:55.307 "num_base_bdevs": 4, 00:26:55.307 "num_base_bdevs_discovered": 4, 00:26:55.307 "num_base_bdevs_operational": 4, 00:26:55.307 "process": { 00:26:55.307 "type": "rebuild", 00:26:55.307 "target": "spare", 00:26:55.307 "progress": { 00:26:55.307 "blocks": 24576, 00:26:55.307 "percent": 37 00:26:55.307 } 00:26:55.307 }, 00:26:55.307 "base_bdevs_list": [ 00:26:55.307 { 00:26:55.307 "name": "spare", 00:26:55.307 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:55.307 "is_configured": true, 00:26:55.307 "data_offset": 0, 00:26:55.307 "data_size": 65536 00:26:55.307 }, 00:26:55.307 { 00:26:55.307 "name": "BaseBdev2", 00:26:55.307 "uuid": "60c95487-8f97-5633-9257-bc035d116de9", 00:26:55.307 "is_configured": true, 00:26:55.307 "data_offset": 0, 00:26:55.307 "data_size": 65536 00:26:55.307 }, 00:26:55.307 { 00:26:55.307 "name": "BaseBdev3", 00:26:55.307 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:55.307 "is_configured": true, 00:26:55.307 "data_offset": 0, 00:26:55.307 "data_size": 65536 00:26:55.307 }, 00:26:55.307 { 00:26:55.307 "name": "BaseBdev4", 00:26:55.307 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:55.307 "is_configured": true, 00:26:55.307 "data_offset": 0, 00:26:55.307 "data_size": 65536 00:26:55.307 } 00:26:55.307 ] 00:26:55.307 }' 00:26:55.307 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:55.307 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:55.307 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:55.566 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:55.566 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:26:55.566 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:26:55.566 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:55.566 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:26:55.566 06:43:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:55.566 [2024-07-25 06:43:09.111344] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:55.825 [2024-07-25 06:43:09.170059] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2777a70 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.825 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.085 "name": "raid_bdev1", 00:26:56.085 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:56.085 "strip_size_kb": 0, 00:26:56.085 "state": "online", 00:26:56.085 "raid_level": "raid1", 00:26:56.085 "superblock": false, 00:26:56.085 "num_base_bdevs": 4, 00:26:56.085 "num_base_bdevs_discovered": 3, 00:26:56.085 "num_base_bdevs_operational": 3, 00:26:56.085 "process": { 00:26:56.085 "type": "rebuild", 00:26:56.085 "target": "spare", 00:26:56.085 "progress": { 00:26:56.085 "blocks": 36864, 00:26:56.085 "percent": 56 00:26:56.085 } 00:26:56.085 }, 00:26:56.085 "base_bdevs_list": [ 00:26:56.085 { 00:26:56.085 "name": "spare", 00:26:56.085 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:56.085 "is_configured": true, 00:26:56.085 "data_offset": 0, 00:26:56.085 "data_size": 65536 00:26:56.085 }, 00:26:56.085 { 00:26:56.085 "name": null, 00:26:56.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.085 "is_configured": false, 00:26:56.085 "data_offset": 0, 00:26:56.085 "data_size": 65536 00:26:56.085 }, 00:26:56.085 { 00:26:56.085 "name": "BaseBdev3", 00:26:56.085 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:56.085 "is_configured": true, 00:26:56.085 "data_offset": 0, 00:26:56.085 "data_size": 65536 00:26:56.085 }, 00:26:56.085 { 00:26:56.085 "name": "BaseBdev4", 00:26:56.085 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:56.085 "is_configured": true, 00:26:56.085 "data_offset": 0, 00:26:56.085 "data_size": 65536 00:26:56.085 } 00:26:56.085 ] 00:26:56.085 }' 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=838 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.085 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.345 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.345 "name": "raid_bdev1", 00:26:56.345 "uuid": 
"fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:56.345 "strip_size_kb": 0, 00:26:56.345 "state": "online", 00:26:56.345 "raid_level": "raid1", 00:26:56.345 "superblock": false, 00:26:56.345 "num_base_bdevs": 4, 00:26:56.345 "num_base_bdevs_discovered": 3, 00:26:56.345 "num_base_bdevs_operational": 3, 00:26:56.345 "process": { 00:26:56.345 "type": "rebuild", 00:26:56.345 "target": "spare", 00:26:56.345 "progress": { 00:26:56.345 "blocks": 43008, 00:26:56.345 "percent": 65 00:26:56.345 } 00:26:56.345 }, 00:26:56.345 "base_bdevs_list": [ 00:26:56.345 { 00:26:56.345 "name": "spare", 00:26:56.345 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:56.345 "is_configured": true, 00:26:56.345 "data_offset": 0, 00:26:56.345 "data_size": 65536 00:26:56.345 }, 00:26:56.345 { 00:26:56.345 "name": null, 00:26:56.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.345 "is_configured": false, 00:26:56.345 "data_offset": 0, 00:26:56.345 "data_size": 65536 00:26:56.345 }, 00:26:56.345 { 00:26:56.345 "name": "BaseBdev3", 00:26:56.345 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:56.345 "is_configured": true, 00:26:56.345 "data_offset": 0, 00:26:56.345 "data_size": 65536 00:26:56.345 }, 00:26:56.345 { 00:26:56.345 "name": "BaseBdev4", 00:26:56.345 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:56.345 "is_configured": true, 00:26:56.345 "data_offset": 0, 00:26:56.345 "data_size": 65536 00:26:56.345 } 00:26:56.345 ] 00:26:56.345 }' 00:26:56.345 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.345 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:56.345 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.345 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.345 06:43:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:57.282 [2024-07-25 06:43:10.781749] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:57.282 [2024-07-25 06:43:10.781806] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:57.282 [2024-07-25 06:43:10.781841] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.542 06:43:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.542 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.542 "name": "raid_bdev1", 00:26:57.542 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:57.542 
"strip_size_kb": 0, 00:26:57.542 "state": "online", 00:26:57.542 "raid_level": "raid1", 00:26:57.542 "superblock": false, 00:26:57.542 "num_base_bdevs": 4, 00:26:57.542 "num_base_bdevs_discovered": 3, 00:26:57.542 "num_base_bdevs_operational": 3, 00:26:57.542 "base_bdevs_list": [ 00:26:57.542 { 00:26:57.542 "name": "spare", 00:26:57.542 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:57.542 "is_configured": true, 00:26:57.542 "data_offset": 0, 00:26:57.542 "data_size": 65536 00:26:57.542 }, 00:26:57.542 { 00:26:57.542 "name": null, 00:26:57.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.542 "is_configured": false, 00:26:57.542 "data_offset": 0, 00:26:57.542 "data_size": 65536 00:26:57.542 }, 00:26:57.542 { 00:26:57.542 "name": "BaseBdev3", 00:26:57.542 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:57.542 "is_configured": true, 00:26:57.542 "data_offset": 0, 00:26:57.542 "data_size": 65536 00:26:57.542 }, 00:26:57.542 { 00:26:57.542 "name": "BaseBdev4", 00:26:57.542 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:57.542 "is_configured": true, 00:26:57.542 "data_offset": 0, 00:26:57.542 "data_size": 65536 00:26:57.542 } 00:26:57.542 ] 00:26:57.542 }' 00:26:57.542 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.801 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:58.061 "name": "raid_bdev1", 00:26:58.061 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:58.061 "strip_size_kb": 0, 00:26:58.061 "state": "online", 00:26:58.061 "raid_level": "raid1", 00:26:58.061 "superblock": false, 00:26:58.061 "num_base_bdevs": 4, 00:26:58.061 "num_base_bdevs_discovered": 3, 00:26:58.061 "num_base_bdevs_operational": 3, 00:26:58.061 "base_bdevs_list": [ 00:26:58.061 { 00:26:58.061 "name": "spare", 00:26:58.061 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:58.061 "is_configured": true, 00:26:58.061 "data_offset": 0, 00:26:58.061 "data_size": 65536 00:26:58.061 }, 00:26:58.061 { 00:26:58.061 "name": null, 00:26:58.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.061 "is_configured": false, 00:26:58.061 "data_offset": 0, 00:26:58.061 "data_size": 65536 00:26:58.061 }, 00:26:58.061 { 00:26:58.061 "name": 
"BaseBdev3", 00:26:58.061 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:58.061 "is_configured": true, 00:26:58.061 "data_offset": 0, 00:26:58.061 "data_size": 65536 00:26:58.061 }, 00:26:58.061 { 00:26:58.061 "name": "BaseBdev4", 00:26:58.061 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:58.061 "is_configured": true, 00:26:58.061 "data_offset": 0, 00:26:58.061 "data_size": 65536 00:26:58.061 } 00:26:58.061 ] 00:26:58.061 }' 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.061 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.321 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.321 "name": "raid_bdev1", 00:26:58.321 "uuid": "fffbb513-a5a6-446e-a58e-c84abf8b0174", 00:26:58.321 "strip_size_kb": 0, 00:26:58.321 "state": "online", 00:26:58.321 "raid_level": "raid1", 00:26:58.321 "superblock": false, 00:26:58.321 "num_base_bdevs": 4, 00:26:58.321 "num_base_bdevs_discovered": 3, 00:26:58.321 "num_base_bdevs_operational": 3, 00:26:58.321 "base_bdevs_list": [ 00:26:58.321 { 00:26:58.321 "name": "spare", 00:26:58.321 "uuid": "2e178e4f-4c99-55a1-b5aa-2bedf79d2407", 00:26:58.321 "is_configured": true, 00:26:58.321 "data_offset": 0, 00:26:58.321 "data_size": 65536 00:26:58.321 }, 00:26:58.321 { 00:26:58.321 "name": null, 00:26:58.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.321 "is_configured": false, 00:26:58.321 "data_offset": 0, 00:26:58.321 "data_size": 65536 00:26:58.321 }, 00:26:58.321 { 00:26:58.321 "name": "BaseBdev3", 00:26:58.321 "uuid": "24f60bfa-6530-5666-bda4-ec167405f670", 00:26:58.321 "is_configured": true, 00:26:58.321 "data_offset": 0, 00:26:58.321 "data_size": 65536 00:26:58.321 }, 00:26:58.321 { 00:26:58.321 "name": "BaseBdev4", 00:26:58.321 "uuid": "da008da1-754e-5a4d-a7dc-b2fcaadd7d15", 00:26:58.321 
"is_configured": true, 00:26:58.321 "data_offset": 0, 00:26:58.321 "data_size": 65536 00:26:58.321 } 00:26:58.321 ] 00:26:58.321 }' 00:26:58.321 06:43:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.321 06:43:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:58.890 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:59.149 [2024-07-25 06:43:12.498699] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:59.149 [2024-07-25 06:43:12.498725] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:59.149 [2024-07-25 06:43:12.498782] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:59.149 [2024-07-25 06:43:12.498849] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:59.149 [2024-07-25 06:43:12.498860] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x277a190 name raid_bdev1, state offline 00:26:59.149 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.149 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:59.409 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:59.669 /dev/nbd0 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( 
i <= 20 )) 00:26:59.669 06:43:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.669 1+0 records in 00:26:59.669 1+0 records out 00:26:59.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243585 s, 16.8 MB/s 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:59.669 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:59.928 /dev/nbd1 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:59.928 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.929 1+0 records in 00:26:59.929 1+0 records out 00:26:59.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299877 s, 13.7 MB/s 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:59.929 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.187 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1240317 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 
-- # '[' -z 1240317 ']' 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1240317 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1240317 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1240317' 00:27:00.446 killing process with pid 1240317 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1240317 00:27:00.446 Received shutdown signal, test time was about 60.000000 seconds 00:27:00.446 00:27:00.446 Latency(us) 00:27:00.446 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:00.446 =================================================================================================================== 00:27:00.446 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:00.446 [2024-07-25 06:43:13.948956] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:00.446 06:43:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1240317 00:27:00.446 [2024-07-25 06:43:13.989600] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:00.705 06:43:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:27:00.705 00:27:00.706 real 0m22.169s 00:27:00.706 user 0m30.631s 00:27:00.706 sys 0m4.576s 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:27:00.706 ************************************ 00:27:00.706 END TEST raid_rebuild_test 00:27:00.706 ************************************ 00:27:00.706 06:43:14 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:27:00.706 06:43:14 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:27:00.706 06:43:14 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:00.706 06:43:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:00.706 ************************************ 00:27:00.706 START TEST raid_rebuild_test_sb 00:27:00.706 ************************************ 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true false true 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # 
(( i <= num_base_bdevs )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:00.706 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1244194 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1244194 /var/tmp/spdk-raid.sock 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1244194 ']' 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:00.965 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
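For orientation, the launch sequence traced above reduces to roughly the following shell steps. Paths are shortened to be relative to the SPDK checkout used in this job, waitforlisten is the helper sourced from test/common/autotest_common.sh, and the flag string is copied verbatim from the trace; this is a condensed sketch, not the literal test script.
# Start bdevperf as the RPC target that will host the raid bdev under test
./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
    -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
# Block until the bdevperf process answers on its UNIX-domain RPC socket
waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock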
00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:00.965 06:43:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:00.965 [2024-07-25 06:43:14.321804] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:27:00.965 [2024-07-25 06:43:14.321861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1244194 ] 00:27:00.965 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:00.965 Zero copy mechanism will not be used. 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.4 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3d:02.7 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.3 
cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.965 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:00.965 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.966 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:00.966 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:00.966 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:00.966 [2024-07-25 06:43:14.458068] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:00.966 [2024-07-25 06:43:14.502905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:01.224 [2024-07-25 06:43:14.564612] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.224 [2024-07-25 06:43:14.564648] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.792 06:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:01.792 06:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:27:01.792 06:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:01.792 06:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:02.051 BaseBdev1_malloc 00:27:02.051 06:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:02.310 [2024-07-25 06:43:15.663504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:02.310 [2024-07-25 06:43:15.663546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.310 [2024-07-25 06:43:15.663567] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16467b0 00:27:02.310 [2024-07-25 06:43:15.663578] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.310 [2024-07-25 06:43:15.665116] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
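(The passthru registration messages for BaseBdev1 continue just below.) Each of the four base devices in this run is a 32 MiB malloc bdev with 512-byte blocks, wrapped in a passthru bdev. Condensed into loop form, with RPC names and arguments taken from the trace and rpc.py shown relative to the SPDK checkout; the loop itself is illustrative, not the script's exact code.
for i in 1 2 3 4; do
    # Create the backing malloc bdev, then claim it with a passthru bdev named BaseBdev$i
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
done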
00:27:02.310 [2024-07-25 06:43:15.665152] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:02.310 BaseBdev1 00:27:02.310 06:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:02.310 06:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:02.570 BaseBdev2_malloc 00:27:02.570 06:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:02.570 [2024-07-25 06:43:16.121104] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:02.570 [2024-07-25 06:43:16.121153] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.570 [2024-07-25 06:43:16.121174] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14948f0 00:27:02.570 [2024-07-25 06:43:16.121186] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.570 [2024-07-25 06:43:16.122520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.570 [2024-07-25 06:43:16.122551] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:02.570 BaseBdev2 00:27:02.829 06:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:02.829 06:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:02.829 BaseBdev3_malloc 00:27:02.829 06:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:03.088 [2024-07-25 06:43:16.578500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:03.088 [2024-07-25 06:43:16.578542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.088 [2024-07-25 06:43:16.578560] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x163b330 00:27:03.088 [2024-07-25 06:43:16.578572] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.088 [2024-07-25 06:43:16.579895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.088 [2024-07-25 06:43:16.579922] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:03.088 BaseBdev3 00:27:03.088 06:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:03.088 06:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:03.348 BaseBdev4_malloc 00:27:03.348 06:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:03.606 [2024-07-25 06:43:17.032032] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
BaseBdev4_malloc 00:27:03.606 [2024-07-25 06:43:17.032073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.606 [2024-07-25 06:43:17.032092] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148c240 00:27:03.606 [2024-07-25 06:43:17.032103] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.606 [2024-07-25 06:43:17.033437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.606 [2024-07-25 06:43:17.033463] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:03.606 BaseBdev4 00:27:03.606 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:03.866 spare_malloc 00:27:03.866 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:04.125 spare_delay 00:27:04.125 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:04.383 [2024-07-25 06:43:17.726222] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:04.383 [2024-07-25 06:43:17.726262] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:04.383 [2024-07-25 06:43:17.726283] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148ce90 00:27:04.383 [2024-07-25 06:43:17.726294] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:04.383 [2024-07-25 06:43:17.727574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:04.383 [2024-07-25 06:43:17.727599] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:04.383 spare 00:27:04.383 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:04.642 [2024-07-25 06:43:17.962866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:04.642 [2024-07-25 06:43:17.963978] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:04.642 [2024-07-25 06:43:17.964028] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:04.642 [2024-07-25 06:43:17.964067] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:04.642 [2024-07-25 06:43:17.964249] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x148e190 00:27:04.642 [2024-07-25 06:43:17.964260] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:04.642 [2024-07-25 06:43:17.964434] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148ba70 00:27:04.642 [2024-07-25 06:43:17.964564] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x148e190 00:27:04.642 [2024-07-25 06:43:17.964574] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x148e190 00:27:04.642 
[2024-07-25 06:43:17.964658] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.642 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.643 06:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.932 06:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.932 "name": "raid_bdev1", 00:27:04.932 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:04.932 "strip_size_kb": 0, 00:27:04.932 "state": "online", 00:27:04.932 "raid_level": "raid1", 00:27:04.932 "superblock": true, 00:27:04.932 "num_base_bdevs": 4, 00:27:04.932 "num_base_bdevs_discovered": 4, 00:27:04.932 "num_base_bdevs_operational": 4, 00:27:04.932 "base_bdevs_list": [ 00:27:04.932 { 00:27:04.932 "name": "BaseBdev1", 00:27:04.932 "uuid": "50bf7b97-8a99-5ad1-b748-52bbeab78ef1", 00:27:04.933 "is_configured": true, 00:27:04.933 "data_offset": 2048, 00:27:04.933 "data_size": 63488 00:27:04.933 }, 00:27:04.933 { 00:27:04.933 "name": "BaseBdev2", 00:27:04.933 "uuid": "ada61370-4bc5-5b21-a0de-db765310285a", 00:27:04.933 "is_configured": true, 00:27:04.933 "data_offset": 2048, 00:27:04.933 "data_size": 63488 00:27:04.933 }, 00:27:04.933 { 00:27:04.933 "name": "BaseBdev3", 00:27:04.933 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:04.933 "is_configured": true, 00:27:04.933 "data_offset": 2048, 00:27:04.933 "data_size": 63488 00:27:04.933 }, 00:27:04.933 { 00:27:04.933 "name": "BaseBdev4", 00:27:04.933 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:04.933 "is_configured": true, 00:27:04.933 "data_offset": 2048, 00:27:04.933 "data_size": 63488 00:27:04.933 } 00:27:04.933 ] 00:27:04.933 }' 00:27:04.933 06:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.933 06:43:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:05.536 06:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:05.536 06:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:27:05.536 [2024-07-25 06:43:18.993851] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: 
raid_bdev_dump_config_json 00:27:05.536 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:27:05.536 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:05.536 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:05.795 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:06.054 [2024-07-25 06:43:19.458842] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148ba70 00:27:06.054 /dev/nbd0 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:06.054 1+0 records in 00:27:06.054 1+0 records out 00:27:06.054 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252268 s, 
16.2 MB/s 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:27:06.054 06:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:27:14.169 63488+0 records in 00:27:14.169 63488+0 records out 00:27:14.169 32505856 bytes (33 MB, 31 MiB) copied, 6.74808 s, 4.8 MB/s 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:14.169 [2024-07-25 06:43:26.514080] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:14.169 [2024-07-25 06:43:26.738709] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 
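Condensing the steps just traced: the superblock RAID1 array is assembled from the four base bdevs, exported over NBD and filled with random data, and then BaseBdev1 is removed so that a rebuild onto the spare can be exercised. Roughly, with commands copied from the trace and rpc.py shown relative to the SPDK checkout (a summary sketch, not the script itself):
# Assemble a 4-member RAID1 with an on-disk superblock (-s); the trace reports
# data_offset 2048 and data_size 63488 (512-byte blocks) for each member
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
# Expose the array as /dev/nbd0 and fill all 63488 data blocks with random data
scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
# Remove BaseBdev1: the array stays online with 3 of 4 members, ready to rebuild onto the spare
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1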
00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.169 "name": "raid_bdev1", 00:27:14.169 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:14.169 "strip_size_kb": 0, 00:27:14.169 "state": "online", 00:27:14.169 "raid_level": "raid1", 00:27:14.169 "superblock": true, 00:27:14.169 "num_base_bdevs": 4, 00:27:14.169 "num_base_bdevs_discovered": 3, 00:27:14.169 "num_base_bdevs_operational": 3, 00:27:14.169 "base_bdevs_list": [ 00:27:14.169 { 00:27:14.169 "name": null, 00:27:14.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.169 "is_configured": false, 00:27:14.169 "data_offset": 2048, 00:27:14.169 "data_size": 63488 00:27:14.169 }, 00:27:14.169 { 00:27:14.169 "name": "BaseBdev2", 00:27:14.169 "uuid": "ada61370-4bc5-5b21-a0de-db765310285a", 00:27:14.169 "is_configured": true, 00:27:14.169 "data_offset": 2048, 00:27:14.169 "data_size": 63488 00:27:14.169 }, 00:27:14.169 { 00:27:14.169 "name": "BaseBdev3", 00:27:14.169 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:14.169 "is_configured": true, 00:27:14.169 "data_offset": 2048, 00:27:14.169 "data_size": 63488 00:27:14.169 }, 00:27:14.169 { 00:27:14.169 "name": "BaseBdev4", 00:27:14.169 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:14.169 "is_configured": true, 00:27:14.169 "data_offset": 2048, 00:27:14.169 "data_size": 63488 00:27:14.169 } 00:27:14.169 ] 00:27:14.169 }' 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.169 06:43:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:14.170 06:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:14.428 [2024-07-25 06:43:27.773446] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:14.428 [2024-07-25 06:43:27.777237] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148ba70 00:27:14.428 [2024-07-25 06:43:27.779275] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:14.428 06:43:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.364 06:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.623 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:15.623 "name": "raid_bdev1", 00:27:15.623 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:15.623 "strip_size_kb": 0, 00:27:15.623 "state": "online", 00:27:15.623 "raid_level": "raid1", 00:27:15.623 "superblock": true, 00:27:15.623 "num_base_bdevs": 4, 00:27:15.623 "num_base_bdevs_discovered": 4, 00:27:15.623 "num_base_bdevs_operational": 4, 00:27:15.623 "process": { 00:27:15.623 "type": "rebuild", 00:27:15.623 "target": "spare", 00:27:15.623 "progress": { 00:27:15.623 "blocks": 24576, 00:27:15.623 "percent": 38 00:27:15.623 } 00:27:15.623 }, 00:27:15.623 "base_bdevs_list": [ 00:27:15.623 { 00:27:15.623 "name": "spare", 00:27:15.623 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:15.623 "is_configured": true, 00:27:15.624 "data_offset": 2048, 00:27:15.624 "data_size": 63488 00:27:15.624 }, 00:27:15.624 { 00:27:15.624 "name": "BaseBdev2", 00:27:15.624 "uuid": "ada61370-4bc5-5b21-a0de-db765310285a", 00:27:15.624 "is_configured": true, 00:27:15.624 "data_offset": 2048, 00:27:15.624 "data_size": 63488 00:27:15.624 }, 00:27:15.624 { 00:27:15.624 "name": "BaseBdev3", 00:27:15.624 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:15.624 "is_configured": true, 00:27:15.624 "data_offset": 2048, 00:27:15.624 "data_size": 63488 00:27:15.624 }, 00:27:15.624 { 00:27:15.624 "name": "BaseBdev4", 00:27:15.624 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:15.624 "is_configured": true, 00:27:15.624 "data_offset": 2048, 00:27:15.624 "data_size": 63488 00:27:15.624 } 00:27:15.624 ] 00:27:15.624 }' 00:27:15.624 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:15.624 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:15.624 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:15.624 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:15.624 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:15.883 [2024-07-25 06:43:29.328291] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:15.883 [2024-07-25 06:43:29.390984] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:15.883 [2024-07-25 06:43:29.391024] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.883 [2024-07-25 06:43:29.391040] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:15.883 [2024-07-25 06:43:29.391052] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.883 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.142 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.142 "name": "raid_bdev1", 00:27:16.142 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:16.142 "strip_size_kb": 0, 00:27:16.142 "state": "online", 00:27:16.142 "raid_level": "raid1", 00:27:16.142 "superblock": true, 00:27:16.142 "num_base_bdevs": 4, 00:27:16.142 "num_base_bdevs_discovered": 3, 00:27:16.142 "num_base_bdevs_operational": 3, 00:27:16.142 "base_bdevs_list": [ 00:27:16.142 { 00:27:16.142 "name": null, 00:27:16.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.142 "is_configured": false, 00:27:16.142 "data_offset": 2048, 00:27:16.142 "data_size": 63488 00:27:16.142 }, 00:27:16.142 { 00:27:16.142 "name": "BaseBdev2", 00:27:16.142 "uuid": "ada61370-4bc5-5b21-a0de-db765310285a", 00:27:16.142 "is_configured": true, 00:27:16.142 "data_offset": 2048, 00:27:16.142 "data_size": 63488 00:27:16.142 }, 00:27:16.142 { 00:27:16.142 "name": "BaseBdev3", 00:27:16.142 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:16.142 "is_configured": true, 00:27:16.142 "data_offset": 2048, 00:27:16.142 "data_size": 63488 00:27:16.142 }, 00:27:16.142 { 00:27:16.142 "name": "BaseBdev4", 00:27:16.142 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:16.142 "is_configured": true, 00:27:16.142 "data_offset": 2048, 00:27:16.142 "data_size": 63488 00:27:16.142 } 00:27:16.142 ] 00:27:16.142 }' 00:27:16.142 06:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.142 06:43:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:16.709 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:16.709 06:43:30 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.710 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:16.710 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:16.710 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.710 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.710 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.968 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.968 "name": "raid_bdev1", 00:27:16.968 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:16.968 "strip_size_kb": 0, 00:27:16.968 "state": "online", 00:27:16.968 "raid_level": "raid1", 00:27:16.968 "superblock": true, 00:27:16.968 "num_base_bdevs": 4, 00:27:16.968 "num_base_bdevs_discovered": 3, 00:27:16.968 "num_base_bdevs_operational": 3, 00:27:16.968 "base_bdevs_list": [ 00:27:16.968 { 00:27:16.969 "name": null, 00:27:16.969 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.969 "is_configured": false, 00:27:16.969 "data_offset": 2048, 00:27:16.969 "data_size": 63488 00:27:16.969 }, 00:27:16.969 { 00:27:16.969 "name": "BaseBdev2", 00:27:16.969 "uuid": "ada61370-4bc5-5b21-a0de-db765310285a", 00:27:16.969 "is_configured": true, 00:27:16.969 "data_offset": 2048, 00:27:16.969 "data_size": 63488 00:27:16.969 }, 00:27:16.969 { 00:27:16.969 "name": "BaseBdev3", 00:27:16.969 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:16.969 "is_configured": true, 00:27:16.969 "data_offset": 2048, 00:27:16.969 "data_size": 63488 00:27:16.969 }, 00:27:16.969 { 00:27:16.969 "name": "BaseBdev4", 00:27:16.969 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:16.969 "is_configured": true, 00:27:16.969 "data_offset": 2048, 00:27:16.969 "data_size": 63488 00:27:16.969 } 00:27:16.969 ] 00:27:16.969 }' 00:27:16.969 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.969 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:16.969 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.969 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:16.969 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:17.227 [2024-07-25 06:43:30.710333] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:17.227 [2024-07-25 06:43:30.714106] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148ba70 00:27:17.227 [2024-07-25 06:43:30.715478] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:17.227 06:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.605 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:18.605 "name": "raid_bdev1", 00:27:18.605 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:18.605 "strip_size_kb": 0, 00:27:18.605 "state": "online", 00:27:18.605 "raid_level": "raid1", 00:27:18.605 "superblock": true, 00:27:18.605 "num_base_bdevs": 4, 00:27:18.605 "num_base_bdevs_discovered": 4, 00:27:18.605 "num_base_bdevs_operational": 4, 00:27:18.605 "process": { 00:27:18.605 "type": "rebuild", 00:27:18.605 "target": "spare", 00:27:18.605 "progress": { 00:27:18.605 "blocks": 24576, 00:27:18.605 "percent": 38 00:27:18.605 } 00:27:18.605 }, 00:27:18.605 "base_bdevs_list": [ 00:27:18.605 { 00:27:18.605 "name": "spare", 00:27:18.605 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:18.605 "is_configured": true, 00:27:18.605 "data_offset": 2048, 00:27:18.605 "data_size": 63488 00:27:18.605 }, 00:27:18.605 { 00:27:18.605 "name": "BaseBdev2", 00:27:18.605 "uuid": "ada61370-4bc5-5b21-a0de-db765310285a", 00:27:18.605 "is_configured": true, 00:27:18.605 "data_offset": 2048, 00:27:18.605 "data_size": 63488 00:27:18.605 }, 00:27:18.605 { 00:27:18.605 "name": "BaseBdev3", 00:27:18.605 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:18.605 "is_configured": true, 00:27:18.605 "data_offset": 2048, 00:27:18.606 "data_size": 63488 00:27:18.606 }, 00:27:18.606 { 00:27:18.606 "name": "BaseBdev4", 00:27:18.606 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:18.606 "is_configured": true, 00:27:18.606 "data_offset": 2048, 00:27:18.606 "data_size": 63488 00:27:18.606 } 00:27:18.606 ] 00:27:18.606 }' 00:27:18.606 06:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:27:18.606 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:27:18.606 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:18.864 
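The progress checks in this stretch poll the raid bdev's JSON state over RPC. A minimal version of that polling, using the same RPC and jq filters that appear in the trace (variable names and the echo form are illustrative):
# Attach the spare as a new base bdev; the raid module starts a rebuild onto it
scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
# Pull the raid bdev's state and inspect the background process fields
info=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")')
echo "$info" | jq -r '.process.type // "none"'      # expected: rebuild
echo "$info" | jq -r '.process.target // "none"'    # expected: spare
echo "$info" | jq -r '.process.progress.percent'    # 38, 58, 67 ... at the points sampled above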
[2024-07-25 06:43:32.268518] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:19.123 [2024-07-25 06:43:32.427434] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x148ba70 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.123 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:19.382 "name": "raid_bdev1", 00:27:19.382 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:19.382 "strip_size_kb": 0, 00:27:19.382 "state": "online", 00:27:19.382 "raid_level": "raid1", 00:27:19.382 "superblock": true, 00:27:19.382 "num_base_bdevs": 4, 00:27:19.382 "num_base_bdevs_discovered": 3, 00:27:19.382 "num_base_bdevs_operational": 3, 00:27:19.382 "process": { 00:27:19.382 "type": "rebuild", 00:27:19.382 "target": "spare", 00:27:19.382 "progress": { 00:27:19.382 "blocks": 36864, 00:27:19.382 "percent": 58 00:27:19.382 } 00:27:19.382 }, 00:27:19.382 "base_bdevs_list": [ 00:27:19.382 { 00:27:19.382 "name": "spare", 00:27:19.382 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:19.382 "is_configured": true, 00:27:19.382 "data_offset": 2048, 00:27:19.382 "data_size": 63488 00:27:19.382 }, 00:27:19.382 { 00:27:19.382 "name": null, 00:27:19.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.382 "is_configured": false, 00:27:19.382 "data_offset": 2048, 00:27:19.382 "data_size": 63488 00:27:19.382 }, 00:27:19.382 { 00:27:19.382 "name": "BaseBdev3", 00:27:19.382 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:19.382 "is_configured": true, 00:27:19.382 "data_offset": 2048, 00:27:19.382 "data_size": 63488 00:27:19.382 }, 00:27:19.382 { 00:27:19.382 "name": "BaseBdev4", 00:27:19.382 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:19.382 "is_configured": true, 00:27:19.382 "data_offset": 2048, 00:27:19.382 "data_size": 63488 00:27:19.382 } 00:27:19.382 ] 00:27:19.382 }' 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=861 00:27:19.382 06:43:32 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.382 06:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.640 06:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:19.640 "name": "raid_bdev1", 00:27:19.640 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:19.640 "strip_size_kb": 0, 00:27:19.640 "state": "online", 00:27:19.640 "raid_level": "raid1", 00:27:19.640 "superblock": true, 00:27:19.640 "num_base_bdevs": 4, 00:27:19.640 "num_base_bdevs_discovered": 3, 00:27:19.640 "num_base_bdevs_operational": 3, 00:27:19.640 "process": { 00:27:19.640 "type": "rebuild", 00:27:19.640 "target": "spare", 00:27:19.640 "progress": { 00:27:19.640 "blocks": 43008, 00:27:19.640 "percent": 67 00:27:19.640 } 00:27:19.640 }, 00:27:19.640 "base_bdevs_list": [ 00:27:19.640 { 00:27:19.640 "name": "spare", 00:27:19.640 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:19.640 "is_configured": true, 00:27:19.640 "data_offset": 2048, 00:27:19.640 "data_size": 63488 00:27:19.640 }, 00:27:19.640 { 00:27:19.640 "name": null, 00:27:19.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.640 "is_configured": false, 00:27:19.640 "data_offset": 2048, 00:27:19.640 "data_size": 63488 00:27:19.640 }, 00:27:19.640 { 00:27:19.640 "name": "BaseBdev3", 00:27:19.640 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:19.640 "is_configured": true, 00:27:19.640 "data_offset": 2048, 00:27:19.640 "data_size": 63488 00:27:19.640 }, 00:27:19.640 { 00:27:19.640 "name": "BaseBdev4", 00:27:19.640 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:19.640 "is_configured": true, 00:27:19.640 "data_offset": 2048, 00:27:19.640 "data_size": 63488 00:27:19.640 } 00:27:19.640 ] 00:27:19.640 }' 00:27:19.640 06:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:19.640 06:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:19.640 06:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:19.640 06:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:19.640 06:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:20.574 [2024-07-25 06:43:33.938369] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:20.574 [2024-07-25 06:43:33.938421] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:20.574 [2024-07-25 06:43:33.938508] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:20.574 06:43:34 
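The wait loop traced around this point (bdev_raid.sh@721-726: local timeout=861, (( SECONDS < timeout )), verify_raid_bdev_process, sleep 1) polls bdev_raid_get_bdevs once per second until the "process" object disappears from raid_bdev1, i.e. until .process.type no longer reports "rebuild". Roughly, as a sketch reconstructed from the xtrace (the real helpers live in test/bdev/bdev_raid.sh and differ in detail; the paths and the 861-second timeout are the values captured in this log):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    wait_for_rebuild() {
        local raid_bdev=$1 timeout=$2 info
        while (( SECONDS < timeout )); do
            info=$($rpc_py bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$raid_bdev\")")
            # While the rebuild runs, .process.type is "rebuild" and .process.target is "spare";
            # once it finishes, the "process" object is gone and both default to "none".
            [[ $(jq -r '.process.type // "none"' <<< "$info") == rebuild ]] || break
            sleep 1
        done
    }

    wait_for_rebuild raid_bdev1 861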
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.574 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.833 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:20.833 "name": "raid_bdev1", 00:27:20.833 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:20.833 "strip_size_kb": 0, 00:27:20.833 "state": "online", 00:27:20.833 "raid_level": "raid1", 00:27:20.833 "superblock": true, 00:27:20.833 "num_base_bdevs": 4, 00:27:20.833 "num_base_bdevs_discovered": 3, 00:27:20.833 "num_base_bdevs_operational": 3, 00:27:20.833 "base_bdevs_list": [ 00:27:20.833 { 00:27:20.833 "name": "spare", 00:27:20.833 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:20.833 "is_configured": true, 00:27:20.833 "data_offset": 2048, 00:27:20.833 "data_size": 63488 00:27:20.833 }, 00:27:20.833 { 00:27:20.833 "name": null, 00:27:20.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.833 "is_configured": false, 00:27:20.833 "data_offset": 2048, 00:27:20.833 "data_size": 63488 00:27:20.833 }, 00:27:20.833 { 00:27:20.833 "name": "BaseBdev3", 00:27:20.833 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:20.833 "is_configured": true, 00:27:20.833 "data_offset": 2048, 00:27:20.833 "data_size": 63488 00:27:20.833 }, 00:27:20.833 { 00:27:20.833 "name": "BaseBdev4", 00:27:20.833 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:20.833 "is_configured": true, 00:27:20.833 "data_offset": 2048, 00:27:20.833 "data_size": 63488 00:27:20.833 } 00:27:20.833 ] 00:27:20.833 }' 00:27:20.833 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:20.833 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:20.833 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.091 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:21.092 "name": "raid_bdev1", 00:27:21.092 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:21.092 "strip_size_kb": 0, 00:27:21.092 "state": "online", 00:27:21.092 "raid_level": "raid1", 00:27:21.092 "superblock": true, 00:27:21.092 "num_base_bdevs": 4, 00:27:21.092 "num_base_bdevs_discovered": 3, 00:27:21.092 "num_base_bdevs_operational": 3, 00:27:21.092 "base_bdevs_list": [ 00:27:21.092 { 00:27:21.092 "name": "spare", 00:27:21.092 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:21.092 "is_configured": true, 00:27:21.092 "data_offset": 2048, 00:27:21.092 "data_size": 63488 00:27:21.092 }, 00:27:21.092 { 00:27:21.092 "name": null, 00:27:21.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.092 "is_configured": false, 00:27:21.092 "data_offset": 2048, 00:27:21.092 "data_size": 63488 00:27:21.092 }, 00:27:21.092 { 00:27:21.092 "name": "BaseBdev3", 00:27:21.092 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:21.092 "is_configured": true, 00:27:21.092 "data_offset": 2048, 00:27:21.092 "data_size": 63488 00:27:21.092 }, 00:27:21.092 { 00:27:21.092 "name": "BaseBdev4", 00:27:21.092 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:21.092 "is_configured": true, 00:27:21.092 "data_offset": 2048, 00:27:21.092 "data_size": 63488 00:27:21.092 } 00:27:21.092 ] 00:27:21.092 }' 00:27:21.092 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.350 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:27:21.608 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.608 "name": "raid_bdev1", 00:27:21.608 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:21.608 "strip_size_kb": 0, 00:27:21.608 "state": "online", 00:27:21.608 "raid_level": "raid1", 00:27:21.608 "superblock": true, 00:27:21.608 "num_base_bdevs": 4, 00:27:21.608 "num_base_bdevs_discovered": 3, 00:27:21.608 "num_base_bdevs_operational": 3, 00:27:21.608 "base_bdevs_list": [ 00:27:21.608 { 00:27:21.608 "name": "spare", 00:27:21.608 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:21.608 "is_configured": true, 00:27:21.608 "data_offset": 2048, 00:27:21.608 "data_size": 63488 00:27:21.608 }, 00:27:21.608 { 00:27:21.608 "name": null, 00:27:21.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.608 "is_configured": false, 00:27:21.608 "data_offset": 2048, 00:27:21.608 "data_size": 63488 00:27:21.608 }, 00:27:21.608 { 00:27:21.608 "name": "BaseBdev3", 00:27:21.608 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:21.608 "is_configured": true, 00:27:21.608 "data_offset": 2048, 00:27:21.608 "data_size": 63488 00:27:21.608 }, 00:27:21.608 { 00:27:21.608 "name": "BaseBdev4", 00:27:21.608 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:21.608 "is_configured": true, 00:27:21.608 "data_offset": 2048, 00:27:21.608 "data_size": 63488 00:27:21.608 } 00:27:21.608 ] 00:27:21.608 }' 00:27:21.608 06:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.608 06:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:22.176 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:22.176 [2024-07-25 06:43:35.687217] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:22.176 [2024-07-25 06:43:35.687242] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:22.176 [2024-07-25 06:43:35.687294] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:22.176 [2024-07-25 06:43:35.687357] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:22.176 [2024-07-25 06:43:35.687368] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x148e190 name raid_bdev1, state offline 00:27:22.176 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.176 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@10 -- # local bdev_list 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:22.435 06:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:22.694 /dev/nbd0 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:22.694 1+0 records in 00:27:22.694 1+0 records out 00:27:22.694 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024662 s, 16.6 MB/s 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:22.694 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:22.953 /dev/nbd1 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:22.953 06:43:36 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:22.953 1+0 records in 00:27:22.953 1+0 records out 00:27:22.953 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280781 s, 14.6 MB/s 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:22.953 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.243 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@41 -- # break 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.502 06:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:23.502 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:23.761 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:24.083 [2024-07-25 06:43:37.502411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:24.084 [2024-07-25 06:43:37.502452] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:24.084 [2024-07-25 06:43:37.502471] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14936e0 00:27:24.084 [2024-07-25 06:43:37.502482] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:24.084 [2024-07-25 06:43:37.503958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:24.084 [2024-07-25 06:43:37.503985] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:24.084 [2024-07-25 06:43:37.504058] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:24.084 [2024-07-25 06:43:37.504081] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.084 [2024-07-25 06:43:37.504178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:24.084 [2024-07-25 06:43:37.504243] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:24.084 spare 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local 
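The data-integrity check just completed exported BaseBdev1 and the rebuilt spare over NBD and compared them with cmp -i 1048576 /dev/nbd0 /dev/nbd1, which skips the first 1048576 bytes of both devices. That offset is the superblock/metadata region: every base bdev reports data_offset 2048 blocks, and with the 512-byte block length reported for raid_bdev1 elsewhere in this log, 2048 * 512 = 1048576 bytes (1 MiB), so only the user-data area behind the superblock has to match. As a standalone command (device paths as in the trace):

    offset=$((2048 * 512))                          # 1048576 bytes, matching "cmp -i 1048576" above
    cmp -i "$offset" /dev/nbd0 /dev/nbd1 && echo "data regions match"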
strip_size=0 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.084 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.084 [2024-07-25 06:43:37.604550] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1491cd0 00:27:24.084 [2024-07-25 06:43:37.604567] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:24.084 [2024-07-25 06:43:37.604738] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x163acf0 00:27:24.084 [2024-07-25 06:43:37.604874] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1491cd0 00:27:24.084 [2024-07-25 06:43:37.604883] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1491cd0 00:27:24.084 [2024-07-25 06:43:37.604978] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:24.343 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.343 "name": "raid_bdev1", 00:27:24.343 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:24.343 "strip_size_kb": 0, 00:27:24.343 "state": "online", 00:27:24.343 "raid_level": "raid1", 00:27:24.343 "superblock": true, 00:27:24.343 "num_base_bdevs": 4, 00:27:24.343 "num_base_bdevs_discovered": 3, 00:27:24.343 "num_base_bdevs_operational": 3, 00:27:24.343 "base_bdevs_list": [ 00:27:24.343 { 00:27:24.343 "name": "spare", 00:27:24.343 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:24.343 "is_configured": true, 00:27:24.343 "data_offset": 2048, 00:27:24.343 "data_size": 63488 00:27:24.343 }, 00:27:24.343 { 00:27:24.343 "name": null, 00:27:24.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.343 "is_configured": false, 00:27:24.343 "data_offset": 2048, 00:27:24.343 "data_size": 63488 00:27:24.343 }, 00:27:24.343 { 00:27:24.343 "name": "BaseBdev3", 00:27:24.343 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:24.343 "is_configured": true, 00:27:24.343 "data_offset": 2048, 00:27:24.343 "data_size": 63488 00:27:24.343 }, 00:27:24.343 { 00:27:24.343 "name": "BaseBdev4", 00:27:24.343 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:24.343 "is_configured": true, 00:27:24.343 "data_offset": 2048, 00:27:24.343 "data_size": 63488 00:27:24.343 } 00:27:24.343 ] 00:27:24.343 }' 00:27:24.343 06:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.343 06:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.910 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.169 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:25.169 "name": "raid_bdev1", 00:27:25.169 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:25.169 "strip_size_kb": 0, 00:27:25.169 "state": "online", 00:27:25.169 "raid_level": "raid1", 00:27:25.169 "superblock": true, 00:27:25.169 "num_base_bdevs": 4, 00:27:25.169 "num_base_bdevs_discovered": 3, 00:27:25.169 "num_base_bdevs_operational": 3, 00:27:25.169 "base_bdevs_list": [ 00:27:25.169 { 00:27:25.169 "name": "spare", 00:27:25.169 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:25.169 "is_configured": true, 00:27:25.170 "data_offset": 2048, 00:27:25.170 "data_size": 63488 00:27:25.170 }, 00:27:25.170 { 00:27:25.170 "name": null, 00:27:25.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.170 "is_configured": false, 00:27:25.170 "data_offset": 2048, 00:27:25.170 "data_size": 63488 00:27:25.170 }, 00:27:25.170 { 00:27:25.170 "name": "BaseBdev3", 00:27:25.170 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:25.170 "is_configured": true, 00:27:25.170 "data_offset": 2048, 00:27:25.170 "data_size": 63488 00:27:25.170 }, 00:27:25.170 { 00:27:25.170 "name": "BaseBdev4", 00:27:25.170 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:25.170 "is_configured": true, 00:27:25.170 "data_offset": 2048, 00:27:25.170 "data_size": 63488 00:27:25.170 } 00:27:25.170 ] 00:27:25.170 }' 00:27:25.170 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:25.170 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:25.170 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:25.170 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:25.170 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.170 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:25.428 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:27:25.428 06:43:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:25.687 [2024-07-25 06:43:39.038630] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:25.687 
06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.687 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.946 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:25.946 "name": "raid_bdev1", 00:27:25.946 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:25.946 "strip_size_kb": 0, 00:27:25.946 "state": "online", 00:27:25.946 "raid_level": "raid1", 00:27:25.946 "superblock": true, 00:27:25.946 "num_base_bdevs": 4, 00:27:25.946 "num_base_bdevs_discovered": 2, 00:27:25.946 "num_base_bdevs_operational": 2, 00:27:25.946 "base_bdevs_list": [ 00:27:25.946 { 00:27:25.946 "name": null, 00:27:25.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.946 "is_configured": false, 00:27:25.946 "data_offset": 2048, 00:27:25.946 "data_size": 63488 00:27:25.946 }, 00:27:25.946 { 00:27:25.946 "name": null, 00:27:25.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.946 "is_configured": false, 00:27:25.946 "data_offset": 2048, 00:27:25.946 "data_size": 63488 00:27:25.946 }, 00:27:25.946 { 00:27:25.946 "name": "BaseBdev3", 00:27:25.946 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:25.946 "is_configured": true, 00:27:25.946 "data_offset": 2048, 00:27:25.946 "data_size": 63488 00:27:25.946 }, 00:27:25.946 { 00:27:25.946 "name": "BaseBdev4", 00:27:25.946 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:25.946 "is_configured": true, 00:27:25.946 "data_offset": 2048, 00:27:25.946 "data_size": 63488 00:27:25.946 } 00:27:25.946 ] 00:27:25.946 }' 00:27:25.946 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:25.946 06:43:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:26.512 06:43:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:26.512 [2024-07-25 06:43:40.061376] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:26.512 [2024-07-25 06:43:40.061512] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:26.512 [2024-07-25 06:43:40.061527] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
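The add-back of the spare traced around this point works through superblock examination: when the previously removed spare is handed back to the array with bdev_raid_add_base_bdev, SPDK reads the spare's on-disk RAID superblock, sees that its sequence number (5) is older than the one on raid_bdev1 (6), re-adds the bdev instead of rejecting it, and starts a fresh rebuild onto it. As standalone commands the cycle looks roughly like this (sock path and bdev names taken from the log; an illustrative sketch, not the test script itself):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    $rpc_py bdev_raid_remove_base_bdev spare             # drop the spare; the array keeps running degraded
    $rpc_py bdev_raid_add_base_bdev raid_bdev1 spare     # superblock seq 5 < 6, so it is re-added and rebuilt
    $rpc_py bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.target // "none"'   # "spare" while rebuilding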
00:27:26.512 [2024-07-25 06:43:40.061555] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:26.512 [2024-07-25 06:43:40.065197] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1639520 00:27:26.512 [2024-07-25 06:43:40.066431] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:26.771 06:43:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.706 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.965 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:27.965 "name": "raid_bdev1", 00:27:27.965 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:27.965 "strip_size_kb": 0, 00:27:27.965 "state": "online", 00:27:27.965 "raid_level": "raid1", 00:27:27.965 "superblock": true, 00:27:27.965 "num_base_bdevs": 4, 00:27:27.965 "num_base_bdevs_discovered": 3, 00:27:27.965 "num_base_bdevs_operational": 3, 00:27:27.965 "process": { 00:27:27.965 "type": "rebuild", 00:27:27.965 "target": "spare", 00:27:27.965 "progress": { 00:27:27.965 "blocks": 24576, 00:27:27.965 "percent": 38 00:27:27.965 } 00:27:27.965 }, 00:27:27.965 "base_bdevs_list": [ 00:27:27.965 { 00:27:27.965 "name": "spare", 00:27:27.965 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:27.965 "is_configured": true, 00:27:27.965 "data_offset": 2048, 00:27:27.965 "data_size": 63488 00:27:27.965 }, 00:27:27.965 { 00:27:27.965 "name": null, 00:27:27.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.965 "is_configured": false, 00:27:27.965 "data_offset": 2048, 00:27:27.965 "data_size": 63488 00:27:27.965 }, 00:27:27.965 { 00:27:27.965 "name": "BaseBdev3", 00:27:27.965 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:27.965 "is_configured": true, 00:27:27.965 "data_offset": 2048, 00:27:27.965 "data_size": 63488 00:27:27.965 }, 00:27:27.965 { 00:27:27.965 "name": "BaseBdev4", 00:27:27.965 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:27.965 "is_configured": true, 00:27:27.965 "data_offset": 2048, 00:27:27.965 "data_size": 63488 00:27:27.965 } 00:27:27.965 ] 00:27:27.965 }' 00:27:27.965 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.965 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:27.965 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:27.965 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:27.965 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:28.224 [2024-07-25 06:43:41.611409] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:28.224 [2024-07-25 06:43:41.678073] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:28.224 [2024-07-25 06:43:41.678119] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:28.224 [2024-07-25 06:43:41.678133] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:28.224 [2024-07-25 06:43:41.678147] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.224 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.482 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:28.482 "name": "raid_bdev1", 00:27:28.482 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:28.482 "strip_size_kb": 0, 00:27:28.482 "state": "online", 00:27:28.482 "raid_level": "raid1", 00:27:28.482 "superblock": true, 00:27:28.482 "num_base_bdevs": 4, 00:27:28.482 "num_base_bdevs_discovered": 2, 00:27:28.482 "num_base_bdevs_operational": 2, 00:27:28.482 "base_bdevs_list": [ 00:27:28.482 { 00:27:28.482 "name": null, 00:27:28.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.482 "is_configured": false, 00:27:28.482 "data_offset": 2048, 00:27:28.482 "data_size": 63488 00:27:28.482 }, 00:27:28.482 { 00:27:28.482 "name": null, 00:27:28.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.482 "is_configured": false, 00:27:28.482 "data_offset": 2048, 00:27:28.482 "data_size": 63488 00:27:28.482 }, 00:27:28.482 { 00:27:28.482 "name": "BaseBdev3", 00:27:28.482 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:28.482 "is_configured": true, 00:27:28.482 "data_offset": 2048, 00:27:28.482 "data_size": 63488 00:27:28.482 }, 00:27:28.482 { 00:27:28.482 "name": "BaseBdev4", 00:27:28.482 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:28.482 "is_configured": true, 00:27:28.482 "data_offset": 2048, 00:27:28.482 "data_size": 63488 
00:27:28.482 } 00:27:28.482 ] 00:27:28.482 }' 00:27:28.482 06:43:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:28.482 06:43:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:29.049 06:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:29.308 [2024-07-25 06:43:42.676683] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:29.308 [2024-07-25 06:43:42.676729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:29.308 [2024-07-25 06:43:42.676748] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148f650 00:27:29.308 [2024-07-25 06:43:42.676760] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:29.308 [2024-07-25 06:43:42.677097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:29.308 [2024-07-25 06:43:42.677113] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:29.308 [2024-07-25 06:43:42.677193] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:29.308 [2024-07-25 06:43:42.677205] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:29.308 [2024-07-25 06:43:42.677214] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:29.308 [2024-07-25 06:43:42.677238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:29.308 [2024-07-25 06:43:42.680989] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14936b0 00:27:29.308 spare 00:27:29.308 [2024-07-25 06:43:42.682239] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:29.308 06:43:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.243 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.501 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:30.501 "name": "raid_bdev1", 00:27:30.501 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:30.501 "strip_size_kb": 0, 00:27:30.501 "state": "online", 00:27:30.501 "raid_level": "raid1", 00:27:30.501 "superblock": true, 00:27:30.501 "num_base_bdevs": 4, 00:27:30.501 "num_base_bdevs_discovered": 3, 00:27:30.501 "num_base_bdevs_operational": 3, 00:27:30.501 "process": { 00:27:30.501 "type": "rebuild", 00:27:30.501 "target": 
"spare", 00:27:30.502 "progress": { 00:27:30.502 "blocks": 24576, 00:27:30.502 "percent": 38 00:27:30.502 } 00:27:30.502 }, 00:27:30.502 "base_bdevs_list": [ 00:27:30.502 { 00:27:30.502 "name": "spare", 00:27:30.502 "uuid": "6a1fd8d2-da0d-545a-880d-eb1e04bc132e", 00:27:30.502 "is_configured": true, 00:27:30.502 "data_offset": 2048, 00:27:30.502 "data_size": 63488 00:27:30.502 }, 00:27:30.502 { 00:27:30.502 "name": null, 00:27:30.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.502 "is_configured": false, 00:27:30.502 "data_offset": 2048, 00:27:30.502 "data_size": 63488 00:27:30.502 }, 00:27:30.502 { 00:27:30.502 "name": "BaseBdev3", 00:27:30.502 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:30.502 "is_configured": true, 00:27:30.502 "data_offset": 2048, 00:27:30.502 "data_size": 63488 00:27:30.502 }, 00:27:30.502 { 00:27:30.502 "name": "BaseBdev4", 00:27:30.502 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:30.502 "is_configured": true, 00:27:30.502 "data_offset": 2048, 00:27:30.502 "data_size": 63488 00:27:30.502 } 00:27:30.502 ] 00:27:30.502 }' 00:27:30.502 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:30.502 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:30.502 06:43:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:30.502 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:30.502 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:30.761 [2024-07-25 06:43:44.233918] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:30.761 [2024-07-25 06:43:44.293827] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:30.761 [2024-07-25 06:43:44.293872] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:30.761 [2024-07-25 06:43:44.293887] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:30.761 [2024-07-25 06:43:44.293894] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.019 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.019 "name": "raid_bdev1", 00:27:31.019 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:31.019 "strip_size_kb": 0, 00:27:31.019 "state": "online", 00:27:31.019 "raid_level": "raid1", 00:27:31.019 "superblock": true, 00:27:31.019 "num_base_bdevs": 4, 00:27:31.019 "num_base_bdevs_discovered": 2, 00:27:31.019 "num_base_bdevs_operational": 2, 00:27:31.019 "base_bdevs_list": [ 00:27:31.019 { 00:27:31.019 "name": null, 00:27:31.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.019 "is_configured": false, 00:27:31.019 "data_offset": 2048, 00:27:31.019 "data_size": 63488 00:27:31.019 }, 00:27:31.019 { 00:27:31.019 "name": null, 00:27:31.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.019 "is_configured": false, 00:27:31.019 "data_offset": 2048, 00:27:31.019 "data_size": 63488 00:27:31.019 }, 00:27:31.019 { 00:27:31.019 "name": "BaseBdev3", 00:27:31.019 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:31.019 "is_configured": true, 00:27:31.020 "data_offset": 2048, 00:27:31.020 "data_size": 63488 00:27:31.020 }, 00:27:31.020 { 00:27:31.020 "name": "BaseBdev4", 00:27:31.020 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:31.020 "is_configured": true, 00:27:31.020 "data_offset": 2048, 00:27:31.020 "data_size": 63488 00:27:31.020 } 00:27:31.020 ] 00:27:31.020 }' 00:27:31.020 06:43:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.020 06:43:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.587 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.846 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.846 "name": "raid_bdev1", 00:27:31.846 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:31.846 "strip_size_kb": 0, 00:27:31.846 "state": "online", 00:27:31.846 "raid_level": "raid1", 00:27:31.846 "superblock": true, 00:27:31.846 "num_base_bdevs": 4, 00:27:31.846 "num_base_bdevs_discovered": 2, 00:27:31.846 "num_base_bdevs_operational": 2, 00:27:31.846 "base_bdevs_list": [ 00:27:31.846 { 00:27:31.846 "name": null, 00:27:31.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.846 "is_configured": false, 00:27:31.846 "data_offset": 2048, 00:27:31.846 "data_size": 63488 00:27:31.846 }, 00:27:31.846 { 00:27:31.846 "name": null, 00:27:31.846 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:31.846 "is_configured": false, 00:27:31.846 "data_offset": 2048, 00:27:31.846 "data_size": 63488 00:27:31.846 }, 00:27:31.846 { 00:27:31.846 "name": "BaseBdev3", 00:27:31.847 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:31.847 "is_configured": true, 00:27:31.847 "data_offset": 2048, 00:27:31.847 "data_size": 63488 00:27:31.847 }, 00:27:31.847 { 00:27:31.847 "name": "BaseBdev4", 00:27:31.847 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:31.847 "is_configured": true, 00:27:31.847 "data_offset": 2048, 00:27:31.847 "data_size": 63488 00:27:31.847 } 00:27:31.847 ] 00:27:31.847 }' 00:27:31.847 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.105 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:32.105 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.105 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:32.105 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:32.364 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:32.622 [2024-07-25 06:43:45.933871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:32.622 [2024-07-25 06:43:45.933913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.622 [2024-07-25 06:43:45.933931] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148d620 00:27:32.622 [2024-07-25 06:43:45.933943] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.622 [2024-07-25 06:43:45.934262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.622 [2024-07-25 06:43:45.934279] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:32.622 [2024-07-25 06:43:45.934336] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:32.622 [2024-07-25 06:43:45.934346] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:32.622 [2024-07-25 06:43:45.934356] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:32.622 BaseBdev1 00:27:32.622 06:43:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:33.557 06:43:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.557 06:43:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.815 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.815 "name": "raid_bdev1", 00:27:33.815 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:33.815 "strip_size_kb": 0, 00:27:33.815 "state": "online", 00:27:33.815 "raid_level": "raid1", 00:27:33.815 "superblock": true, 00:27:33.815 "num_base_bdevs": 4, 00:27:33.815 "num_base_bdevs_discovered": 2, 00:27:33.815 "num_base_bdevs_operational": 2, 00:27:33.815 "base_bdevs_list": [ 00:27:33.815 { 00:27:33.815 "name": null, 00:27:33.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:33.815 "is_configured": false, 00:27:33.815 "data_offset": 2048, 00:27:33.815 "data_size": 63488 00:27:33.815 }, 00:27:33.815 { 00:27:33.815 "name": null, 00:27:33.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:33.815 "is_configured": false, 00:27:33.815 "data_offset": 2048, 00:27:33.815 "data_size": 63488 00:27:33.815 }, 00:27:33.815 { 00:27:33.815 "name": "BaseBdev3", 00:27:33.815 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:33.815 "is_configured": true, 00:27:33.815 "data_offset": 2048, 00:27:33.815 "data_size": 63488 00:27:33.815 }, 00:27:33.815 { 00:27:33.815 "name": "BaseBdev4", 00:27:33.815 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:33.815 "is_configured": true, 00:27:33.815 "data_offset": 2048, 00:27:33.815 "data_size": 63488 00:27:33.815 } 00:27:33.815 ] 00:27:33.815 }' 00:27:33.815 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.815 06:43:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.381 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.639 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:34.639 "name": "raid_bdev1", 00:27:34.639 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:34.639 "strip_size_kb": 0, 00:27:34.639 "state": "online", 00:27:34.639 "raid_level": "raid1", 00:27:34.639 "superblock": true, 
00:27:34.639 "num_base_bdevs": 4, 00:27:34.639 "num_base_bdevs_discovered": 2, 00:27:34.639 "num_base_bdevs_operational": 2, 00:27:34.639 "base_bdevs_list": [ 00:27:34.639 { 00:27:34.639 "name": null, 00:27:34.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.639 "is_configured": false, 00:27:34.639 "data_offset": 2048, 00:27:34.639 "data_size": 63488 00:27:34.639 }, 00:27:34.639 { 00:27:34.639 "name": null, 00:27:34.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.639 "is_configured": false, 00:27:34.639 "data_offset": 2048, 00:27:34.639 "data_size": 63488 00:27:34.639 }, 00:27:34.639 { 00:27:34.639 "name": "BaseBdev3", 00:27:34.639 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:34.639 "is_configured": true, 00:27:34.639 "data_offset": 2048, 00:27:34.639 "data_size": 63488 00:27:34.639 }, 00:27:34.639 { 00:27:34.639 "name": "BaseBdev4", 00:27:34.639 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:34.639 "is_configured": true, 00:27:34.639 "data_offset": 2048, 00:27:34.639 "data_size": 63488 00:27:34.639 } 00:27:34.639 ] 00:27:34.639 }' 00:27:34.639 06:43:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:34.639 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:34.897 [2024-07-25 06:43:48.284085] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:34.897 [2024-07-25 06:43:48.284197] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:34.897 [2024-07-25 06:43:48.284212] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:34.897 request: 00:27:34.897 { 00:27:34.897 "base_bdev": "BaseBdev1", 00:27:34.897 "raid_bdev": "raid_bdev1", 00:27:34.897 "method": "bdev_raid_add_base_bdev", 00:27:34.897 "req_id": 1 00:27:34.897 } 00:27:34.897 Got JSON-RPC error response 00:27:34.897 response: 00:27:34.897 { 00:27:34.897 "code": -22, 00:27:34.897 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:34.897 } 00:27:34.897 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:27:34.897 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:34.897 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:34.897 06:43:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:34.897 06:43:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.829 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.088 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.088 "name": "raid_bdev1", 00:27:36.088 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:36.088 "strip_size_kb": 0, 00:27:36.088 "state": "online", 00:27:36.088 "raid_level": "raid1", 00:27:36.088 "superblock": true, 00:27:36.088 "num_base_bdevs": 4, 00:27:36.088 "num_base_bdevs_discovered": 2, 00:27:36.088 "num_base_bdevs_operational": 2, 00:27:36.088 "base_bdevs_list": [ 00:27:36.088 { 00:27:36.088 "name": null, 00:27:36.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.088 "is_configured": false, 00:27:36.088 "data_offset": 2048, 00:27:36.088 "data_size": 63488 00:27:36.088 }, 00:27:36.088 { 00:27:36.088 "name": null, 00:27:36.088 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:36.088 "is_configured": false, 00:27:36.088 "data_offset": 2048, 00:27:36.088 "data_size": 63488 00:27:36.088 }, 00:27:36.088 { 00:27:36.088 "name": "BaseBdev3", 00:27:36.088 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:36.088 "is_configured": true, 00:27:36.088 "data_offset": 2048, 00:27:36.088 "data_size": 63488 00:27:36.088 }, 00:27:36.088 { 00:27:36.088 "name": "BaseBdev4", 00:27:36.088 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:36.088 "is_configured": true, 00:27:36.088 "data_offset": 2048, 00:27:36.088 "data_size": 63488 00:27:36.088 } 00:27:36.088 ] 00:27:36.088 }' 00:27:36.088 06:43:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.088 06:43:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.652 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:36.910 "name": "raid_bdev1", 00:27:36.910 "uuid": "933a2b42-2bad-4323-a65e-23c72e59b0d5", 00:27:36.910 "strip_size_kb": 0, 00:27:36.910 "state": "online", 00:27:36.910 "raid_level": "raid1", 00:27:36.910 "superblock": true, 00:27:36.910 "num_base_bdevs": 4, 00:27:36.910 "num_base_bdevs_discovered": 2, 00:27:36.910 "num_base_bdevs_operational": 2, 00:27:36.910 "base_bdevs_list": [ 00:27:36.910 { 00:27:36.910 "name": null, 00:27:36.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.910 "is_configured": false, 00:27:36.910 "data_offset": 2048, 00:27:36.910 "data_size": 63488 00:27:36.910 }, 00:27:36.910 { 00:27:36.910 "name": null, 00:27:36.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.910 "is_configured": false, 00:27:36.910 "data_offset": 2048, 00:27:36.910 "data_size": 63488 00:27:36.910 }, 00:27:36.910 { 00:27:36.910 "name": "BaseBdev3", 00:27:36.910 "uuid": "246ca9f0-992a-5748-84fb-261d5acd9506", 00:27:36.910 "is_configured": true, 00:27:36.910 "data_offset": 2048, 00:27:36.910 "data_size": 63488 00:27:36.910 }, 00:27:36.910 { 00:27:36.910 "name": "BaseBdev4", 00:27:36.910 "uuid": "0a24bb57-fb85-5de2-a690-dd4d36c7c36c", 00:27:36.910 "is_configured": true, 00:27:36.910 "data_offset": 2048, 00:27:36.910 "data_size": 63488 00:27:36.910 } 00:27:36.910 ] 00:27:36.910 }' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1244194 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1244194 ']' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1244194 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1244194 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1244194' 00:27:36.910 killing process with pid 1244194 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1244194 00:27:36.910 Received shutdown signal, test time was about 60.000000 seconds 00:27:36.910 00:27:36.910 Latency(us) 00:27:36.910 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:36.910 =================================================================================================================== 00:27:36.910 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:36.910 [2024-07-25 06:43:50.349948] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:36.910 [2024-07-25 06:43:50.350035] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:36.910 [2024-07-25 06:43:50.350089] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:36.910 [2024-07-25 06:43:50.350101] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1491cd0 name raid_bdev1, state offline 00:27:36.910 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1244194 00:27:36.910 [2024-07-25 06:43:50.389658] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:27:37.169 00:27:37.169 real 0m36.312s 00:27:37.169 user 0m52.458s 00:27:37.169 sys 0m6.383s 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:37.169 ************************************ 00:27:37.169 END TEST raid_rebuild_test_sb 00:27:37.169 ************************************ 00:27:37.169 06:43:50 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:27:37.169 06:43:50 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:27:37.169 06:43:50 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:37.169 06:43:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:37.169 ************************************ 00:27:37.169 START TEST raid_rebuild_test_io 00:27:37.169 ************************************ 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false true true 
00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1250587 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1250587 /var/tmp/spdk-raid.sock 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@831 -- # '[' -z 1250587 ']' 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:37.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:37.169 06:43:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:37.169 [2024-07-25 06:43:50.714024] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:27:37.169 [2024-07-25 06:43:50.714080] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1250587 ] 00:27:37.169 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:37.169 Zero copy mechanism will not be used. 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.4 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: 
Requested device 0000:3d:02.7 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.3 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:37.489 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:37.489 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:37.489 [2024-07-25 06:43:50.850274] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:37.489 [2024-07-25 06:43:50.894328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:37.489 [2024-07-25 06:43:50.956489] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:37.489 [2024-07-25 06:43:50.956525] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:38.057 06:43:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:38.057 06:43:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:27:38.057 06:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:38.057 06:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:38.315 BaseBdev1_malloc 00:27:38.315 06:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:38.574 
[2024-07-25 06:43:52.026782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:38.574 [2024-07-25 06:43:52.026826] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:38.574 [2024-07-25 06:43:52.026848] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215c7b0 00:27:38.574 [2024-07-25 06:43:52.026859] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:38.574 [2024-07-25 06:43:52.028443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:38.574 [2024-07-25 06:43:52.028471] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:38.574 BaseBdev1 00:27:38.574 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:38.574 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:38.833 BaseBdev2_malloc 00:27:38.833 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:39.092 [2024-07-25 06:43:52.476279] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:39.092 [2024-07-25 06:43:52.476321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:39.092 [2024-07-25 06:43:52.476343] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1faa8f0 00:27:39.092 [2024-07-25 06:43:52.476355] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:39.092 [2024-07-25 06:43:52.477694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:39.092 [2024-07-25 06:43:52.477721] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:39.092 BaseBdev2 00:27:39.092 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:39.092 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:39.351 BaseBdev3_malloc 00:27:39.351 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:39.610 [2024-07-25 06:43:52.917769] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:39.610 [2024-07-25 06:43:52.917811] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:39.610 [2024-07-25 06:43:52.917828] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2151330 00:27:39.610 [2024-07-25 06:43:52.917840] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:39.610 [2024-07-25 06:43:52.919174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:39.610 [2024-07-25 06:43:52.919200] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:39.610 BaseBdev3 00:27:39.610 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 
00:27:39.610 06:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:39.610 BaseBdev4_malloc 00:27:39.869 06:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:39.869 [2024-07-25 06:43:53.367153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:39.869 [2024-07-25 06:43:53.367193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:39.869 [2024-07-25 06:43:53.367210] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa2240 00:27:39.869 [2024-07-25 06:43:53.367221] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:39.869 [2024-07-25 06:43:53.368554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:39.869 [2024-07-25 06:43:53.368580] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:39.869 BaseBdev4 00:27:39.869 06:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:40.127 spare_malloc 00:27:40.127 06:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:40.386 spare_delay 00:27:40.386 06:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:40.644 [2024-07-25 06:43:54.029028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:40.644 [2024-07-25 06:43:54.029070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:40.644 [2024-07-25 06:43:54.029092] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa2e90 00:27:40.644 [2024-07-25 06:43:54.029103] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:40.644 [2024-07-25 06:43:54.030473] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:40.644 [2024-07-25 06:43:54.030499] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:40.644 spare 00:27:40.644 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:40.903 [2024-07-25 06:43:54.253763] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:40.903 [2024-07-25 06:43:54.254895] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:40.903 [2024-07-25 06:43:54.254946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:40.903 [2024-07-25 06:43:54.254991] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:40.903 [2024-07-25 06:43:54.255067] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fa4190 00:27:40.903 [2024-07-25 06:43:54.255076] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:27:40.903 [2024-07-25 06:43:54.255280] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa1a70 00:27:40.903 [2024-07-25 06:43:54.255415] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fa4190 00:27:40.903 [2024-07-25 06:43:54.255425] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fa4190 00:27:40.903 [2024-07-25 06:43:54.255528] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.903 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.903 "name": "raid_bdev1", 00:27:40.903 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:40.903 "strip_size_kb": 0, 00:27:40.903 "state": "online", 00:27:40.903 "raid_level": "raid1", 00:27:40.903 "superblock": false, 00:27:40.903 "num_base_bdevs": 4, 00:27:40.903 "num_base_bdevs_discovered": 4, 00:27:40.903 "num_base_bdevs_operational": 4, 00:27:40.903 "base_bdevs_list": [ 00:27:40.903 { 00:27:40.903 "name": "BaseBdev1", 00:27:40.903 "uuid": "fd43d1ed-eb30-5c6f-9750-64832b11712c", 00:27:40.903 "is_configured": true, 00:27:40.903 "data_offset": 0, 00:27:40.903 "data_size": 65536 00:27:40.903 }, 00:27:40.903 { 00:27:40.903 "name": "BaseBdev2", 00:27:40.903 "uuid": "ba5263a8-8afa-54ec-badb-dc249b11b86d", 00:27:40.903 "is_configured": true, 00:27:40.903 "data_offset": 0, 00:27:40.903 "data_size": 65536 00:27:40.903 }, 00:27:40.903 { 00:27:40.903 "name": "BaseBdev3", 00:27:40.904 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:40.904 "is_configured": true, 00:27:40.904 "data_offset": 0, 00:27:40.904 "data_size": 65536 00:27:40.904 }, 00:27:40.904 { 00:27:40.904 "name": "BaseBdev4", 00:27:40.904 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:40.904 "is_configured": true, 00:27:40.904 "data_offset": 0, 00:27:40.904 "data_size": 65536 00:27:40.904 } 00:27:40.904 ] 
00:27:40.904 }' 00:27:40.904 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.904 06:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:41.471 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:41.471 06:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:27:41.730 [2024-07-25 06:43:55.196511] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:41.730 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:27:41.730 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.730 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:41.989 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:27:41.989 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:27:41.989 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:41.989 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:41.989 [2024-07-25 06:43:55.531049] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa3c60 00:27:41.989 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:41.989 Zero copy mechanism will not be used. 00:27:41.989 Running I/O for 60 seconds... 
00:27:42.248 [2024-07-25 06:43:55.640067] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:42.248 [2024-07-25 06:43:55.640241] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fa3c60 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.248 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.507 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.507 "name": "raid_bdev1", 00:27:42.507 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:42.507 "strip_size_kb": 0, 00:27:42.507 "state": "online", 00:27:42.507 "raid_level": "raid1", 00:27:42.507 "superblock": false, 00:27:42.507 "num_base_bdevs": 4, 00:27:42.507 "num_base_bdevs_discovered": 3, 00:27:42.507 "num_base_bdevs_operational": 3, 00:27:42.507 "base_bdevs_list": [ 00:27:42.507 { 00:27:42.507 "name": null, 00:27:42.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.507 "is_configured": false, 00:27:42.507 "data_offset": 0, 00:27:42.507 "data_size": 65536 00:27:42.507 }, 00:27:42.507 { 00:27:42.507 "name": "BaseBdev2", 00:27:42.507 "uuid": "ba5263a8-8afa-54ec-badb-dc249b11b86d", 00:27:42.507 "is_configured": true, 00:27:42.507 "data_offset": 0, 00:27:42.507 "data_size": 65536 00:27:42.507 }, 00:27:42.507 { 00:27:42.507 "name": "BaseBdev3", 00:27:42.507 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:42.508 "is_configured": true, 00:27:42.508 "data_offset": 0, 00:27:42.508 "data_size": 65536 00:27:42.508 }, 00:27:42.508 { 00:27:42.508 "name": "BaseBdev4", 00:27:42.508 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:42.508 "is_configured": true, 00:27:42.508 "data_offset": 0, 00:27:42.508 "data_size": 65536 00:27:42.508 } 00:27:42.508 ] 00:27:42.508 }' 00:27:42.508 06:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.508 06:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:43.076 06:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:43.076 [2024-07-25 06:43:56.624917] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:43.335 [2024-07-25 06:43:56.683825] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa5550 00:27:43.335 06:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:43.335 [2024-07-25 06:43:56.686016] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:43.335 [2024-07-25 06:43:56.788207] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:43.335 [2024-07-25 06:43:56.788484] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:43.594 [2024-07-25 06:43:56.912753] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:43.594 [2024-07-25 06:43:56.913313] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:43.852 [2024-07-25 06:43:57.246020] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:43.852 [2024-07-25 06:43:57.246314] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:43.852 [2024-07-25 06:43:57.383006] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:43.853 [2024-07-25 06:43:57.383571] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.420 [2024-07-25 06:43:57.723736] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.420 "name": "raid_bdev1", 00:27:44.420 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:44.420 "strip_size_kb": 0, 00:27:44.420 "state": "online", 00:27:44.420 "raid_level": "raid1", 00:27:44.420 "superblock": false, 00:27:44.420 "num_base_bdevs": 4, 00:27:44.420 "num_base_bdevs_discovered": 4, 00:27:44.420 "num_base_bdevs_operational": 4, 00:27:44.420 "process": { 00:27:44.420 "type": "rebuild", 00:27:44.420 "target": "spare", 00:27:44.420 "progress": { 00:27:44.420 "blocks": 14336, 00:27:44.420 "percent": 21 00:27:44.420 } 00:27:44.420 }, 00:27:44.420 "base_bdevs_list": [ 00:27:44.420 { 00:27:44.420 "name": "spare", 00:27:44.420 "uuid": 
"a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:44.420 "is_configured": true, 00:27:44.420 "data_offset": 0, 00:27:44.420 "data_size": 65536 00:27:44.420 }, 00:27:44.420 { 00:27:44.420 "name": "BaseBdev2", 00:27:44.420 "uuid": "ba5263a8-8afa-54ec-badb-dc249b11b86d", 00:27:44.420 "is_configured": true, 00:27:44.420 "data_offset": 0, 00:27:44.420 "data_size": 65536 00:27:44.420 }, 00:27:44.420 { 00:27:44.420 "name": "BaseBdev3", 00:27:44.420 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:44.420 "is_configured": true, 00:27:44.420 "data_offset": 0, 00:27:44.420 "data_size": 65536 00:27:44.420 }, 00:27:44.420 { 00:27:44.420 "name": "BaseBdev4", 00:27:44.420 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:44.420 "is_configured": true, 00:27:44.420 "data_offset": 0, 00:27:44.420 "data_size": 65536 00:27:44.420 } 00:27:44.420 ] 00:27:44.420 }' 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.420 [2024-07-25 06:43:57.946465] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:44.420 06:43:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.678 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:44.678 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:44.678 [2024-07-25 06:43:58.195958] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:44.678 [2024-07-25 06:43:58.220033] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:44.936 [2024-07-25 06:43:58.443623] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:44.936 [2024-07-25 06:43:58.452873] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:44.936 [2024-07-25 06:43:58.452910] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:44.936 [2024-07-25 06:43:58.452919] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:44.936 [2024-07-25 06:43:58.465318] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fa3c60 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.194 06:43:58 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.194 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.194 "name": "raid_bdev1", 00:27:45.194 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:45.194 "strip_size_kb": 0, 00:27:45.194 "state": "online", 00:27:45.194 "raid_level": "raid1", 00:27:45.195 "superblock": false, 00:27:45.195 "num_base_bdevs": 4, 00:27:45.195 "num_base_bdevs_discovered": 3, 00:27:45.195 "num_base_bdevs_operational": 3, 00:27:45.195 "base_bdevs_list": [ 00:27:45.195 { 00:27:45.195 "name": null, 00:27:45.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.195 "is_configured": false, 00:27:45.195 "data_offset": 0, 00:27:45.195 "data_size": 65536 00:27:45.195 }, 00:27:45.195 { 00:27:45.195 "name": "BaseBdev2", 00:27:45.195 "uuid": "ba5263a8-8afa-54ec-badb-dc249b11b86d", 00:27:45.195 "is_configured": true, 00:27:45.195 "data_offset": 0, 00:27:45.195 "data_size": 65536 00:27:45.195 }, 00:27:45.195 { 00:27:45.195 "name": "BaseBdev3", 00:27:45.195 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:45.195 "is_configured": true, 00:27:45.195 "data_offset": 0, 00:27:45.195 "data_size": 65536 00:27:45.195 }, 00:27:45.195 { 00:27:45.195 "name": "BaseBdev4", 00:27:45.195 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:45.195 "is_configured": true, 00:27:45.195 "data_offset": 0, 00:27:45.195 "data_size": 65536 00:27:45.195 } 00:27:45.195 ] 00:27:45.195 }' 00:27:45.195 06:43:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.195 06:43:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:46.131 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.132 "name": "raid_bdev1", 00:27:46.132 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:46.132 "strip_size_kb": 0, 00:27:46.132 "state": "online", 00:27:46.132 "raid_level": "raid1", 00:27:46.132 "superblock": false, 00:27:46.132 "num_base_bdevs": 4, 00:27:46.132 "num_base_bdevs_discovered": 3, 00:27:46.132 "num_base_bdevs_operational": 3, 00:27:46.132 "base_bdevs_list": [ 00:27:46.132 { 00:27:46.132 "name": null, 00:27:46.132 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:46.132 "is_configured": false, 00:27:46.132 "data_offset": 0, 00:27:46.132 "data_size": 65536 00:27:46.132 }, 00:27:46.132 { 00:27:46.132 "name": "BaseBdev2", 00:27:46.132 "uuid": "ba5263a8-8afa-54ec-badb-dc249b11b86d", 00:27:46.132 "is_configured": true, 00:27:46.132 "data_offset": 0, 00:27:46.132 "data_size": 65536 00:27:46.132 }, 00:27:46.132 { 00:27:46.132 "name": "BaseBdev3", 00:27:46.132 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:46.132 "is_configured": true, 00:27:46.132 "data_offset": 0, 00:27:46.132 "data_size": 65536 00:27:46.132 }, 00:27:46.132 { 00:27:46.132 "name": "BaseBdev4", 00:27:46.132 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:46.132 "is_configured": true, 00:27:46.132 "data_offset": 0, 00:27:46.132 "data_size": 65536 00:27:46.132 } 00:27:46.132 ] 00:27:46.132 }' 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:46.132 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:46.391 [2024-07-25 06:43:59.892292] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:46.652 [2024-07-25 06:43:59.951820] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa0b40 00:27:46.652 [2024-07-25 06:43:59.953215] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:46.652 06:43:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:27:46.652 [2024-07-25 06:44:00.070189] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:46.652 [2024-07-25 06:44:00.070601] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:46.911 [2024-07-25 06:44:00.280919] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:46.911 [2024-07-25 06:44:00.281377] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:47.170 [2024-07-25 06:44:00.650788] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:47.429 [2024-07-25 06:44:00.779228] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:47.429 06:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:47.429 06:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:47.429 06:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:47.429 06:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:47.429 06:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:47.429 06:44:00 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.429 06:44:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.688 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:47.689 "name": "raid_bdev1", 00:27:47.689 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:47.689 "strip_size_kb": 0, 00:27:47.689 "state": "online", 00:27:47.689 "raid_level": "raid1", 00:27:47.689 "superblock": false, 00:27:47.689 "num_base_bdevs": 4, 00:27:47.689 "num_base_bdevs_discovered": 4, 00:27:47.689 "num_base_bdevs_operational": 4, 00:27:47.689 "process": { 00:27:47.689 "type": "rebuild", 00:27:47.689 "target": "spare", 00:27:47.689 "progress": { 00:27:47.689 "blocks": 14336, 00:27:47.689 "percent": 21 00:27:47.689 } 00:27:47.689 }, 00:27:47.689 "base_bdevs_list": [ 00:27:47.689 { 00:27:47.689 "name": "spare", 00:27:47.689 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:47.689 "is_configured": true, 00:27:47.689 "data_offset": 0, 00:27:47.689 "data_size": 65536 00:27:47.689 }, 00:27:47.689 { 00:27:47.689 "name": "BaseBdev2", 00:27:47.689 "uuid": "ba5263a8-8afa-54ec-badb-dc249b11b86d", 00:27:47.689 "is_configured": true, 00:27:47.689 "data_offset": 0, 00:27:47.689 "data_size": 65536 00:27:47.689 }, 00:27:47.689 { 00:27:47.689 "name": "BaseBdev3", 00:27:47.689 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:47.689 "is_configured": true, 00:27:47.689 "data_offset": 0, 00:27:47.689 "data_size": 65536 00:27:47.689 }, 00:27:47.689 { 00:27:47.689 "name": "BaseBdev4", 00:27:47.689 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:47.689 "is_configured": true, 00:27:47.689 "data_offset": 0, 00:27:47.689 "data_size": 65536 00:27:47.689 } 00:27:47.689 ] 00:27:47.689 }' 00:27:47.689 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:47.689 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:47.689 [2024-07-25 06:44:01.242522] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:47.689 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:47.948 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:47.948 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:27:47.948 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:27:47.948 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:27:47.948 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:27:47.948 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:47.948 [2024-07-25 06:44:01.496278] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:48.207 [2024-07-25 06:44:01.677721] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1fa3c60 00:27:48.207 [2024-07-25 06:44:01.677754] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1fa0b40 
00:27:48.207 [2024-07-25 06:44:01.678077] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.207 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.465 [2024-07-25 06:44:01.829593] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:27:48.465 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:48.465 "name": "raid_bdev1", 00:27:48.465 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:48.465 "strip_size_kb": 0, 00:27:48.465 "state": "online", 00:27:48.465 "raid_level": "raid1", 00:27:48.465 "superblock": false, 00:27:48.465 "num_base_bdevs": 4, 00:27:48.465 "num_base_bdevs_discovered": 3, 00:27:48.465 "num_base_bdevs_operational": 3, 00:27:48.465 "process": { 00:27:48.465 "type": "rebuild", 00:27:48.465 "target": "spare", 00:27:48.465 "progress": { 00:27:48.465 "blocks": 22528, 00:27:48.465 "percent": 34 00:27:48.465 } 00:27:48.466 }, 00:27:48.466 "base_bdevs_list": [ 00:27:48.466 { 00:27:48.466 "name": "spare", 00:27:48.466 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:48.466 "is_configured": true, 00:27:48.466 "data_offset": 0, 00:27:48.466 "data_size": 65536 00:27:48.466 }, 00:27:48.466 { 00:27:48.466 "name": null, 00:27:48.466 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.466 "is_configured": false, 00:27:48.466 "data_offset": 0, 00:27:48.466 "data_size": 65536 00:27:48.466 }, 00:27:48.466 { 00:27:48.466 "name": "BaseBdev3", 00:27:48.466 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:48.466 "is_configured": true, 00:27:48.466 "data_offset": 0, 00:27:48.466 "data_size": 65536 00:27:48.466 }, 00:27:48.466 { 00:27:48.466 "name": "BaseBdev4", 00:27:48.466 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:48.466 "is_configured": true, 00:27:48.466 "data_offset": 0, 00:27:48.466 "data_size": 65536 00:27:48.466 } 00:27:48.466 ] 00:27:48.466 }' 00:27:48.466 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.466 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:48.466 06:44:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:48.725 06:44:02 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=891 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:48.725 "name": "raid_bdev1", 00:27:48.725 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:48.725 "strip_size_kb": 0, 00:27:48.725 "state": "online", 00:27:48.725 "raid_level": "raid1", 00:27:48.725 "superblock": false, 00:27:48.725 "num_base_bdevs": 4, 00:27:48.725 "num_base_bdevs_discovered": 3, 00:27:48.725 "num_base_bdevs_operational": 3, 00:27:48.725 "process": { 00:27:48.725 "type": "rebuild", 00:27:48.725 "target": "spare", 00:27:48.725 "progress": { 00:27:48.725 "blocks": 26624, 00:27:48.725 "percent": 40 00:27:48.725 } 00:27:48.725 }, 00:27:48.725 "base_bdevs_list": [ 00:27:48.725 { 00:27:48.725 "name": "spare", 00:27:48.725 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:48.725 "is_configured": true, 00:27:48.725 "data_offset": 0, 00:27:48.725 "data_size": 65536 00:27:48.725 }, 00:27:48.725 { 00:27:48.725 "name": null, 00:27:48.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.725 "is_configured": false, 00:27:48.725 "data_offset": 0, 00:27:48.725 "data_size": 65536 00:27:48.725 }, 00:27:48.725 { 00:27:48.725 "name": "BaseBdev3", 00:27:48.725 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:48.725 "is_configured": true, 00:27:48.725 "data_offset": 0, 00:27:48.725 "data_size": 65536 00:27:48.725 }, 00:27:48.725 { 00:27:48.725 "name": "BaseBdev4", 00:27:48.725 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:48.725 "is_configured": true, 00:27:48.725 "data_offset": 0, 00:27:48.725 "data_size": 65536 00:27:48.725 } 00:27:48.725 ] 00:27:48.725 }' 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:48.725 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.984 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:48.984 06:44:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:49.552 [2024-07-25 06:44:02.848122] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:27:49.552 [2024-07-25 06:44:02.965233] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 
offset_end: 43008 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.811 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.069 [2024-07-25 06:44:03.426483] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:27:50.069 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:50.069 "name": "raid_bdev1", 00:27:50.069 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:50.069 "strip_size_kb": 0, 00:27:50.069 "state": "online", 00:27:50.069 "raid_level": "raid1", 00:27:50.069 "superblock": false, 00:27:50.069 "num_base_bdevs": 4, 00:27:50.069 "num_base_bdevs_discovered": 3, 00:27:50.069 "num_base_bdevs_operational": 3, 00:27:50.069 "process": { 00:27:50.069 "type": "rebuild", 00:27:50.069 "target": "spare", 00:27:50.069 "progress": { 00:27:50.069 "blocks": 47104, 00:27:50.069 "percent": 71 00:27:50.069 } 00:27:50.069 }, 00:27:50.069 "base_bdevs_list": [ 00:27:50.069 { 00:27:50.069 "name": "spare", 00:27:50.069 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:50.069 "is_configured": true, 00:27:50.069 "data_offset": 0, 00:27:50.069 "data_size": 65536 00:27:50.069 }, 00:27:50.069 { 00:27:50.069 "name": null, 00:27:50.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.069 "is_configured": false, 00:27:50.069 "data_offset": 0, 00:27:50.069 "data_size": 65536 00:27:50.069 }, 00:27:50.069 { 00:27:50.069 "name": "BaseBdev3", 00:27:50.069 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:50.069 "is_configured": true, 00:27:50.069 "data_offset": 0, 00:27:50.069 "data_size": 65536 00:27:50.069 }, 00:27:50.069 { 00:27:50.069 "name": "BaseBdev4", 00:27:50.069 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:50.069 "is_configured": true, 00:27:50.069 "data_offset": 0, 00:27:50.069 "data_size": 65536 00:27:50.069 } 00:27:50.069 ] 00:27:50.069 }' 00:27:50.069 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:50.069 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:50.069 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:50.328 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:50.328 06:44:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:51.306 [2024-07-25 06:44:04.550024] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:51.306 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # 
(( SECONDS < timeout )) 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.307 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:51.307 [2024-07-25 06:44:04.657652] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:51.307 [2024-07-25 06:44:04.659694] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:51.565 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:51.565 "name": "raid_bdev1", 00:27:51.565 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:51.565 "strip_size_kb": 0, 00:27:51.565 "state": "online", 00:27:51.565 "raid_level": "raid1", 00:27:51.565 "superblock": false, 00:27:51.565 "num_base_bdevs": 4, 00:27:51.565 "num_base_bdevs_discovered": 3, 00:27:51.565 "num_base_bdevs_operational": 3, 00:27:51.565 "base_bdevs_list": [ 00:27:51.565 { 00:27:51.565 "name": "spare", 00:27:51.565 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:51.565 "is_configured": true, 00:27:51.565 "data_offset": 0, 00:27:51.566 "data_size": 65536 00:27:51.566 }, 00:27:51.566 { 00:27:51.566 "name": null, 00:27:51.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:51.566 "is_configured": false, 00:27:51.566 "data_offset": 0, 00:27:51.566 "data_size": 65536 00:27:51.566 }, 00:27:51.566 { 00:27:51.566 "name": "BaseBdev3", 00:27:51.566 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:51.566 "is_configured": true, 00:27:51.566 "data_offset": 0, 00:27:51.566 "data_size": 65536 00:27:51.566 }, 00:27:51.566 { 00:27:51.566 "name": "BaseBdev4", 00:27:51.566 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:51.566 "is_configured": true, 00:27:51.566 "data_offset": 0, 00:27:51.566 "data_size": 65536 00:27:51.566 } 00:27:51.566 ] 00:27:51.566 }' 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.566 06:44:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:51.825 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:51.825 "name": "raid_bdev1", 00:27:51.825 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:51.825 "strip_size_kb": 0, 00:27:51.825 "state": "online", 00:27:51.825 "raid_level": "raid1", 00:27:51.825 "superblock": false, 00:27:51.826 "num_base_bdevs": 4, 00:27:51.826 "num_base_bdevs_discovered": 3, 00:27:51.826 "num_base_bdevs_operational": 3, 00:27:51.826 "base_bdevs_list": [ 00:27:51.826 { 00:27:51.826 "name": "spare", 00:27:51.826 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:51.826 "is_configured": true, 00:27:51.826 "data_offset": 0, 00:27:51.826 "data_size": 65536 00:27:51.826 }, 00:27:51.826 { 00:27:51.826 "name": null, 00:27:51.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:51.826 "is_configured": false, 00:27:51.826 "data_offset": 0, 00:27:51.826 "data_size": 65536 00:27:51.826 }, 00:27:51.826 { 00:27:51.826 "name": "BaseBdev3", 00:27:51.826 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:51.826 "is_configured": true, 00:27:51.826 "data_offset": 0, 00:27:51.826 "data_size": 65536 00:27:51.826 }, 00:27:51.826 { 00:27:51.826 "name": "BaseBdev4", 00:27:51.826 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:51.826 "is_configured": true, 00:27:51.826 "data_offset": 0, 00:27:51.826 "data_size": 65536 00:27:51.826 } 00:27:51.826 ] 00:27:51.826 }' 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.826 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.084 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.085 "name": "raid_bdev1", 00:27:52.085 "uuid": "ffddbfe8-401e-4b7e-a436-709a96563cc4", 00:27:52.085 "strip_size_kb": 0, 00:27:52.085 "state": "online", 00:27:52.085 "raid_level": "raid1", 00:27:52.085 "superblock": false, 00:27:52.085 "num_base_bdevs": 4, 00:27:52.085 "num_base_bdevs_discovered": 3, 00:27:52.085 "num_base_bdevs_operational": 3, 00:27:52.085 "base_bdevs_list": [ 00:27:52.085 { 00:27:52.085 "name": "spare", 00:27:52.085 "uuid": "a99cc1ba-d157-5904-809d-1ed7afae73e4", 00:27:52.085 "is_configured": true, 00:27:52.085 "data_offset": 0, 00:27:52.085 "data_size": 65536 00:27:52.085 }, 00:27:52.085 { 00:27:52.085 "name": null, 00:27:52.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:52.085 "is_configured": false, 00:27:52.085 "data_offset": 0, 00:27:52.085 "data_size": 65536 00:27:52.085 }, 00:27:52.085 { 00:27:52.085 "name": "BaseBdev3", 00:27:52.085 "uuid": "5196ff9a-83a1-5d92-bdf2-ee1c8514adc2", 00:27:52.085 "is_configured": true, 00:27:52.085 "data_offset": 0, 00:27:52.085 "data_size": 65536 00:27:52.085 }, 00:27:52.085 { 00:27:52.085 "name": "BaseBdev4", 00:27:52.085 "uuid": "d810505d-eb6f-5320-9346-67649c3e5f16", 00:27:52.085 "is_configured": true, 00:27:52.085 "data_offset": 0, 00:27:52.085 "data_size": 65536 00:27:52.085 } 00:27:52.085 ] 00:27:52.085 }' 00:27:52.085 06:44:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.085 06:44:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:52.651 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:52.910 [2024-07-25 06:44:06.289386] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:52.910 [2024-07-25 06:44:06.289415] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:52.910 00:27:52.910 Latency(us) 00:27:52.910 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:52.910 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:27:52.910 raid_bdev1 : 10.75 94.17 282.52 0.00 0.00 14555.04 270.34 119957.09 00:27:52.910 =================================================================================================================== 00:27:52.910 Total : 94.17 282.52 0.00 0.00 14555.04 270.34 119957.09 00:27:52.910 [2024-07-25 06:44:06.309052] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:52.910 [2024-07-25 06:44:06.309076] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:52.910 [2024-07-25 06:44:06.309168] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:52.910 [2024-07-25 06:44:06.309179] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa4190 name raid_bdev1, state offline 00:27:52.910 0 00:27:52.910 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.910 06:44:06 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@735 -- # jq length 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:53.168 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:27:53.426 /dev/nbd0 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:53.426 1+0 records in 00:27:53.426 1+0 records out 00:27:53.426 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259539 s, 15.8 MB/s 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 
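[Editor's note] The surrounding trace exports the rebuilt spare over /dev/nbd0 (above) and then walks the remaining base bdevs, exporting each over /dev/nbd1 and byte-comparing it against the spare (below). A condensed sketch of that verification, assuming the nbd_start_disks/nbd_stop_disks helpers from bdev/nbd_common.sh and the cmp invocation shown in the log; the loop variable is illustrative, and BaseBdev2 is skipped because its slot was emptied earlier:

rpc_sock=/var/tmp/spdk-raid.sock
nbd_start_disks "$rpc_sock" spare /dev/nbd0            # the rebuilt target
for bdev in BaseBdev3 BaseBdev4; do
    nbd_start_disks "$rpc_sock" "$bdev" /dev/nbd1      # one surviving raid1 leg at a time
    cmp -i 0 /dev/nbd0 /dev/nbd1                       # raid1 legs must match byte-for-byte
    nbd_stop_disks "$rpc_sock" /dev/nbd1
done
nbd_stop_disks "$rpc_sock" /dev/nbd0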
00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@743 -- # continue 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:53.426 06:44:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:53.685 /dev/nbd1 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:53.685 1+0 records in 00:27:53.685 1+0 records out 00:27:53.685 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263781 s, 15.5 MB/s 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 
00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:53.685 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:53.945 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:53.945 06:44:07 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:54.204 /dev/nbd1 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:54.204 1+0 records in 00:27:54.204 1+0 records out 00:27:54.204 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240421 s, 17.0 MB/s 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:54.204 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:54.463 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:54.463 06:44:07 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:54.463 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:54.463 06:44:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:54.463 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:54.463 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:54.463 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:54.463 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:54.464 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1250587 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1250587 ']' 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1250587 00:27:54.723 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1250587 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1250587' 00:27:54.982 killing process with pid 1250587 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # 
kill 1250587 00:27:54.982 Received shutdown signal, test time was about 12.765645 seconds 00:27:54.982 00:27:54.982 Latency(us) 00:27:54.982 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:54.982 =================================================================================================================== 00:27:54.982 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:54.982 [2024-07-25 06:44:08.329932] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:54.982 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1250587 00:27:54.982 [2024-07-25 06:44:08.365432] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:27:55.242 00:27:55.242 real 0m17.901s 00:27:55.242 user 0m27.517s 00:27:55.242 sys 0m3.323s 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:55.242 ************************************ 00:27:55.242 END TEST raid_rebuild_test_io 00:27:55.242 ************************************ 00:27:55.242 06:44:08 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:27:55.242 06:44:08 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:27:55.242 06:44:08 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:55.242 06:44:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:55.242 ************************************ 00:27:55.242 START TEST raid_rebuild_test_sb_io 00:27:55.242 ************************************ 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true true true 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1253892 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1253892 /var/tmp/spdk-raid.sock 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1253892 ']' 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:55.242 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:55.242 06:44:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:55.242 [2024-07-25 06:44:08.705926] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:27:55.242 [2024-07-25 06:44:08.705983] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1253892 ] 00:27:55.242 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:55.242 Zero copy mechanism will not be used. 
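[Editor's note] The raid_rebuild_test_sb_io run that starts here drives background I/O with bdevperf while the rebuild is exercised. A sketch of the launch step, using the flags visible in the trace above; the backgrounding/waitforlisten pattern follows the script's usual convention and the variable names are illustrative:

bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
# randrw at a 50% read mix, 3 MiB I/Os at queue depth 2, 60 s run, raid debug logs enabled;
# -z appears to hold the workload until it is kicked off later via bdevperf.py perform_tests.
$bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!
waitforlisten $raid_pid /var/tmp/spdk-raid.sock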
00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.0 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.1 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.2 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.3 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.4 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.5 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.6 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:01.7 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.0 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.1 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.2 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.3 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.4 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.5 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.6 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3d:02.7 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3f:01.0 cannot be used 00:27:55.242 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.242 EAL: Requested device 0000:3f:01.1 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:01.2 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:01.3 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:01.4 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:01.5 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:01.6 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:01.7 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.0 cannot be used 00:27:55.243 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.1 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.2 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.3 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.4 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.5 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.6 cannot be used 00:27:55.243 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:27:55.243 EAL: Requested device 0000:3f:02.7 cannot be used 00:27:55.502 [2024-07-25 06:44:08.843869] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:55.502 [2024-07-25 06:44:08.887205] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:55.502 [2024-07-25 06:44:08.943881] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:55.502 [2024-07-25 06:44:08.943924] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:56.070 06:44:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:56.070 06:44:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:27:56.070 06:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:56.070 06:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:56.329 BaseBdev1_malloc 00:27:56.329 06:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:56.588 [2024-07-25 06:44:10.051062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:56.589 [2024-07-25 06:44:10.051111] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:56.589 [2024-07-25 06:44:10.051134] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24917b0 00:27:56.589 [2024-07-25 06:44:10.051152] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:56.589 [2024-07-25 06:44:10.052625] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:56.589 [2024-07-25 06:44:10.052654] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:56.589 BaseBdev1 00:27:56.589 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:56.589 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:56.848 BaseBdev2_malloc 00:27:56.848 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev2_malloc -p BaseBdev2 00:27:57.107 [2024-07-25 06:44:10.516576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:57.107 [2024-07-25 06:44:10.516619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:57.107 [2024-07-25 06:44:10.516641] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22df8f0 00:27:57.107 [2024-07-25 06:44:10.516652] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:57.107 [2024-07-25 06:44:10.517895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:57.107 [2024-07-25 06:44:10.517921] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:57.107 BaseBdev2 00:27:57.107 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:57.107 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:57.366 BaseBdev3_malloc 00:27:57.366 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:57.625 [2024-07-25 06:44:10.965764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:57.625 [2024-07-25 06:44:10.965803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:57.625 [2024-07-25 06:44:10.965821] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2486330 00:27:57.625 [2024-07-25 06:44:10.965832] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:57.625 [2024-07-25 06:44:10.967053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:57.625 [2024-07-25 06:44:10.967079] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:57.625 BaseBdev3 00:27:57.625 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:57.625 06:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:57.884 BaseBdev4_malloc 00:27:57.884 06:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:57.884 [2024-07-25 06:44:11.419280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:57.884 [2024-07-25 06:44:11.419316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:57.884 [2024-07-25 06:44:11.419334] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d7240 00:27:57.884 [2024-07-25 06:44:11.419345] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:57.884 [2024-07-25 06:44:11.420569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:57.884 [2024-07-25 06:44:11.420593] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:57.884 BaseBdev4 00:27:57.884 06:44:11 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:58.143 spare_malloc 00:27:58.143 06:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:58.402 spare_delay 00:27:58.402 06:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:58.661 [2024-07-25 06:44:12.101214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:58.661 [2024-07-25 06:44:12.101253] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:58.661 [2024-07-25 06:44:12.101273] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d7e90 00:27:58.662 [2024-07-25 06:44:12.101285] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:58.662 [2024-07-25 06:44:12.102517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:58.662 [2024-07-25 06:44:12.102542] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:58.662 spare 00:27:58.662 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:58.920 [2024-07-25 06:44:12.329846] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:58.920 [2024-07-25 06:44:12.330876] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:58.920 [2024-07-25 06:44:12.330924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:58.920 [2024-07-25 06:44:12.330963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:58.920 [2024-07-25 06:44:12.331130] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22d9190 00:27:58.920 [2024-07-25 06:44:12.331148] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:58.920 [2024-07-25 06:44:12.331304] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d6a70 00:27:58.920 [2024-07-25 06:44:12.331431] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22d9190 00:27:58.920 [2024-07-25 06:44:12.331441] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22d9190 00:27:58.920 [2024-07-25 06:44:12.331520] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.920 
06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.920 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.921 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.921 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.180 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:59.180 "name": "raid_bdev1", 00:27:59.180 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:27:59.180 "strip_size_kb": 0, 00:27:59.180 "state": "online", 00:27:59.180 "raid_level": "raid1", 00:27:59.180 "superblock": true, 00:27:59.180 "num_base_bdevs": 4, 00:27:59.180 "num_base_bdevs_discovered": 4, 00:27:59.180 "num_base_bdevs_operational": 4, 00:27:59.180 "base_bdevs_list": [ 00:27:59.180 { 00:27:59.180 "name": "BaseBdev1", 00:27:59.180 "uuid": "28fd27d5-92fb-5bad-be1c-c0b5ee5f7108", 00:27:59.180 "is_configured": true, 00:27:59.180 "data_offset": 2048, 00:27:59.180 "data_size": 63488 00:27:59.180 }, 00:27:59.180 { 00:27:59.180 "name": "BaseBdev2", 00:27:59.180 "uuid": "59d3216c-7b91-546a-be89-f5d554fd83f5", 00:27:59.180 "is_configured": true, 00:27:59.180 "data_offset": 2048, 00:27:59.180 "data_size": 63488 00:27:59.180 }, 00:27:59.180 { 00:27:59.180 "name": "BaseBdev3", 00:27:59.180 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:27:59.180 "is_configured": true, 00:27:59.180 "data_offset": 2048, 00:27:59.180 "data_size": 63488 00:27:59.180 }, 00:27:59.180 { 00:27:59.180 "name": "BaseBdev4", 00:27:59.180 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:27:59.180 "is_configured": true, 00:27:59.180 "data_offset": 2048, 00:27:59.180 "data_size": 63488 00:27:59.180 } 00:27:59.180 ] 00:27:59.180 }' 00:27:59.180 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:59.180 06:44:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:59.748 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:59.748 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:28:00.010 [2024-07-25 06:44:13.372852] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:00.010 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:28:00.010 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.010 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:00.271 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:28:00.271 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:28:00.271 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:00.271 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:00.530 [2024-07-25 06:44:13.831756] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2483b80 00:28:00.530 [2024-07-25 06:44:13.833888] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:00.530 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:00.530 Zero copy mechanism will not be used. 00:28:00.530 Running I/O for 60 seconds... 00:28:00.530 [2024-07-25 06:44:13.834079] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2483b80 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.530 06:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.789 06:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.789 "name": "raid_bdev1", 00:28:00.789 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:00.789 "strip_size_kb": 0, 00:28:00.789 "state": "online", 00:28:00.789 "raid_level": "raid1", 00:28:00.789 "superblock": true, 00:28:00.789 "num_base_bdevs": 4, 00:28:00.789 "num_base_bdevs_discovered": 3, 00:28:00.789 "num_base_bdevs_operational": 3, 00:28:00.789 "base_bdevs_list": [ 00:28:00.789 { 00:28:00.789 "name": null, 00:28:00.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.789 "is_configured": false, 00:28:00.789 "data_offset": 2048, 00:28:00.789 "data_size": 63488 00:28:00.789 }, 00:28:00.789 { 00:28:00.789 "name": "BaseBdev2", 00:28:00.789 "uuid": "59d3216c-7b91-546a-be89-f5d554fd83f5", 00:28:00.789 "is_configured": true, 00:28:00.789 "data_offset": 2048, 00:28:00.789 "data_size": 63488 00:28:00.789 }, 00:28:00.789 { 00:28:00.789 "name": "BaseBdev3", 00:28:00.789 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:00.789 "is_configured": true, 
00:28:00.789 "data_offset": 2048, 00:28:00.789 "data_size": 63488 00:28:00.789 }, 00:28:00.789 { 00:28:00.789 "name": "BaseBdev4", 00:28:00.789 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:00.789 "is_configured": true, 00:28:00.789 "data_offset": 2048, 00:28:00.789 "data_size": 63488 00:28:00.789 } 00:28:00.789 ] 00:28:00.789 }' 00:28:00.789 06:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.789 06:44:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:01.357 06:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:01.616 [2024-07-25 06:44:14.926636] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:01.616 [2024-07-25 06:44:14.976953] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22dfd50 00:28:01.616 [2024-07-25 06:44:14.979165] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:01.616 06:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:01.616 [2024-07-25 06:44:15.097150] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:01.616 [2024-07-25 06:44:15.097631] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:01.874 [2024-07-25 06:44:15.319031] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:01.874 [2024-07-25 06:44:15.319312] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:02.133 [2024-07-25 06:44:15.610146] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:28:02.133 [2024-07-25 06:44:15.610372] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:28:02.392 [2024-07-25 06:44:15.831018] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:28:02.651 06:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:02.651 06:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.651 06:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:02.651 06:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:02.651 06:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.651 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.651 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.651 [2024-07-25 06:44:16.162834] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:28:02.651 [2024-07-25 06:44:16.163993] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 
offset_begin: 12288 offset_end: 18432 00:28:02.911 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:02.911 "name": "raid_bdev1", 00:28:02.911 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:02.911 "strip_size_kb": 0, 00:28:02.911 "state": "online", 00:28:02.911 "raid_level": "raid1", 00:28:02.911 "superblock": true, 00:28:02.911 "num_base_bdevs": 4, 00:28:02.911 "num_base_bdevs_discovered": 4, 00:28:02.911 "num_base_bdevs_operational": 4, 00:28:02.911 "process": { 00:28:02.911 "type": "rebuild", 00:28:02.911 "target": "spare", 00:28:02.911 "progress": { 00:28:02.911 "blocks": 14336, 00:28:02.911 "percent": 22 00:28:02.911 } 00:28:02.911 }, 00:28:02.911 "base_bdevs_list": [ 00:28:02.911 { 00:28:02.911 "name": "spare", 00:28:02.911 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:02.911 "is_configured": true, 00:28:02.911 "data_offset": 2048, 00:28:02.911 "data_size": 63488 00:28:02.911 }, 00:28:02.911 { 00:28:02.911 "name": "BaseBdev2", 00:28:02.911 "uuid": "59d3216c-7b91-546a-be89-f5d554fd83f5", 00:28:02.911 "is_configured": true, 00:28:02.911 "data_offset": 2048, 00:28:02.911 "data_size": 63488 00:28:02.911 }, 00:28:02.911 { 00:28:02.911 "name": "BaseBdev3", 00:28:02.911 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:02.911 "is_configured": true, 00:28:02.911 "data_offset": 2048, 00:28:02.911 "data_size": 63488 00:28:02.911 }, 00:28:02.911 { 00:28:02.911 "name": "BaseBdev4", 00:28:02.911 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:02.911 "is_configured": true, 00:28:02.911 "data_offset": 2048, 00:28:02.911 "data_size": 63488 00:28:02.911 } 00:28:02.911 ] 00:28:02.911 }' 00:28:02.911 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:02.911 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:02.911 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:02.911 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:02.911 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:02.911 [2024-07-25 06:44:16.400711] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:28:02.911 [2024-07-25 06:44:16.401255] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:28:03.170 [2024-07-25 06:44:16.523807] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:03.464 [2024-07-25 06:44:16.728231] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:03.464 [2024-07-25 06:44:16.739648] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:03.464 [2024-07-25 06:44:16.739679] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:03.464 [2024-07-25 06:44:16.739688] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:03.464 [2024-07-25 06:44:16.768158] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2483b80 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 3 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.464 06:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.724 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.724 "name": "raid_bdev1", 00:28:03.724 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:03.724 "strip_size_kb": 0, 00:28:03.724 "state": "online", 00:28:03.724 "raid_level": "raid1", 00:28:03.724 "superblock": true, 00:28:03.724 "num_base_bdevs": 4, 00:28:03.724 "num_base_bdevs_discovered": 3, 00:28:03.724 "num_base_bdevs_operational": 3, 00:28:03.724 "base_bdevs_list": [ 00:28:03.724 { 00:28:03.724 "name": null, 00:28:03.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.724 "is_configured": false, 00:28:03.724 "data_offset": 2048, 00:28:03.724 "data_size": 63488 00:28:03.724 }, 00:28:03.724 { 00:28:03.724 "name": "BaseBdev2", 00:28:03.724 "uuid": "59d3216c-7b91-546a-be89-f5d554fd83f5", 00:28:03.724 "is_configured": true, 00:28:03.724 "data_offset": 2048, 00:28:03.724 "data_size": 63488 00:28:03.724 }, 00:28:03.724 { 00:28:03.724 "name": "BaseBdev3", 00:28:03.724 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:03.724 "is_configured": true, 00:28:03.724 "data_offset": 2048, 00:28:03.724 "data_size": 63488 00:28:03.724 }, 00:28:03.724 { 00:28:03.724 "name": "BaseBdev4", 00:28:03.724 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:03.724 "is_configured": true, 00:28:03.724 "data_offset": 2048, 00:28:03.724 "data_size": 63488 00:28:03.724 } 00:28:03.724 ] 00:28:03.724 }' 00:28:03.724 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.724 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.291 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.550 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:04.550 "name": "raid_bdev1", 00:28:04.550 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:04.550 "strip_size_kb": 0, 00:28:04.550 "state": "online", 00:28:04.550 "raid_level": "raid1", 00:28:04.550 "superblock": true, 00:28:04.550 "num_base_bdevs": 4, 00:28:04.550 "num_base_bdevs_discovered": 3, 00:28:04.550 "num_base_bdevs_operational": 3, 00:28:04.550 "base_bdevs_list": [ 00:28:04.550 { 00:28:04.550 "name": null, 00:28:04.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:04.550 "is_configured": false, 00:28:04.550 "data_offset": 2048, 00:28:04.550 "data_size": 63488 00:28:04.550 }, 00:28:04.550 { 00:28:04.550 "name": "BaseBdev2", 00:28:04.550 "uuid": "59d3216c-7b91-546a-be89-f5d554fd83f5", 00:28:04.550 "is_configured": true, 00:28:04.550 "data_offset": 2048, 00:28:04.550 "data_size": 63488 00:28:04.550 }, 00:28:04.550 { 00:28:04.550 "name": "BaseBdev3", 00:28:04.550 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:04.550 "is_configured": true, 00:28:04.550 "data_offset": 2048, 00:28:04.550 "data_size": 63488 00:28:04.550 }, 00:28:04.550 { 00:28:04.550 "name": "BaseBdev4", 00:28:04.550 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:04.550 "is_configured": true, 00:28:04.550 "data_offset": 2048, 00:28:04.550 "data_size": 63488 00:28:04.550 } 00:28:04.550 ] 00:28:04.550 }' 00:28:04.550 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:04.550 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:04.550 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:04.550 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:04.550 06:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:04.809 [2024-07-25 06:44:18.213489] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:04.809 [2024-07-25 06:44:18.256271] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22dd6c0 00:28:04.809 [2024-07-25 06:44:18.257660] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:04.809 06:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:28:05.068 [2024-07-25 06:44:18.403553] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:05.069 [2024-07-25 06:44:18.575104] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:05.328 [2024-07-25 06:44:18.839585] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:28:05.587 [2024-07-25 06:44:18.959115] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 
00:28:05.846 [2024-07-25 06:44:19.195806] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.846 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:06.105 "name": "raid_bdev1", 00:28:06.105 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:06.105 "strip_size_kb": 0, 00:28:06.105 "state": "online", 00:28:06.105 "raid_level": "raid1", 00:28:06.105 "superblock": true, 00:28:06.105 "num_base_bdevs": 4, 00:28:06.105 "num_base_bdevs_discovered": 4, 00:28:06.105 "num_base_bdevs_operational": 4, 00:28:06.105 "process": { 00:28:06.105 "type": "rebuild", 00:28:06.105 "target": "spare", 00:28:06.105 "progress": { 00:28:06.105 "blocks": 18432, 00:28:06.105 "percent": 29 00:28:06.105 } 00:28:06.105 }, 00:28:06.105 "base_bdevs_list": [ 00:28:06.105 { 00:28:06.105 "name": "spare", 00:28:06.105 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:06.105 "is_configured": true, 00:28:06.105 "data_offset": 2048, 00:28:06.105 "data_size": 63488 00:28:06.105 }, 00:28:06.105 { 00:28:06.105 "name": "BaseBdev2", 00:28:06.105 "uuid": "59d3216c-7b91-546a-be89-f5d554fd83f5", 00:28:06.105 "is_configured": true, 00:28:06.105 "data_offset": 2048, 00:28:06.105 "data_size": 63488 00:28:06.105 }, 00:28:06.105 { 00:28:06.105 "name": "BaseBdev3", 00:28:06.105 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:06.105 "is_configured": true, 00:28:06.105 "data_offset": 2048, 00:28:06.105 "data_size": 63488 00:28:06.105 }, 00:28:06.105 { 00:28:06.105 "name": "BaseBdev4", 00:28:06.105 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:06.105 "is_configured": true, 00:28:06.105 "data_offset": 2048, 00:28:06.105 "data_size": 63488 00:28:06.105 } 00:28:06.105 ] 00:28:06.105 }' 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:06.105 [2024-07-25 06:44:19.555910] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 
00:28:06.105 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:28:06.105 06:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:28:06.365 [2024-07-25 06:44:19.665889] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:28:06.365 [2024-07-25 06:44:19.666463] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:28:06.365 [2024-07-25 06:44:19.807612] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:06.624 [2024-07-25 06:44:20.042723] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2483b80 00:28:06.624 [2024-07-25 06:44:20.042759] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22dd6c0 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.624 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:06.883 "name": "raid_bdev1", 00:28:06.883 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:06.883 "strip_size_kb": 0, 00:28:06.883 "state": "online", 00:28:06.883 "raid_level": "raid1", 00:28:06.883 "superblock": true, 00:28:06.883 "num_base_bdevs": 4, 00:28:06.883 "num_base_bdevs_discovered": 3, 00:28:06.883 "num_base_bdevs_operational": 3, 00:28:06.883 "process": { 00:28:06.883 "type": "rebuild", 00:28:06.883 "target": "spare", 00:28:06.883 "progress": { 00:28:06.883 "blocks": 28672, 00:28:06.883 "percent": 45 00:28:06.883 } 00:28:06.883 }, 00:28:06.883 "base_bdevs_list": [ 00:28:06.883 { 00:28:06.883 "name": "spare", 00:28:06.883 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:06.883 "is_configured": true, 00:28:06.883 "data_offset": 2048, 00:28:06.883 "data_size": 63488 00:28:06.883 }, 00:28:06.883 { 00:28:06.883 "name": null, 00:28:06.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:06.883 "is_configured": 
false, 00:28:06.883 "data_offset": 2048, 00:28:06.883 "data_size": 63488 00:28:06.883 }, 00:28:06.883 { 00:28:06.883 "name": "BaseBdev3", 00:28:06.883 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:06.883 "is_configured": true, 00:28:06.883 "data_offset": 2048, 00:28:06.883 "data_size": 63488 00:28:06.883 }, 00:28:06.883 { 00:28:06.883 "name": "BaseBdev4", 00:28:06.883 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:06.883 "is_configured": true, 00:28:06.883 "data_offset": 2048, 00:28:06.883 "data_size": 63488 00:28:06.883 } 00:28:06.883 ] 00:28:06.883 }' 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=909 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:06.883 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.884 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.143 [2024-07-25 06:44:20.499914] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:28:07.143 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:07.143 "name": "raid_bdev1", 00:28:07.143 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:07.143 "strip_size_kb": 0, 00:28:07.143 "state": "online", 00:28:07.143 "raid_level": "raid1", 00:28:07.143 "superblock": true, 00:28:07.143 "num_base_bdevs": 4, 00:28:07.143 "num_base_bdevs_discovered": 3, 00:28:07.143 "num_base_bdevs_operational": 3, 00:28:07.143 "process": { 00:28:07.143 "type": "rebuild", 00:28:07.143 "target": "spare", 00:28:07.143 "progress": { 00:28:07.143 "blocks": 32768, 00:28:07.143 "percent": 51 00:28:07.143 } 00:28:07.143 }, 00:28:07.143 "base_bdevs_list": [ 00:28:07.143 { 00:28:07.143 "name": "spare", 00:28:07.143 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:07.143 "is_configured": true, 00:28:07.143 "data_offset": 2048, 00:28:07.143 "data_size": 63488 00:28:07.143 }, 00:28:07.143 { 00:28:07.143 "name": null, 00:28:07.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.143 "is_configured": false, 00:28:07.143 "data_offset": 2048, 00:28:07.143 "data_size": 63488 00:28:07.143 }, 00:28:07.143 { 00:28:07.143 "name": "BaseBdev3", 00:28:07.143 "uuid": 
"515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:07.143 "is_configured": true, 00:28:07.143 "data_offset": 2048, 00:28:07.143 "data_size": 63488 00:28:07.143 }, 00:28:07.143 { 00:28:07.143 "name": "BaseBdev4", 00:28:07.143 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:07.143 "is_configured": true, 00:28:07.143 "data_offset": 2048, 00:28:07.143 "data_size": 63488 00:28:07.143 } 00:28:07.143 ] 00:28:07.143 }' 00:28:07.143 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.143 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:07.143 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:07.403 [2024-07-25 06:44:20.720152] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:28:07.403 [2024-07-25 06:44:20.720310] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:28:07.403 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:07.403 06:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:07.663 [2024-07-25 06:44:21.074559] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:28:07.922 [2024-07-25 06:44:21.397681] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:28:08.182 [2024-07-25 06:44:21.643003] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.441 [2024-07-25 06:44:21.760657] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:28:08.441 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:08.441 "name": "raid_bdev1", 00:28:08.441 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:08.441 "strip_size_kb": 0, 00:28:08.441 "state": "online", 00:28:08.441 "raid_level": "raid1", 00:28:08.441 "superblock": true, 00:28:08.441 "num_base_bdevs": 4, 00:28:08.441 "num_base_bdevs_discovered": 3, 00:28:08.441 "num_base_bdevs_operational": 3, 00:28:08.441 "process": { 00:28:08.441 "type": "rebuild", 00:28:08.441 "target": "spare", 00:28:08.441 "progress": 
{ 00:28:08.441 "blocks": 55296, 00:28:08.441 "percent": 87 00:28:08.441 } 00:28:08.441 }, 00:28:08.441 "base_bdevs_list": [ 00:28:08.441 { 00:28:08.441 "name": "spare", 00:28:08.441 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:08.441 "is_configured": true, 00:28:08.441 "data_offset": 2048, 00:28:08.441 "data_size": 63488 00:28:08.441 }, 00:28:08.441 { 00:28:08.441 "name": null, 00:28:08.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:08.441 "is_configured": false, 00:28:08.441 "data_offset": 2048, 00:28:08.441 "data_size": 63488 00:28:08.441 }, 00:28:08.441 { 00:28:08.441 "name": "BaseBdev3", 00:28:08.441 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:08.441 "is_configured": true, 00:28:08.441 "data_offset": 2048, 00:28:08.441 "data_size": 63488 00:28:08.441 }, 00:28:08.441 { 00:28:08.441 "name": "BaseBdev4", 00:28:08.441 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:08.441 "is_configured": true, 00:28:08.441 "data_offset": 2048, 00:28:08.441 "data_size": 63488 00:28:08.441 } 00:28:08.441 ] 00:28:08.441 }' 00:28:08.442 06:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:08.701 06:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:08.701 06:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:08.701 06:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:08.701 06:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:08.701 [2024-07-25 06:44:22.099197] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:28:08.961 [2024-07-25 06:44:22.325334] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:08.961 [2024-07-25 06:44:22.432958] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:08.961 [2024-07-25 06:44:22.434696] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:09.530 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.789 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:09.789 "name": "raid_bdev1", 00:28:09.789 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:09.789 "strip_size_kb": 0, 00:28:09.789 "state": "online", 00:28:09.789 "raid_level": "raid1", 00:28:09.789 "superblock": true, 00:28:09.789 "num_base_bdevs": 4, 
00:28:09.789 "num_base_bdevs_discovered": 3, 00:28:09.789 "num_base_bdevs_operational": 3, 00:28:09.789 "base_bdevs_list": [ 00:28:09.789 { 00:28:09.789 "name": "spare", 00:28:09.789 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:09.789 "is_configured": true, 00:28:09.789 "data_offset": 2048, 00:28:09.789 "data_size": 63488 00:28:09.789 }, 00:28:09.789 { 00:28:09.789 "name": null, 00:28:09.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.789 "is_configured": false, 00:28:09.789 "data_offset": 2048, 00:28:09.789 "data_size": 63488 00:28:09.789 }, 00:28:09.789 { 00:28:09.789 "name": "BaseBdev3", 00:28:09.789 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:09.789 "is_configured": true, 00:28:09.789 "data_offset": 2048, 00:28:09.789 "data_size": 63488 00:28:09.789 }, 00:28:09.789 { 00:28:09.789 "name": "BaseBdev4", 00:28:09.789 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:09.789 "is_configured": true, 00:28:09.789 "data_offset": 2048, 00:28:09.789 "data_size": 63488 00:28:09.789 } 00:28:09.789 ] 00:28:09.789 }' 00:28:09.789 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.048 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:10.306 "name": "raid_bdev1", 00:28:10.306 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:10.306 "strip_size_kb": 0, 00:28:10.306 "state": "online", 00:28:10.306 "raid_level": "raid1", 00:28:10.306 "superblock": true, 00:28:10.306 "num_base_bdevs": 4, 00:28:10.306 "num_base_bdevs_discovered": 3, 00:28:10.306 "num_base_bdevs_operational": 3, 00:28:10.306 "base_bdevs_list": [ 00:28:10.306 { 00:28:10.306 "name": "spare", 00:28:10.306 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:10.306 "is_configured": true, 00:28:10.306 "data_offset": 2048, 00:28:10.306 "data_size": 63488 00:28:10.306 }, 00:28:10.306 { 00:28:10.306 "name": null, 00:28:10.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:10.306 "is_configured": false, 00:28:10.306 "data_offset": 2048, 00:28:10.306 "data_size": 63488 00:28:10.306 }, 00:28:10.306 { 00:28:10.306 "name": "BaseBdev3", 00:28:10.306 "uuid": 
"515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:10.306 "is_configured": true, 00:28:10.306 "data_offset": 2048, 00:28:10.306 "data_size": 63488 00:28:10.306 }, 00:28:10.306 { 00:28:10.306 "name": "BaseBdev4", 00:28:10.306 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:10.306 "is_configured": true, 00:28:10.306 "data_offset": 2048, 00:28:10.306 "data_size": 63488 00:28:10.306 } 00:28:10.306 ] 00:28:10.306 }' 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.306 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.307 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.307 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.565 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:10.565 "name": "raid_bdev1", 00:28:10.565 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:10.565 "strip_size_kb": 0, 00:28:10.565 "state": "online", 00:28:10.565 "raid_level": "raid1", 00:28:10.565 "superblock": true, 00:28:10.565 "num_base_bdevs": 4, 00:28:10.565 "num_base_bdevs_discovered": 3, 00:28:10.565 "num_base_bdevs_operational": 3, 00:28:10.565 "base_bdevs_list": [ 00:28:10.565 { 00:28:10.565 "name": "spare", 00:28:10.565 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:10.565 "is_configured": true, 00:28:10.565 "data_offset": 2048, 00:28:10.565 "data_size": 63488 00:28:10.565 }, 00:28:10.565 { 00:28:10.565 "name": null, 00:28:10.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:10.565 "is_configured": false, 00:28:10.565 "data_offset": 2048, 00:28:10.565 "data_size": 63488 00:28:10.565 }, 00:28:10.565 { 00:28:10.565 "name": "BaseBdev3", 00:28:10.565 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:10.565 "is_configured": true, 00:28:10.565 "data_offset": 2048, 00:28:10.565 "data_size": 63488 00:28:10.565 }, 00:28:10.565 { 00:28:10.565 "name": "BaseBdev4", 
00:28:10.565 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:10.565 "is_configured": true, 00:28:10.565 "data_offset": 2048, 00:28:10.565 "data_size": 63488 00:28:10.565 } 00:28:10.565 ] 00:28:10.565 }' 00:28:10.565 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:10.565 06:44:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:11.133 06:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:11.392 [2024-07-25 06:44:24.736473] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:11.392 [2024-07-25 06:44:24.736504] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:11.392 00:28:11.392 Latency(us) 00:28:11.392 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:11.392 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:28:11.392 raid_bdev1 : 10.97 96.34 289.01 0.00 0.00 13636.01 270.34 119957.09 00:28:11.392 =================================================================================================================== 00:28:11.392 Total : 96.34 289.01 0.00 0.00 13636.01 270.34 119957.09 00:28:11.392 [2024-07-25 06:44:24.836384] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.392 [2024-07-25 06:44:24.836413] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:11.392 [2024-07-25 06:44:24.836498] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:11.392 [2024-07-25 06:44:24.836509] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d9190 name raid_bdev1, state offline 00:28:11.392 0 00:28:11.392 06:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.392 06:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:11.651 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.651 06:44:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:28:11.946 /dev/nbd0 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:11.946 1+0 records in 00:28:11.946 1+0 records out 00:28:11.946 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025761 s, 15.9 MB/s 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:28:11.946 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@743 -- # continue 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:11.947 06:44:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.947 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:28:12.206 /dev/nbd1 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:12.206 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:12.207 1+0 records in 00:28:12.207 1+0 records out 00:28:12.207 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274256 s, 14.9 MB/s 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 
-- # local nbd_list 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:12.207 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:12.466 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:12.467 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:12.467 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.467 06:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:28:12.727 /dev/nbd1 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # 
break 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:12.727 1+0 records in 00:28:12.727 1+0 records out 00:28:12.727 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00019457 s, 21.1 MB/s 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:12.727 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:12.987 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:28:13.246 06:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:13.505 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:13.764 [2024-07-25 06:44:27.212750] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:13.764 [2024-07-25 06:44:27.212797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.764 [2024-07-25 06:44:27.212817] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22dbbe0 00:28:13.764 [2024-07-25 06:44:27.212829] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.764 [2024-07-25 06:44:27.214339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.764 [2024-07-25 06:44:27.214368] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:13.764 [2024-07-25 06:44:27.214445] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:13.764 [2024-07-25 06:44:27.214470] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:13.764 [2024-07-25 06:44:27.214565] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:13.764 [2024-07-25 06:44:27.214629] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:28:13.764 spare 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.764 06:44:27 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.764 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.764 [2024-07-25 06:44:27.314937] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2309b70 00:28:13.764 [2024-07-25 06:44:27.314952] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:13.764 [2024-07-25 06:44:27.315136] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22cb930 00:28:13.764 [2024-07-25 06:44:27.315284] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2309b70 00:28:13.764 [2024-07-25 06:44:27.315293] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2309b70 00:28:13.764 [2024-07-25 06:44:27.315393] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:14.023 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.023 "name": "raid_bdev1", 00:28:14.023 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:14.023 "strip_size_kb": 0, 00:28:14.023 "state": "online", 00:28:14.023 "raid_level": "raid1", 00:28:14.023 "superblock": true, 00:28:14.023 "num_base_bdevs": 4, 00:28:14.023 "num_base_bdevs_discovered": 3, 00:28:14.023 "num_base_bdevs_operational": 3, 00:28:14.023 "base_bdevs_list": [ 00:28:14.023 { 00:28:14.023 "name": "spare", 00:28:14.023 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:14.023 "is_configured": true, 00:28:14.023 "data_offset": 2048, 00:28:14.023 "data_size": 63488 00:28:14.023 }, 00:28:14.023 { 00:28:14.023 "name": null, 00:28:14.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.023 "is_configured": false, 00:28:14.023 "data_offset": 2048, 00:28:14.023 "data_size": 63488 00:28:14.023 }, 00:28:14.023 { 00:28:14.023 "name": "BaseBdev3", 00:28:14.023 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:14.023 "is_configured": true, 00:28:14.023 "data_offset": 2048, 00:28:14.023 "data_size": 63488 00:28:14.023 }, 00:28:14.023 { 00:28:14.023 "name": "BaseBdev4", 00:28:14.023 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:14.023 "is_configured": true, 00:28:14.023 "data_offset": 2048, 00:28:14.023 "data_size": 63488 00:28:14.023 } 00:28:14.023 ] 00:28:14.023 }' 00:28:14.023 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.023 06:44:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.589 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:14.848 "name": "raid_bdev1", 00:28:14.848 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:14.848 "strip_size_kb": 0, 00:28:14.848 "state": "online", 00:28:14.848 "raid_level": "raid1", 00:28:14.848 "superblock": true, 00:28:14.848 "num_base_bdevs": 4, 00:28:14.848 "num_base_bdevs_discovered": 3, 00:28:14.848 "num_base_bdevs_operational": 3, 00:28:14.848 "base_bdevs_list": [ 00:28:14.848 { 00:28:14.848 "name": "spare", 00:28:14.848 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:14.848 "is_configured": true, 00:28:14.848 "data_offset": 2048, 00:28:14.848 "data_size": 63488 00:28:14.848 }, 00:28:14.848 { 00:28:14.848 "name": null, 00:28:14.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.848 "is_configured": false, 00:28:14.848 "data_offset": 2048, 00:28:14.848 "data_size": 63488 00:28:14.848 }, 00:28:14.848 { 00:28:14.848 "name": "BaseBdev3", 00:28:14.848 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:14.848 "is_configured": true, 00:28:14.848 "data_offset": 2048, 00:28:14.848 "data_size": 63488 00:28:14.848 }, 00:28:14.848 { 00:28:14.848 "name": "BaseBdev4", 00:28:14.848 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:14.848 "is_configured": true, 00:28:14.848 "data_offset": 2048, 00:28:14.848 "data_size": 63488 00:28:14.848 } 00:28:14.848 ] 00:28:14.848 }' 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.848 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:15.108 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:28:15.108 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:15.367 [2024-07-25 06:44:28.809258] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 2 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.367 06:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.626 06:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.626 "name": "raid_bdev1", 00:28:15.626 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:15.626 "strip_size_kb": 0, 00:28:15.626 "state": "online", 00:28:15.626 "raid_level": "raid1", 00:28:15.626 "superblock": true, 00:28:15.626 "num_base_bdevs": 4, 00:28:15.626 "num_base_bdevs_discovered": 2, 00:28:15.626 "num_base_bdevs_operational": 2, 00:28:15.626 "base_bdevs_list": [ 00:28:15.626 { 00:28:15.626 "name": null, 00:28:15.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.626 "is_configured": false, 00:28:15.626 "data_offset": 2048, 00:28:15.626 "data_size": 63488 00:28:15.626 }, 00:28:15.626 { 00:28:15.626 "name": null, 00:28:15.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.626 "is_configured": false, 00:28:15.626 "data_offset": 2048, 00:28:15.626 "data_size": 63488 00:28:15.626 }, 00:28:15.626 { 00:28:15.626 "name": "BaseBdev3", 00:28:15.626 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:15.626 "is_configured": true, 00:28:15.626 "data_offset": 2048, 00:28:15.626 "data_size": 63488 00:28:15.626 }, 00:28:15.626 { 00:28:15.626 "name": "BaseBdev4", 00:28:15.626 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:15.626 "is_configured": true, 00:28:15.626 "data_offset": 2048, 00:28:15.626 "data_size": 63488 00:28:15.626 } 00:28:15.626 ] 00:28:15.626 }' 00:28:15.626 06:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.626 06:44:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:16.194 06:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:16.488 [2024-07-25 06:44:29.840111] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:16.488 [2024-07-25 06:44:29.840277] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:28:16.488 [2024-07-25 06:44:29.840295] 
bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:16.488 [2024-07-25 06:44:29.840324] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:16.488 [2024-07-25 06:44:29.844447] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d8530 00:28:16.488 [2024-07-25 06:44:29.845690] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:16.488 06:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.423 06:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.681 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.681 "name": "raid_bdev1", 00:28:17.681 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:17.681 "strip_size_kb": 0, 00:28:17.681 "state": "online", 00:28:17.681 "raid_level": "raid1", 00:28:17.681 "superblock": true, 00:28:17.681 "num_base_bdevs": 4, 00:28:17.681 "num_base_bdevs_discovered": 3, 00:28:17.681 "num_base_bdevs_operational": 3, 00:28:17.681 "process": { 00:28:17.681 "type": "rebuild", 00:28:17.681 "target": "spare", 00:28:17.681 "progress": { 00:28:17.681 "blocks": 24576, 00:28:17.681 "percent": 38 00:28:17.681 } 00:28:17.681 }, 00:28:17.681 "base_bdevs_list": [ 00:28:17.681 { 00:28:17.681 "name": "spare", 00:28:17.681 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:17.681 "is_configured": true, 00:28:17.681 "data_offset": 2048, 00:28:17.681 "data_size": 63488 00:28:17.681 }, 00:28:17.681 { 00:28:17.681 "name": null, 00:28:17.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.681 "is_configured": false, 00:28:17.681 "data_offset": 2048, 00:28:17.681 "data_size": 63488 00:28:17.681 }, 00:28:17.681 { 00:28:17.681 "name": "BaseBdev3", 00:28:17.681 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:17.681 "is_configured": true, 00:28:17.681 "data_offset": 2048, 00:28:17.681 "data_size": 63488 00:28:17.681 }, 00:28:17.681 { 00:28:17.681 "name": "BaseBdev4", 00:28:17.681 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:17.681 "is_configured": true, 00:28:17.681 "data_offset": 2048, 00:28:17.681 "data_size": 63488 00:28:17.681 } 00:28:17.681 ] 00:28:17.681 }' 00:28:17.681 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.681 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:17.681 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.681 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ 
spare == \s\p\a\r\e ]] 00:28:17.681 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:17.939 [2024-07-25 06:44:31.400192] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.939 [2024-07-25 06:44:31.457565] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:17.939 [2024-07-25 06:44:31.457611] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:17.939 [2024-07-25 06:44:31.457626] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.939 [2024-07-25 06:44:31.457634] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:17.939 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.940 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.940 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.940 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.940 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.940 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:18.198 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:18.198 "name": "raid_bdev1", 00:28:18.198 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:18.198 "strip_size_kb": 0, 00:28:18.198 "state": "online", 00:28:18.198 "raid_level": "raid1", 00:28:18.198 "superblock": true, 00:28:18.198 "num_base_bdevs": 4, 00:28:18.198 "num_base_bdevs_discovered": 2, 00:28:18.198 "num_base_bdevs_operational": 2, 00:28:18.198 "base_bdevs_list": [ 00:28:18.198 { 00:28:18.198 "name": null, 00:28:18.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:18.198 "is_configured": false, 00:28:18.198 "data_offset": 2048, 00:28:18.198 "data_size": 63488 00:28:18.198 }, 00:28:18.198 { 00:28:18.198 "name": null, 00:28:18.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:18.198 "is_configured": false, 00:28:18.198 "data_offset": 2048, 00:28:18.198 "data_size": 63488 00:28:18.198 }, 00:28:18.198 { 00:28:18.198 "name": "BaseBdev3", 00:28:18.198 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:18.198 "is_configured": true, 00:28:18.198 "data_offset": 2048, 00:28:18.198 "data_size": 63488 00:28:18.198 }, 00:28:18.198 { 00:28:18.198 "name": "BaseBdev4", 00:28:18.198 "uuid": 
"9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:18.198 "is_configured": true, 00:28:18.198 "data_offset": 2048, 00:28:18.198 "data_size": 63488 00:28:18.198 } 00:28:18.198 ] 00:28:18.198 }' 00:28:18.198 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:18.198 06:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:18.766 06:44:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:19.025 [2024-07-25 06:44:32.512518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:19.025 [2024-07-25 06:44:32.512567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:19.025 [2024-07-25 06:44:32.512593] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2308a90 00:28:19.025 [2024-07-25 06:44:32.512605] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:19.025 [2024-07-25 06:44:32.512958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:19.025 [2024-07-25 06:44:32.512975] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:19.025 [2024-07-25 06:44:32.513053] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:19.025 [2024-07-25 06:44:32.513064] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:28:19.025 [2024-07-25 06:44:32.513073] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:19.025 [2024-07-25 06:44:32.513091] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:19.025 [2024-07-25 06:44:32.517255] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d8530 00:28:19.025 spare 00:28:19.025 [2024-07-25 06:44:32.518666] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:19.025 06:44:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.402 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:20.402 "name": "raid_bdev1", 00:28:20.402 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:20.402 "strip_size_kb": 0, 00:28:20.402 "state": "online", 00:28:20.402 "raid_level": "raid1", 00:28:20.402 "superblock": true, 00:28:20.403 "num_base_bdevs": 4, 00:28:20.403 "num_base_bdevs_discovered": 3, 00:28:20.403 "num_base_bdevs_operational": 3, 00:28:20.403 "process": { 00:28:20.403 "type": "rebuild", 00:28:20.403 "target": "spare", 00:28:20.403 "progress": { 00:28:20.403 "blocks": 24576, 00:28:20.403 "percent": 38 00:28:20.403 } 00:28:20.403 }, 00:28:20.403 "base_bdevs_list": [ 00:28:20.403 { 00:28:20.403 "name": "spare", 00:28:20.403 "uuid": "71561407-6a10-54a4-8231-8d12fd1aa18d", 00:28:20.403 "is_configured": true, 00:28:20.403 "data_offset": 2048, 00:28:20.403 "data_size": 63488 00:28:20.403 }, 00:28:20.403 { 00:28:20.403 "name": null, 00:28:20.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.403 "is_configured": false, 00:28:20.403 "data_offset": 2048, 00:28:20.403 "data_size": 63488 00:28:20.403 }, 00:28:20.403 { 00:28:20.403 "name": "BaseBdev3", 00:28:20.403 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:20.403 "is_configured": true, 00:28:20.403 "data_offset": 2048, 00:28:20.403 "data_size": 63488 00:28:20.403 }, 00:28:20.403 { 00:28:20.403 "name": "BaseBdev4", 00:28:20.403 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:20.403 "is_configured": true, 00:28:20.403 "data_offset": 2048, 00:28:20.403 "data_size": 63488 00:28:20.403 } 00:28:20.403 ] 00:28:20.403 }' 00:28:20.403 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:20.403 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:20.403 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:20.403 06:44:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:20.403 06:44:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:20.662 [2024-07-25 06:44:34.067806] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:20.662 [2024-07-25 06:44:34.130521] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:20.662 [2024-07-25 06:44:34.130566] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.662 [2024-07-25 06:44:34.130581] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:20.662 [2024-07-25 06:44:34.130588] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.662 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.921 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.921 "name": "raid_bdev1", 00:28:20.921 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:20.921 "strip_size_kb": 0, 00:28:20.921 "state": "online", 00:28:20.921 "raid_level": "raid1", 00:28:20.921 "superblock": true, 00:28:20.921 "num_base_bdevs": 4, 00:28:20.921 "num_base_bdevs_discovered": 2, 00:28:20.921 "num_base_bdevs_operational": 2, 00:28:20.921 "base_bdevs_list": [ 00:28:20.921 { 00:28:20.921 "name": null, 00:28:20.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.921 "is_configured": false, 00:28:20.921 "data_offset": 2048, 00:28:20.921 "data_size": 63488 00:28:20.921 }, 00:28:20.921 { 00:28:20.921 "name": null, 00:28:20.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.921 "is_configured": false, 00:28:20.921 "data_offset": 2048, 00:28:20.921 "data_size": 63488 00:28:20.921 }, 00:28:20.921 { 00:28:20.921 "name": "BaseBdev3", 00:28:20.921 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:20.921 "is_configured": true, 00:28:20.921 "data_offset": 2048, 00:28:20.921 "data_size": 63488 00:28:20.921 }, 00:28:20.921 { 00:28:20.921 "name": "BaseBdev4", 00:28:20.921 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 
00:28:20.921 "is_configured": true, 00:28:20.921 "data_offset": 2048, 00:28:20.921 "data_size": 63488 00:28:20.921 } 00:28:20.921 ] 00:28:20.921 }' 00:28:20.921 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.921 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.489 06:44:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.748 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.748 "name": "raid_bdev1", 00:28:21.748 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:21.748 "strip_size_kb": 0, 00:28:21.748 "state": "online", 00:28:21.748 "raid_level": "raid1", 00:28:21.748 "superblock": true, 00:28:21.748 "num_base_bdevs": 4, 00:28:21.748 "num_base_bdevs_discovered": 2, 00:28:21.748 "num_base_bdevs_operational": 2, 00:28:21.748 "base_bdevs_list": [ 00:28:21.748 { 00:28:21.748 "name": null, 00:28:21.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.748 "is_configured": false, 00:28:21.748 "data_offset": 2048, 00:28:21.748 "data_size": 63488 00:28:21.748 }, 00:28:21.748 { 00:28:21.748 "name": null, 00:28:21.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.748 "is_configured": false, 00:28:21.748 "data_offset": 2048, 00:28:21.748 "data_size": 63488 00:28:21.748 }, 00:28:21.748 { 00:28:21.748 "name": "BaseBdev3", 00:28:21.748 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:21.748 "is_configured": true, 00:28:21.748 "data_offset": 2048, 00:28:21.748 "data_size": 63488 00:28:21.748 }, 00:28:21.748 { 00:28:21.748 "name": "BaseBdev4", 00:28:21.748 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:21.748 "is_configured": true, 00:28:21.748 "data_offset": 2048, 00:28:21.748 "data_size": 63488 00:28:21.748 } 00:28:21.748 ] 00:28:21.748 }' 00:28:21.748 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.748 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:21.748 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:22.007 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:22.008 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:22.008 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc 
-p BaseBdev1 00:28:22.267 [2024-07-25 06:44:35.731086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:22.267 [2024-07-25 06:44:35.731134] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:22.267 [2024-07-25 06:44:35.731162] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d8180 00:28:22.267 [2024-07-25 06:44:35.731174] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:22.267 [2024-07-25 06:44:35.731502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:22.267 [2024-07-25 06:44:35.731518] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:22.267 [2024-07-25 06:44:35.731580] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:22.267 [2024-07-25 06:44:35.731591] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:28:22.267 [2024-07-25 06:44:35.731601] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:22.267 BaseBdev1 00:28:22.267 06:44:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.203 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.463 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.463 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.463 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.463 "name": "raid_bdev1", 00:28:23.463 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:23.463 "strip_size_kb": 0, 00:28:23.463 "state": "online", 00:28:23.463 "raid_level": "raid1", 00:28:23.463 "superblock": true, 00:28:23.463 "num_base_bdevs": 4, 00:28:23.463 "num_base_bdevs_discovered": 2, 00:28:23.463 "num_base_bdevs_operational": 2, 00:28:23.463 "base_bdevs_list": [ 00:28:23.463 { 00:28:23.463 "name": null, 00:28:23.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.463 "is_configured": false, 00:28:23.463 "data_offset": 2048, 00:28:23.463 "data_size": 63488 00:28:23.463 }, 00:28:23.463 { 00:28:23.463 "name": null, 00:28:23.463 
"uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.463 "is_configured": false, 00:28:23.463 "data_offset": 2048, 00:28:23.463 "data_size": 63488 00:28:23.463 }, 00:28:23.463 { 00:28:23.463 "name": "BaseBdev3", 00:28:23.463 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:23.463 "is_configured": true, 00:28:23.463 "data_offset": 2048, 00:28:23.463 "data_size": 63488 00:28:23.463 }, 00:28:23.463 { 00:28:23.463 "name": "BaseBdev4", 00:28:23.463 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:23.463 "is_configured": true, 00:28:23.463 "data_offset": 2048, 00:28:23.463 "data_size": 63488 00:28:23.463 } 00:28:23.463 ] 00:28:23.463 }' 00:28:23.463 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.463 06:44:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.031 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:24.290 "name": "raid_bdev1", 00:28:24.290 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:24.290 "strip_size_kb": 0, 00:28:24.290 "state": "online", 00:28:24.290 "raid_level": "raid1", 00:28:24.290 "superblock": true, 00:28:24.290 "num_base_bdevs": 4, 00:28:24.290 "num_base_bdevs_discovered": 2, 00:28:24.290 "num_base_bdevs_operational": 2, 00:28:24.290 "base_bdevs_list": [ 00:28:24.290 { 00:28:24.290 "name": null, 00:28:24.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.290 "is_configured": false, 00:28:24.290 "data_offset": 2048, 00:28:24.290 "data_size": 63488 00:28:24.290 }, 00:28:24.290 { 00:28:24.290 "name": null, 00:28:24.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.290 "is_configured": false, 00:28:24.290 "data_offset": 2048, 00:28:24.290 "data_size": 63488 00:28:24.290 }, 00:28:24.290 { 00:28:24.290 "name": "BaseBdev3", 00:28:24.290 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:24.290 "is_configured": true, 00:28:24.290 "data_offset": 2048, 00:28:24.290 "data_size": 63488 00:28:24.290 }, 00:28:24.290 { 00:28:24.290 "name": "BaseBdev4", 00:28:24.290 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:24.290 "is_configured": true, 00:28:24.290 "data_offset": 2048, 00:28:24.290 "data_size": 63488 00:28:24.290 } 00:28:24.290 ] 00:28:24.290 }' 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:24.290 06:44:37 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.290 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:24.291 06:44:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:24.550 [2024-07-25 06:44:38.045506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:24.550 [2024-07-25 06:44:38.045633] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:28:24.550 [2024-07-25 06:44:38.045648] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:24.550 request: 00:28:24.550 { 00:28:24.550 "base_bdev": "BaseBdev1", 00:28:24.550 "raid_bdev": "raid_bdev1", 00:28:24.550 "method": "bdev_raid_add_base_bdev", 00:28:24.550 "req_id": 1 00:28:24.550 } 00:28:24.550 Got JSON-RPC error response 00:28:24.550 response: 00:28:24.550 { 00:28:24.550 "code": -22, 00:28:24.550 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:24.550 } 00:28:24.550 06:44:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:28:24.550 06:44:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:28:24.550 06:44:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:28:24.550 06:44:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:28:24.550 06:44:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.927 "name": "raid_bdev1", 00:28:25.927 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:25.927 "strip_size_kb": 0, 00:28:25.927 "state": "online", 00:28:25.927 "raid_level": "raid1", 00:28:25.927 "superblock": true, 00:28:25.927 "num_base_bdevs": 4, 00:28:25.927 "num_base_bdevs_discovered": 2, 00:28:25.927 "num_base_bdevs_operational": 2, 00:28:25.927 "base_bdevs_list": [ 00:28:25.927 { 00:28:25.927 "name": null, 00:28:25.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.927 "is_configured": false, 00:28:25.927 "data_offset": 2048, 00:28:25.927 "data_size": 63488 00:28:25.927 }, 00:28:25.927 { 00:28:25.927 "name": null, 00:28:25.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.927 "is_configured": false, 00:28:25.927 "data_offset": 2048, 00:28:25.927 "data_size": 63488 00:28:25.927 }, 00:28:25.927 { 00:28:25.927 "name": "BaseBdev3", 00:28:25.927 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:25.927 "is_configured": true, 00:28:25.927 "data_offset": 2048, 00:28:25.927 "data_size": 63488 00:28:25.927 }, 00:28:25.927 { 00:28:25.927 "name": "BaseBdev4", 00:28:25.927 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:25.927 "is_configured": true, 00:28:25.927 "data_offset": 2048, 00:28:25.927 "data_size": 63488 00:28:25.927 } 00:28:25.927 ] 00:28:25.927 }' 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.927 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.495 06:44:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.496 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.496 "name": "raid_bdev1", 00:28:26.496 "uuid": "e1119d93-343d-4108-8fc9-26c671db52d9", 00:28:26.496 "strip_size_kb": 0, 00:28:26.496 "state": "online", 00:28:26.496 "raid_level": "raid1", 00:28:26.496 "superblock": true, 00:28:26.496 "num_base_bdevs": 4, 00:28:26.496 "num_base_bdevs_discovered": 2, 00:28:26.496 "num_base_bdevs_operational": 2, 00:28:26.496 "base_bdevs_list": [ 00:28:26.496 { 00:28:26.496 "name": null, 00:28:26.496 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.496 "is_configured": false, 00:28:26.496 "data_offset": 2048, 00:28:26.496 "data_size": 63488 00:28:26.496 }, 00:28:26.496 { 00:28:26.496 "name": null, 00:28:26.496 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.496 "is_configured": false, 00:28:26.496 "data_offset": 2048, 00:28:26.496 "data_size": 63488 00:28:26.496 }, 00:28:26.496 { 00:28:26.496 "name": "BaseBdev3", 00:28:26.496 "uuid": "515c9443-c5ed-5a21-9c80-800328b9ef8c", 00:28:26.496 "is_configured": true, 00:28:26.496 "data_offset": 2048, 00:28:26.496 "data_size": 63488 00:28:26.496 }, 00:28:26.496 { 00:28:26.496 "name": "BaseBdev4", 00:28:26.496 "uuid": "9e1abc15-fa70-597b-a5b2-ae6c79b70580", 00:28:26.496 "is_configured": true, 00:28:26.496 "data_offset": 2048, 00:28:26.496 "data_size": 63488 00:28:26.496 } 00:28:26.496 ] 00:28:26.496 }' 00:28:26.496 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1253892 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1253892 ']' 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1253892 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1253892 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1253892' 00:28:26.755 killing process with pid 1253892 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1253892 00:28:26.755 Received shutdown signal, test time was about 26.277426 seconds 00:28:26.755 00:28:26.755 Latency(us) 00:28:26.755 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:26.755 =================================================================================================================== 00:28:26.755 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:26.755 [2024-07-25 06:44:40.175470] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:26.755 [2024-07-25 06:44:40.175574] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:26.755 [2024-07-25 06:44:40.175629] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:26.755 [2024-07-25 06:44:40.175640] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2309b70 name raid_bdev1, state offline 00:28:26.755 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1253892 00:28:26.755 [2024-07-25 06:44:40.210077] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:27.014 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:28:27.014 00:28:27.014 real 0m31.753s 00:28:27.014 user 0m49.722s 00:28:27.014 sys 0m5.115s 00:28:27.014 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:27.014 06:44:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:27.014 ************************************ 00:28:27.014 END TEST raid_rebuild_test_sb_io 00:28:27.014 ************************************ 00:28:27.014 06:44:40 bdev_raid -- bdev/bdev_raid.sh@964 -- # '[' n == y ']' 00:28:27.014 06:44:40 bdev_raid -- bdev/bdev_raid.sh@976 -- # base_blocklen=4096 00:28:27.014 06:44:40 bdev_raid -- bdev/bdev_raid.sh@978 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:28:27.014 06:44:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:28:27.014 06:44:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:27.014 06:44:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:27.014 ************************************ 00:28:27.014 START TEST raid_state_function_test_sb_4k 00:28:27.014 ************************************ 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:27.014 06:44:40 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1259559 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1259559' 00:28:27.014 Process raid pid: 1259559 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1259559 /var/tmp/spdk-raid.sock 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1259559 ']' 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:27.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:27.014 06:44:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:27.014 [2024-07-25 06:44:40.550132] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
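[Editorial sketch] The trace above shows the standard setup for these raid unit tests: a bare bdev_svc application is launched with a private RPC socket and raid debug logging, and the harness blocks until that socket accepts connections. A minimal sketch of the pattern, reusing only paths and helpers visible in the log (waitforlisten is the autotest_common.sh helper traced here; the final bdev_get_bdevs call is an illustrative smoke check, not part of the original script):

    SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc_sock=/var/tmp/spdk-raid.sock

    # Start the bare bdev service with raid debug logging enabled on core 0.
    "$SPDK_ROOT/test/app/bdev_svc/bdev_svc" -r "$rpc_sock" -i 0 -L bdev_raid &
    raid_pid=$!

    # Block until the UNIX-domain RPC socket is up, then talk to it.
    waitforlisten "$raid_pid" "$rpc_sock"
    "$SPDK_ROOT/scripts/rpc.py" -s "$rpc_sock" bdev_get_bdevs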
00:28:27.014 [2024-07-25 06:44:40.550196] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.0 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.1 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.2 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.3 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.4 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.5 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.6 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:01.7 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.0 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.1 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.2 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.3 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.4 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.5 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.6 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3d:02.7 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.0 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.1 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.2 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.3 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.4 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.5 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.6 cannot be used 00:28:27.273 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:01.7 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.0 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.1 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.2 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.3 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.4 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.5 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.6 cannot be used 00:28:27.273 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:27.273 EAL: Requested device 0000:3f:02.7 cannot be used 00:28:27.273 [2024-07-25 06:44:40.687532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:27.274 [2024-07-25 06:44:40.731784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:27.274 [2024-07-25 06:44:40.790174] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:27.274 [2024-07-25 06:44:40.790199] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:27.838 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:27.838 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:28:27.838 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:28.097 [2024-07-25 06:44:41.573812] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:28.097 [2024-07-25 06:44:41.573845] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:28.097 [2024-07-25 06:44:41.573855] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:28.097 [2024-07-25 06:44:41.573866] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.097 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:28.355 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.355 "name": "Existed_Raid", 00:28:28.355 "uuid": "f829f919-046c-4d29-9679-f415de26ffd6", 00:28:28.355 "strip_size_kb": 0, 00:28:28.355 "state": "configuring", 00:28:28.355 "raid_level": "raid1", 00:28:28.355 "superblock": true, 00:28:28.355 "num_base_bdevs": 2, 00:28:28.355 "num_base_bdevs_discovered": 0, 00:28:28.355 "num_base_bdevs_operational": 2, 00:28:28.355 "base_bdevs_list": [ 00:28:28.355 { 00:28:28.355 "name": "BaseBdev1", 00:28:28.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.355 "is_configured": false, 00:28:28.355 "data_offset": 0, 00:28:28.355 "data_size": 0 00:28:28.355 }, 00:28:28.355 { 00:28:28.355 "name": "BaseBdev2", 00:28:28.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.355 "is_configured": false, 00:28:28.355 "data_offset": 0, 00:28:28.355 "data_size": 0 00:28:28.355 } 00:28:28.355 ] 00:28:28.355 }' 00:28:28.355 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.355 06:44:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:28.921 06:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:29.179 [2024-07-25 06:44:42.560277] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:29.179 [2024-07-25 06:44:42.560303] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c9470 name Existed_Raid, state configuring 00:28:29.179 06:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:29.438 [2024-07-25 06:44:42.788886] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:29.438 [2024-07-25 06:44:42.788914] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:29.438 [2024-07-25 06:44:42.788923] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:29.438 [2024-07-25 06:44:42.788934] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:29.438 06:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:28:29.696 [2024-07-25 06:44:43.022876] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:29.696 BaseBdev1 00:28:29.696 
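[Editorial sketch] The RPC sequence this state-function test drives is small; a hedged sketch of it using only commands that appear verbatim in the trace (the $rpc shorthand is illustrative):

    rpc='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'

    # Each base bdev is a 32 MiB malloc bdev with a 4096-byte block size.
    $rpc bdev_malloc_create 32 4096 -b BaseBdev1
    $rpc bdev_malloc_create 32 4096 -b BaseBdev2

    # Assemble a superblock-enabled (-s) RAID1 volume over the two base bdevs.
    $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

    # Wait for bdev examination to settle before asserting on raid state.
    $rpc bdev_wait_for_examine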
06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:29.696 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:28:29.696 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:29.696 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:28:29.696 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:29.696 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:29.696 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:29.954 [ 00:28:29.954 { 00:28:29.954 "name": "BaseBdev1", 00:28:29.954 "aliases": [ 00:28:29.954 "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d" 00:28:29.954 ], 00:28:29.954 "product_name": "Malloc disk", 00:28:29.954 "block_size": 4096, 00:28:29.954 "num_blocks": 8192, 00:28:29.954 "uuid": "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d", 00:28:29.954 "assigned_rate_limits": { 00:28:29.954 "rw_ios_per_sec": 0, 00:28:29.954 "rw_mbytes_per_sec": 0, 00:28:29.954 "r_mbytes_per_sec": 0, 00:28:29.954 "w_mbytes_per_sec": 0 00:28:29.954 }, 00:28:29.954 "claimed": true, 00:28:29.954 "claim_type": "exclusive_write", 00:28:29.954 "zoned": false, 00:28:29.954 "supported_io_types": { 00:28:29.954 "read": true, 00:28:29.954 "write": true, 00:28:29.954 "unmap": true, 00:28:29.954 "flush": true, 00:28:29.954 "reset": true, 00:28:29.954 "nvme_admin": false, 00:28:29.954 "nvme_io": false, 00:28:29.954 "nvme_io_md": false, 00:28:29.954 "write_zeroes": true, 00:28:29.954 "zcopy": true, 00:28:29.954 "get_zone_info": false, 00:28:29.954 "zone_management": false, 00:28:29.954 "zone_append": false, 00:28:29.954 "compare": false, 00:28:29.954 "compare_and_write": false, 00:28:29.954 "abort": true, 00:28:29.954 "seek_hole": false, 00:28:29.954 "seek_data": false, 00:28:29.954 "copy": true, 00:28:29.954 "nvme_iov_md": false 00:28:29.954 }, 00:28:29.954 "memory_domains": [ 00:28:29.954 { 00:28:29.954 "dma_device_id": "system", 00:28:29.954 "dma_device_type": 1 00:28:29.954 }, 00:28:29.954 { 00:28:29.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:29.954 "dma_device_type": 2 00:28:29.954 } 00:28:29.954 ], 00:28:29.954 "driver_specific": {} 00:28:29.954 } 00:28:29.954 ] 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:29.954 
06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.954 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:30.245 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:30.245 "name": "Existed_Raid", 00:28:30.245 "uuid": "836bb18a-9330-478f-ba3d-0f4c936e5cc0", 00:28:30.245 "strip_size_kb": 0, 00:28:30.245 "state": "configuring", 00:28:30.245 "raid_level": "raid1", 00:28:30.245 "superblock": true, 00:28:30.245 "num_base_bdevs": 2, 00:28:30.245 "num_base_bdevs_discovered": 1, 00:28:30.245 "num_base_bdevs_operational": 2, 00:28:30.245 "base_bdevs_list": [ 00:28:30.245 { 00:28:30.245 "name": "BaseBdev1", 00:28:30.245 "uuid": "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d", 00:28:30.245 "is_configured": true, 00:28:30.245 "data_offset": 256, 00:28:30.245 "data_size": 7936 00:28:30.245 }, 00:28:30.245 { 00:28:30.245 "name": "BaseBdev2", 00:28:30.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:30.245 "is_configured": false, 00:28:30.245 "data_offset": 0, 00:28:30.245 "data_size": 0 00:28:30.245 } 00:28:30.245 ] 00:28:30.245 }' 00:28:30.245 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.245 06:44:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:30.822 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:31.080 [2024-07-25 06:44:44.470871] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:31.080 [2024-07-25 06:44:44.470907] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c8ce0 name Existed_Raid, state configuring 00:28:31.080 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:31.337 [2024-07-25 06:44:44.695504] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:31.337 [2024-07-25 06:44:44.696876] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:31.337 [2024-07-25 06:44:44.696904] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:31.337 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:31.337 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:31.337 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- 
# verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:31.337 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.338 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:31.596 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.596 "name": "Existed_Raid", 00:28:31.596 "uuid": "a4d2bf17-197f-4015-9fca-b9e4a40de985", 00:28:31.596 "strip_size_kb": 0, 00:28:31.596 "state": "configuring", 00:28:31.596 "raid_level": "raid1", 00:28:31.596 "superblock": true, 00:28:31.596 "num_base_bdevs": 2, 00:28:31.596 "num_base_bdevs_discovered": 1, 00:28:31.596 "num_base_bdevs_operational": 2, 00:28:31.596 "base_bdevs_list": [ 00:28:31.596 { 00:28:31.596 "name": "BaseBdev1", 00:28:31.596 "uuid": "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d", 00:28:31.596 "is_configured": true, 00:28:31.596 "data_offset": 256, 00:28:31.596 "data_size": 7936 00:28:31.596 }, 00:28:31.596 { 00:28:31.596 "name": "BaseBdev2", 00:28:31.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.596 "is_configured": false, 00:28:31.596 "data_offset": 0, 00:28:31.596 "data_size": 0 00:28:31.596 } 00:28:31.596 ] 00:28:31.596 }' 00:28:31.596 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.596 06:44:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:32.162 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:28:32.162 [2024-07-25 06:44:45.705391] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:32.162 [2024-07-25 06:44:45.705523] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x147c120 00:28:32.162 [2024-07-25 06:44:45.705535] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:32.162 [2024-07-25 06:44:45.705695] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ca050 00:28:32.162 [2024-07-25 06:44:45.705808] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x147c120 00:28:32.163 [2024-07-25 06:44:45.705817] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x147c120 00:28:32.163 [2024-07-25 06:44:45.705899] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:32.163 BaseBdev2 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:32.421 06:44:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:32.680 [ 00:28:32.680 { 00:28:32.680 "name": "BaseBdev2", 00:28:32.680 "aliases": [ 00:28:32.680 "8e3be8bd-dc84-4080-a2cd-c5e1fc385281" 00:28:32.680 ], 00:28:32.680 "product_name": "Malloc disk", 00:28:32.680 "block_size": 4096, 00:28:32.680 "num_blocks": 8192, 00:28:32.680 "uuid": "8e3be8bd-dc84-4080-a2cd-c5e1fc385281", 00:28:32.680 "assigned_rate_limits": { 00:28:32.680 "rw_ios_per_sec": 0, 00:28:32.680 "rw_mbytes_per_sec": 0, 00:28:32.680 "r_mbytes_per_sec": 0, 00:28:32.680 "w_mbytes_per_sec": 0 00:28:32.680 }, 00:28:32.680 "claimed": true, 00:28:32.680 "claim_type": "exclusive_write", 00:28:32.680 "zoned": false, 00:28:32.680 "supported_io_types": { 00:28:32.680 "read": true, 00:28:32.680 "write": true, 00:28:32.680 "unmap": true, 00:28:32.680 "flush": true, 00:28:32.680 "reset": true, 00:28:32.680 "nvme_admin": false, 00:28:32.680 "nvme_io": false, 00:28:32.680 "nvme_io_md": false, 00:28:32.680 "write_zeroes": true, 00:28:32.680 "zcopy": true, 00:28:32.680 "get_zone_info": false, 00:28:32.680 "zone_management": false, 00:28:32.680 "zone_append": false, 00:28:32.680 "compare": false, 00:28:32.680 "compare_and_write": false, 00:28:32.680 "abort": true, 00:28:32.680 "seek_hole": false, 00:28:32.680 "seek_data": false, 00:28:32.680 "copy": true, 00:28:32.680 "nvme_iov_md": false 00:28:32.680 }, 00:28:32.680 "memory_domains": [ 00:28:32.680 { 00:28:32.680 "dma_device_id": "system", 00:28:32.680 "dma_device_type": 1 00:28:32.680 }, 00:28:32.680 { 00:28:32.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:32.680 "dma_device_type": 2 00:28:32.680 } 00:28:32.680 ], 00:28:32.680 "driver_specific": {} 00:28:32.680 } 00:28:32.680 ] 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:32.680 06:44:46 
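[Editorial sketch] The verify_raid_bdev_state checks that follow are all built from one bdev_raid_get_bdevs call filtered through jq; a rough reconstruction under that assumption, with illustrative variable names:

    rpc='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'

    # Pull the entry for this raid bdev out of the full listing.
    raid_bdev_info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')

    # Compare the fields the test asserts on: state, level, discovered base bdevs.
    [[ $(echo "$raid_bdev_info" | jq -r '.state') == online ]]
    [[ $(echo "$raid_bdev_info" | jq -r '.raid_level') == raid1 ]]
    [[ $(echo "$raid_bdev_info" | jq -r '.num_base_bdevs_discovered') -eq 2 ]]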
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.680 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:32.939 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:32.939 "name": "Existed_Raid", 00:28:32.939 "uuid": "a4d2bf17-197f-4015-9fca-b9e4a40de985", 00:28:32.939 "strip_size_kb": 0, 00:28:32.939 "state": "online", 00:28:32.939 "raid_level": "raid1", 00:28:32.939 "superblock": true, 00:28:32.939 "num_base_bdevs": 2, 00:28:32.939 "num_base_bdevs_discovered": 2, 00:28:32.939 "num_base_bdevs_operational": 2, 00:28:32.939 "base_bdevs_list": [ 00:28:32.939 { 00:28:32.939 "name": "BaseBdev1", 00:28:32.939 "uuid": "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d", 00:28:32.939 "is_configured": true, 00:28:32.939 "data_offset": 256, 00:28:32.939 "data_size": 7936 00:28:32.939 }, 00:28:32.939 { 00:28:32.939 "name": "BaseBdev2", 00:28:32.939 "uuid": "8e3be8bd-dc84-4080-a2cd-c5e1fc385281", 00:28:32.939 "is_configured": true, 00:28:32.939 "data_offset": 256, 00:28:32.939 "data_size": 7936 00:28:32.939 } 00:28:32.939 ] 00:28:32.939 }' 00:28:32.939 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:32.939 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:33.507 06:44:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:33.766 [2024-07-25 06:44:47.141550] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:33.766 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:33.766 "name": "Existed_Raid", 00:28:33.766 "aliases": [ 00:28:33.766 "a4d2bf17-197f-4015-9fca-b9e4a40de985" 00:28:33.766 ], 00:28:33.766 "product_name": "Raid Volume", 00:28:33.766 "block_size": 4096, 00:28:33.766 "num_blocks": 7936, 00:28:33.766 "uuid": "a4d2bf17-197f-4015-9fca-b9e4a40de985", 00:28:33.766 "assigned_rate_limits": { 00:28:33.766 "rw_ios_per_sec": 0, 00:28:33.766 "rw_mbytes_per_sec": 0, 00:28:33.766 "r_mbytes_per_sec": 0, 00:28:33.766 "w_mbytes_per_sec": 0 00:28:33.766 }, 00:28:33.766 "claimed": false, 00:28:33.766 "zoned": false, 00:28:33.766 "supported_io_types": { 00:28:33.766 "read": true, 00:28:33.766 "write": true, 00:28:33.766 "unmap": false, 00:28:33.766 "flush": false, 00:28:33.766 "reset": true, 00:28:33.766 "nvme_admin": false, 00:28:33.766 "nvme_io": false, 00:28:33.766 "nvme_io_md": false, 00:28:33.766 "write_zeroes": true, 00:28:33.766 "zcopy": false, 00:28:33.766 "get_zone_info": false, 00:28:33.766 "zone_management": false, 00:28:33.766 "zone_append": false, 00:28:33.766 "compare": false, 00:28:33.766 "compare_and_write": false, 00:28:33.766 "abort": false, 00:28:33.766 "seek_hole": false, 00:28:33.766 "seek_data": false, 00:28:33.766 "copy": false, 00:28:33.766 "nvme_iov_md": false 00:28:33.766 }, 00:28:33.766 "memory_domains": [ 00:28:33.766 { 00:28:33.766 "dma_device_id": "system", 00:28:33.766 "dma_device_type": 1 00:28:33.766 }, 00:28:33.766 { 00:28:33.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:33.766 "dma_device_type": 2 00:28:33.766 }, 00:28:33.766 { 00:28:33.766 "dma_device_id": "system", 00:28:33.766 "dma_device_type": 1 00:28:33.766 }, 00:28:33.766 { 00:28:33.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:33.766 "dma_device_type": 2 00:28:33.766 } 00:28:33.766 ], 00:28:33.766 "driver_specific": { 00:28:33.766 "raid": { 00:28:33.766 "uuid": "a4d2bf17-197f-4015-9fca-b9e4a40de985", 00:28:33.766 "strip_size_kb": 0, 00:28:33.766 "state": "online", 00:28:33.766 "raid_level": "raid1", 00:28:33.766 "superblock": true, 00:28:33.766 "num_base_bdevs": 2, 00:28:33.766 "num_base_bdevs_discovered": 2, 00:28:33.766 "num_base_bdevs_operational": 2, 00:28:33.766 "base_bdevs_list": [ 00:28:33.766 { 00:28:33.766 "name": "BaseBdev1", 00:28:33.766 "uuid": "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d", 00:28:33.766 "is_configured": true, 00:28:33.766 "data_offset": 256, 00:28:33.766 "data_size": 7936 00:28:33.766 }, 00:28:33.766 { 00:28:33.766 "name": "BaseBdev2", 00:28:33.766 "uuid": "8e3be8bd-dc84-4080-a2cd-c5e1fc385281", 00:28:33.766 "is_configured": true, 00:28:33.766 "data_offset": 256, 00:28:33.766 "data_size": 7936 00:28:33.766 } 00:28:33.766 ] 00:28:33.766 } 00:28:33.766 } 00:28:33.766 }' 00:28:33.766 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:33.766 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:33.766 BaseBdev2' 00:28:33.766 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:33.766 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:33.766 06:44:47 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:34.025 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:34.025 "name": "BaseBdev1", 00:28:34.025 "aliases": [ 00:28:34.025 "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d" 00:28:34.025 ], 00:28:34.025 "product_name": "Malloc disk", 00:28:34.025 "block_size": 4096, 00:28:34.025 "num_blocks": 8192, 00:28:34.025 "uuid": "61cb2e7f-7c0e-44b6-a638-aa9eeb89033d", 00:28:34.025 "assigned_rate_limits": { 00:28:34.025 "rw_ios_per_sec": 0, 00:28:34.025 "rw_mbytes_per_sec": 0, 00:28:34.025 "r_mbytes_per_sec": 0, 00:28:34.025 "w_mbytes_per_sec": 0 00:28:34.025 }, 00:28:34.025 "claimed": true, 00:28:34.025 "claim_type": "exclusive_write", 00:28:34.025 "zoned": false, 00:28:34.025 "supported_io_types": { 00:28:34.025 "read": true, 00:28:34.025 "write": true, 00:28:34.025 "unmap": true, 00:28:34.025 "flush": true, 00:28:34.025 "reset": true, 00:28:34.025 "nvme_admin": false, 00:28:34.025 "nvme_io": false, 00:28:34.025 "nvme_io_md": false, 00:28:34.025 "write_zeroes": true, 00:28:34.025 "zcopy": true, 00:28:34.025 "get_zone_info": false, 00:28:34.025 "zone_management": false, 00:28:34.025 "zone_append": false, 00:28:34.025 "compare": false, 00:28:34.025 "compare_and_write": false, 00:28:34.025 "abort": true, 00:28:34.025 "seek_hole": false, 00:28:34.025 "seek_data": false, 00:28:34.025 "copy": true, 00:28:34.025 "nvme_iov_md": false 00:28:34.025 }, 00:28:34.025 "memory_domains": [ 00:28:34.025 { 00:28:34.025 "dma_device_id": "system", 00:28:34.025 "dma_device_type": 1 00:28:34.025 }, 00:28:34.025 { 00:28:34.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.025 "dma_device_type": 2 00:28:34.025 } 00:28:34.025 ], 00:28:34.025 "driver_specific": {} 00:28:34.025 }' 00:28:34.025 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.025 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.025 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:34.025 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.025 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:34.284 06:44:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:34.542 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:34.542 "name": "BaseBdev2", 00:28:34.542 "aliases": [ 00:28:34.542 "8e3be8bd-dc84-4080-a2cd-c5e1fc385281" 00:28:34.542 ], 00:28:34.542 "product_name": "Malloc disk", 00:28:34.542 "block_size": 4096, 00:28:34.542 "num_blocks": 8192, 00:28:34.542 "uuid": "8e3be8bd-dc84-4080-a2cd-c5e1fc385281", 00:28:34.542 "assigned_rate_limits": { 00:28:34.542 "rw_ios_per_sec": 0, 00:28:34.542 "rw_mbytes_per_sec": 0, 00:28:34.542 "r_mbytes_per_sec": 0, 00:28:34.542 "w_mbytes_per_sec": 0 00:28:34.542 }, 00:28:34.542 "claimed": true, 00:28:34.542 "claim_type": "exclusive_write", 00:28:34.542 "zoned": false, 00:28:34.542 "supported_io_types": { 00:28:34.542 "read": true, 00:28:34.542 "write": true, 00:28:34.542 "unmap": true, 00:28:34.542 "flush": true, 00:28:34.542 "reset": true, 00:28:34.542 "nvme_admin": false, 00:28:34.542 "nvme_io": false, 00:28:34.542 "nvme_io_md": false, 00:28:34.542 "write_zeroes": true, 00:28:34.542 "zcopy": true, 00:28:34.542 "get_zone_info": false, 00:28:34.542 "zone_management": false, 00:28:34.542 "zone_append": false, 00:28:34.542 "compare": false, 00:28:34.542 "compare_and_write": false, 00:28:34.542 "abort": true, 00:28:34.542 "seek_hole": false, 00:28:34.543 "seek_data": false, 00:28:34.543 "copy": true, 00:28:34.543 "nvme_iov_md": false 00:28:34.543 }, 00:28:34.543 "memory_domains": [ 00:28:34.543 { 00:28:34.543 "dma_device_id": "system", 00:28:34.543 "dma_device_type": 1 00:28:34.543 }, 00:28:34.543 { 00:28:34.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.543 "dma_device_type": 2 00:28:34.543 } 00:28:34.543 ], 00:28:34.543 "driver_specific": {} 00:28:34.543 }' 00:28:34.543 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.543 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.543 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:34.543 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.801 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:35.060 [2024-07-25 06:44:48.573129] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@275 -- # local expected_state 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.060 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:35.318 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.318 "name": "Existed_Raid", 00:28:35.318 "uuid": "a4d2bf17-197f-4015-9fca-b9e4a40de985", 00:28:35.318 "strip_size_kb": 0, 00:28:35.318 "state": "online", 00:28:35.318 "raid_level": "raid1", 00:28:35.318 "superblock": true, 00:28:35.318 "num_base_bdevs": 2, 00:28:35.318 "num_base_bdevs_discovered": 1, 00:28:35.318 "num_base_bdevs_operational": 1, 00:28:35.318 "base_bdevs_list": [ 00:28:35.318 { 00:28:35.318 "name": null, 00:28:35.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:35.318 "is_configured": false, 00:28:35.319 "data_offset": 256, 00:28:35.319 "data_size": 7936 00:28:35.319 }, 00:28:35.319 { 00:28:35.319 "name": "BaseBdev2", 00:28:35.319 "uuid": "8e3be8bd-dc84-4080-a2cd-c5e1fc385281", 00:28:35.319 "is_configured": true, 00:28:35.319 "data_offset": 256, 00:28:35.319 "data_size": 7936 00:28:35.319 } 00:28:35.319 ] 00:28:35.319 }' 00:28:35.319 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.319 06:44:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:35.885 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:35.885 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:35.885 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.885 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:36.144 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:36.144 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:36.144 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:36.404 [2024-07-25 06:44:49.789396] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:36.404 [2024-07-25 06:44:49.789473] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:36.404 [2024-07-25 06:44:49.799714] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:36.404 [2024-07-25 06:44:49.799744] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:36.404 [2024-07-25 06:44:49.799755] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x147c120 name Existed_Raid, state offline 00:28:36.404 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:36.404 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:36.404 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.404 06:44:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1259559 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 1259559 ']' 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1259559 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1259559 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1259559' 00:28:36.663 killing process with pid 1259559 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1259559 00:28:36.663 [2024-07-25 06:44:50.105372] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:36.663 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@974 -- # wait 1259559 00:28:36.663 [2024-07-25 06:44:50.106231] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:36.923 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:28:36.923 00:28:36.923 real 0m9.798s 00:28:36.923 user 0m17.370s 00:28:36.923 sys 0m1.897s 00:28:36.923 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:36.923 06:44:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:36.923 ************************************ 00:28:36.923 END TEST raid_state_function_test_sb_4k 00:28:36.923 ************************************ 00:28:36.923 06:44:50 bdev_raid -- bdev/bdev_raid.sh@979 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:28:36.923 06:44:50 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:28:36.923 06:44:50 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:36.923 06:44:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:36.923 ************************************ 00:28:36.923 START TEST raid_superblock_test_4k 00:28:36.923 ************************************ 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@414 -- # local strip_size 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@427 -- # raid_pid=1261461 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@428 -- # waitforlisten 1261461 /var/tmp/spdk-raid.sock 
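[Editorial sketch] Between test cases the harness kills the previous bdev_svc by PID before starting the next one; a condensed, simplified sketch of the killprocess checks visible in the trace (the real helper in autotest_common.sh does more, e.g. escalating when the process runs under sudo):

    killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                      # PID must still be alive
        local name
        name=$(ps --no-headers -o comm= "$pid")
        [[ $name != sudo ]] || return 1                 # never kill a bare sudo wrapper
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" || true
    }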
00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # '[' -z 1261461 ']' 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:36.923 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:36.923 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:36.923 [2024-07-25 06:44:50.405003] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:28:36.923 [2024-07-25 06:44:50.405048] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1261461 ] 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.0 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.1 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.2 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.3 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.4 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.5 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.6 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:01.7 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.0 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.1 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.2 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.3 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.4 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.5 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.6 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3d:02.7 cannot be used 
00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.0 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.1 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.2 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.3 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.4 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.5 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.6 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:01.7 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.0 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.1 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.2 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.3 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.4 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.5 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.6 cannot be used 00:28:36.923 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:36.923 EAL: Requested device 0000:3f:02.7 cannot be used 00:28:37.183 [2024-07-25 06:44:50.529304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:37.183 [2024-07-25 06:44:50.573490] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:37.183 [2024-07-25 06:44:50.634177] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:37.183 [2024-07-25 06:44:50.634215] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@864 -- # return 0 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 
-- # base_bdevs_malloc+=($bdev_malloc) 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:37.183 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:28:37.442 malloc1 00:28:37.442 06:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:37.701 [2024-07-25 06:44:51.128840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:37.701 [2024-07-25 06:44:51.128884] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:37.701 [2024-07-25 06:44:51.128904] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x991d70 00:28:37.701 [2024-07-25 06:44:51.128915] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:37.701 [2024-07-25 06:44:51.130464] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:37.701 [2024-07-25 06:44:51.130491] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:37.701 pt1 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:37.701 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:28:37.961 malloc2 00:28:37.962 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:38.220 [2024-07-25 06:44:51.578280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:38.220 [2024-07-25 06:44:51.578319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:38.220 [2024-07-25 06:44:51.578335] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e0790 00:28:38.220 [2024-07-25 06:44:51.578347] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:38.220 [2024-07-25 06:44:51.579632] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:38.220 [2024-07-25 06:44:51.579658] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:38.220 pt2 00:28:38.220 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:28:38.220 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:38.220 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:38.479 [2024-07-25 06:44:51.790850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:38.479 [2024-07-25 06:44:51.791931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:38.479 [2024-07-25 06:44:51.792065] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x9831c0 00:28:38.479 [2024-07-25 06:44:51.792078] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:38.479 [2024-07-25 06:44:51.792488] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7de880 00:28:38.479 [2024-07-25 06:44:51.792631] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9831c0 00:28:38.479 [2024-07-25 06:44:51.792641] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9831c0 00:28:38.479 [2024-07-25 06:44:51.792730] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.479 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.480 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.480 06:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.739 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.739 "name": "raid_bdev1", 00:28:38.739 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:38.739 "strip_size_kb": 0, 00:28:38.739 "state": "online", 00:28:38.739 "raid_level": "raid1", 00:28:38.739 "superblock": true, 00:28:38.739 "num_base_bdevs": 2, 00:28:38.739 "num_base_bdevs_discovered": 2, 00:28:38.739 "num_base_bdevs_operational": 2, 00:28:38.739 "base_bdevs_list": [ 00:28:38.739 { 00:28:38.739 "name": "pt1", 00:28:38.739 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:28:38.739 "is_configured": true, 00:28:38.739 "data_offset": 256, 00:28:38.739 "data_size": 7936 00:28:38.739 }, 00:28:38.739 { 00:28:38.739 "name": "pt2", 00:28:38.739 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:38.739 "is_configured": true, 00:28:38.739 "data_offset": 256, 00:28:38.739 "data_size": 7936 00:28:38.739 } 00:28:38.739 ] 00:28:38.739 }' 00:28:38.739 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.739 06:44:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:39.306 [2024-07-25 06:44:52.821784] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:39.306 "name": "raid_bdev1", 00:28:39.306 "aliases": [ 00:28:39.306 "d18186aa-48dd-4681-8109-238aa419b81c" 00:28:39.306 ], 00:28:39.306 "product_name": "Raid Volume", 00:28:39.306 "block_size": 4096, 00:28:39.306 "num_blocks": 7936, 00:28:39.306 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:39.306 "assigned_rate_limits": { 00:28:39.306 "rw_ios_per_sec": 0, 00:28:39.306 "rw_mbytes_per_sec": 0, 00:28:39.306 "r_mbytes_per_sec": 0, 00:28:39.306 "w_mbytes_per_sec": 0 00:28:39.306 }, 00:28:39.306 "claimed": false, 00:28:39.306 "zoned": false, 00:28:39.306 "supported_io_types": { 00:28:39.306 "read": true, 00:28:39.306 "write": true, 00:28:39.306 "unmap": false, 00:28:39.306 "flush": false, 00:28:39.306 "reset": true, 00:28:39.306 "nvme_admin": false, 00:28:39.306 "nvme_io": false, 00:28:39.306 "nvme_io_md": false, 00:28:39.306 "write_zeroes": true, 00:28:39.306 "zcopy": false, 00:28:39.306 "get_zone_info": false, 00:28:39.306 "zone_management": false, 00:28:39.306 "zone_append": false, 00:28:39.306 "compare": false, 00:28:39.306 "compare_and_write": false, 00:28:39.306 "abort": false, 00:28:39.306 "seek_hole": false, 00:28:39.306 "seek_data": false, 00:28:39.306 "copy": false, 00:28:39.306 "nvme_iov_md": false 00:28:39.306 }, 00:28:39.306 "memory_domains": [ 00:28:39.306 { 00:28:39.306 "dma_device_id": "system", 00:28:39.306 "dma_device_type": 1 00:28:39.306 }, 00:28:39.306 { 00:28:39.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.306 "dma_device_type": 2 00:28:39.306 }, 00:28:39.306 { 00:28:39.306 "dma_device_id": "system", 00:28:39.306 "dma_device_type": 1 00:28:39.306 }, 00:28:39.306 { 00:28:39.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.306 "dma_device_type": 2 00:28:39.306 } 00:28:39.306 ], 
00:28:39.306 "driver_specific": { 00:28:39.306 "raid": { 00:28:39.306 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:39.306 "strip_size_kb": 0, 00:28:39.306 "state": "online", 00:28:39.306 "raid_level": "raid1", 00:28:39.306 "superblock": true, 00:28:39.306 "num_base_bdevs": 2, 00:28:39.306 "num_base_bdevs_discovered": 2, 00:28:39.306 "num_base_bdevs_operational": 2, 00:28:39.306 "base_bdevs_list": [ 00:28:39.306 { 00:28:39.306 "name": "pt1", 00:28:39.306 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.306 "is_configured": true, 00:28:39.306 "data_offset": 256, 00:28:39.306 "data_size": 7936 00:28:39.306 }, 00:28:39.306 { 00:28:39.306 "name": "pt2", 00:28:39.306 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:39.306 "is_configured": true, 00:28:39.306 "data_offset": 256, 00:28:39.306 "data_size": 7936 00:28:39.306 } 00:28:39.306 ] 00:28:39.306 } 00:28:39.306 } 00:28:39.306 }' 00:28:39.306 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:39.565 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:39.565 pt2' 00:28:39.565 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:39.565 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:39.565 06:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:39.824 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:39.824 "name": "pt1", 00:28:39.824 "aliases": [ 00:28:39.824 "00000000-0000-0000-0000-000000000001" 00:28:39.824 ], 00:28:39.824 "product_name": "passthru", 00:28:39.824 "block_size": 4096, 00:28:39.824 "num_blocks": 8192, 00:28:39.824 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.824 "assigned_rate_limits": { 00:28:39.824 "rw_ios_per_sec": 0, 00:28:39.824 "rw_mbytes_per_sec": 0, 00:28:39.824 "r_mbytes_per_sec": 0, 00:28:39.824 "w_mbytes_per_sec": 0 00:28:39.824 }, 00:28:39.824 "claimed": true, 00:28:39.824 "claim_type": "exclusive_write", 00:28:39.824 "zoned": false, 00:28:39.824 "supported_io_types": { 00:28:39.824 "read": true, 00:28:39.824 "write": true, 00:28:39.824 "unmap": true, 00:28:39.824 "flush": true, 00:28:39.824 "reset": true, 00:28:39.824 "nvme_admin": false, 00:28:39.824 "nvme_io": false, 00:28:39.824 "nvme_io_md": false, 00:28:39.824 "write_zeroes": true, 00:28:39.824 "zcopy": true, 00:28:39.824 "get_zone_info": false, 00:28:39.824 "zone_management": false, 00:28:39.824 "zone_append": false, 00:28:39.824 "compare": false, 00:28:39.824 "compare_and_write": false, 00:28:39.824 "abort": true, 00:28:39.824 "seek_hole": false, 00:28:39.824 "seek_data": false, 00:28:39.824 "copy": true, 00:28:39.824 "nvme_iov_md": false 00:28:39.824 }, 00:28:39.824 "memory_domains": [ 00:28:39.824 { 00:28:39.824 "dma_device_id": "system", 00:28:39.824 "dma_device_type": 1 00:28:39.824 }, 00:28:39.824 { 00:28:39.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.824 "dma_device_type": 2 00:28:39.824 } 00:28:39.824 ], 00:28:39.824 "driver_specific": { 00:28:39.824 "passthru": { 00:28:39.824 "name": "pt1", 00:28:39.824 "base_bdev_name": "malloc1" 00:28:39.824 } 00:28:39.824 } 00:28:39.824 }' 00:28:39.824 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:28:39.824 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.824 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:39.824 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.824 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.825 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:39.825 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:39.825 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:40.082 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:40.340 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:40.340 "name": "pt2", 00:28:40.340 "aliases": [ 00:28:40.340 "00000000-0000-0000-0000-000000000002" 00:28:40.340 ], 00:28:40.340 "product_name": "passthru", 00:28:40.340 "block_size": 4096, 00:28:40.340 "num_blocks": 8192, 00:28:40.340 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.340 "assigned_rate_limits": { 00:28:40.340 "rw_ios_per_sec": 0, 00:28:40.340 "rw_mbytes_per_sec": 0, 00:28:40.340 "r_mbytes_per_sec": 0, 00:28:40.340 "w_mbytes_per_sec": 0 00:28:40.340 }, 00:28:40.340 "claimed": true, 00:28:40.340 "claim_type": "exclusive_write", 00:28:40.340 "zoned": false, 00:28:40.340 "supported_io_types": { 00:28:40.340 "read": true, 00:28:40.340 "write": true, 00:28:40.340 "unmap": true, 00:28:40.340 "flush": true, 00:28:40.340 "reset": true, 00:28:40.340 "nvme_admin": false, 00:28:40.340 "nvme_io": false, 00:28:40.340 "nvme_io_md": false, 00:28:40.340 "write_zeroes": true, 00:28:40.340 "zcopy": true, 00:28:40.340 "get_zone_info": false, 00:28:40.340 "zone_management": false, 00:28:40.340 "zone_append": false, 00:28:40.340 "compare": false, 00:28:40.340 "compare_and_write": false, 00:28:40.340 "abort": true, 00:28:40.340 "seek_hole": false, 00:28:40.340 "seek_data": false, 00:28:40.340 "copy": true, 00:28:40.340 "nvme_iov_md": false 00:28:40.340 }, 00:28:40.340 "memory_domains": [ 00:28:40.340 { 00:28:40.340 "dma_device_id": "system", 00:28:40.340 "dma_device_type": 1 00:28:40.340 }, 00:28:40.340 { 00:28:40.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:40.340 "dma_device_type": 2 00:28:40.340 } 00:28:40.340 ], 00:28:40.340 "driver_specific": { 00:28:40.340 "passthru": { 00:28:40.340 "name": "pt2", 00:28:40.340 "base_bdev_name": "malloc2" 00:28:40.340 } 00:28:40.340 } 00:28:40.340 }' 00:28:40.340 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:40.340 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # 
jq .block_size 00:28:40.340 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:40.340 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:40.340 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:40.598 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:40.598 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.598 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.598 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:40.598 06:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.598 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.598 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:40.598 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:40.598 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:28:40.858 [2024-07-25 06:44:54.277612] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:40.858 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=d18186aa-48dd-4681-8109-238aa419b81c 00:28:40.858 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' -z d18186aa-48dd-4681-8109-238aa419b81c ']' 00:28:40.858 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:41.116 [2024-07-25 06:44:54.501962] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:41.116 [2024-07-25 06:44:54.501981] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:41.116 [2024-07-25 06:44:54.502033] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:41.116 [2024-07-25 06:44:54.502081] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:41.116 [2024-07-25 06:44:54.502092] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9831c0 name raid_bdev1, state offline 00:28:41.117 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.117 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:28:41.375 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:28:41.375 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:28:41.375 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:28:41.375 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:41.635 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 
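Teardown runs in reverse order of setup: the raid bdev is deleted first, then each passthru bdev, while the malloc bdevs (and the raid superblock written to them) stay behind. A hedged sketch of the same RPC sequence, using the socket and commands visible in the trace:

# Sketch only; commands match the rpc.py calls in the trace above.
sock=/var/tmp/spdk-raid.sock
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
"$rpc" -s "$sock" bdev_raid_delete raid_bdev1      # raid goes online -> offline and is freed
"$rpc" -s "$sock" bdev_passthru_delete pt1
"$rpc" -s "$sock" bdev_passthru_delete pt2
"$rpc" -s "$sock" bdev_get_bdevs | jq -r '[.[] | select(.product_name == "passthru")] | any'   # expect: false

Because the superblocks survive on malloc1 and malloc2, the bdev_raid_create call issued directly on those malloc bdevs in the next step is expected to fail with -17 ("File exists"), as the error response below shows.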
00:28:41.635 06:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:41.635 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:41.635 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # local es=0 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:41.894 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:42.153 [2024-07-25 06:44:55.628956] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:42.153 [2024-07-25 06:44:55.630191] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:42.153 [2024-07-25 06:44:55.630240] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:42.153 [2024-07-25 06:44:55.630277] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:42.153 [2024-07-25 06:44:55.630294] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:42.153 [2024-07-25 06:44:55.630302] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9845a0 name raid_bdev1, state configuring 00:28:42.153 
request: 00:28:42.153 { 00:28:42.153 "name": "raid_bdev1", 00:28:42.153 "raid_level": "raid1", 00:28:42.153 "base_bdevs": [ 00:28:42.153 "malloc1", 00:28:42.153 "malloc2" 00:28:42.153 ], 00:28:42.153 "superblock": false, 00:28:42.153 "method": "bdev_raid_create", 00:28:42.153 "req_id": 1 00:28:42.153 } 00:28:42.153 Got JSON-RPC error response 00:28:42.153 response: 00:28:42.153 { 00:28:42.153 "code": -17, 00:28:42.153 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:42.153 } 00:28:42.153 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # es=1 00:28:42.153 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:28:42.153 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:28:42.153 06:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:28:42.153 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.153 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:28:42.412 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:28:42.412 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:28:42.412 06:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:42.672 [2024-07-25 06:44:56.078086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:42.672 [2024-07-25 06:44:56.078126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.672 [2024-07-25 06:44:56.078149] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x982f60 00:28:42.672 [2024-07-25 06:44:56.078161] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.672 [2024-07-25 06:44:56.079586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.672 [2024-07-25 06:44:56.079612] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:42.672 [2024-07-25 06:44:56.079671] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:42.672 [2024-07-25 06:44:56.079694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:42.672 pt1 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.672 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:42.932 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:42.932 "name": "raid_bdev1", 00:28:42.932 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:42.932 "strip_size_kb": 0, 00:28:42.932 "state": "configuring", 00:28:42.932 "raid_level": "raid1", 00:28:42.932 "superblock": true, 00:28:42.932 "num_base_bdevs": 2, 00:28:42.933 "num_base_bdevs_discovered": 1, 00:28:42.933 "num_base_bdevs_operational": 2, 00:28:42.933 "base_bdevs_list": [ 00:28:42.933 { 00:28:42.933 "name": "pt1", 00:28:42.933 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:42.933 "is_configured": true, 00:28:42.933 "data_offset": 256, 00:28:42.933 "data_size": 7936 00:28:42.933 }, 00:28:42.933 { 00:28:42.933 "name": null, 00:28:42.933 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:42.933 "is_configured": false, 00:28:42.933 "data_offset": 256, 00:28:42.933 "data_size": 7936 00:28:42.933 } 00:28:42.933 ] 00:28:42.933 }' 00:28:42.933 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:42.933 06:44:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:43.573 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:28:43.573 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:28:43.573 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:28:43.573 06:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:43.573 [2024-07-25 06:44:57.040633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:43.573 [2024-07-25 06:44:57.040687] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:43.573 [2024-07-25 06:44:57.040706] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9849d0 00:28:43.573 [2024-07-25 06:44:57.040718] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:43.573 [2024-07-25 06:44:57.041031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:43.573 [2024-07-25 06:44:57.041047] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:43.573 [2024-07-25 06:44:57.041103] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:43.573 [2024-07-25 06:44:57.041121] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:43.573 [2024-07-25 06:44:57.041221] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x987700 00:28:43.573 [2024-07-25 06:44:57.041231] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:43.573 
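Around this point the examine path finds the raid superblock on the freshly recreated pt2, claims it, and finishes assembling raid_bdev1 without any explicit bdev_raid_create: the raid bdev moves from "configuring" (one of two base bdevs discovered) to "online" on its own. A compact, hedged sketch of that re-assembly check, with the socket and UUIDs taken from the trace:

# Sketch only: recreate the passthru bdevs and watch the raid state change.
sock=/var/tmp/spdk-raid.sock
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
"$rpc" -s "$sock" bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # configuring
"$rpc" -s "$sock" bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # online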
[2024-07-25 06:44:57.041380] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9910a0 00:28:43.573 [2024-07-25 06:44:57.041498] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x987700 00:28:43.573 [2024-07-25 06:44:57.041508] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x987700 00:28:43.573 [2024-07-25 06:44:57.041597] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:43.573 pt2 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.573 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.574 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.574 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.833 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.833 "name": "raid_bdev1", 00:28:43.833 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:43.833 "strip_size_kb": 0, 00:28:43.833 "state": "online", 00:28:43.833 "raid_level": "raid1", 00:28:43.833 "superblock": true, 00:28:43.833 "num_base_bdevs": 2, 00:28:43.833 "num_base_bdevs_discovered": 2, 00:28:43.833 "num_base_bdevs_operational": 2, 00:28:43.833 "base_bdevs_list": [ 00:28:43.833 { 00:28:43.833 "name": "pt1", 00:28:43.833 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:43.833 "is_configured": true, 00:28:43.833 "data_offset": 256, 00:28:43.833 "data_size": 7936 00:28:43.833 }, 00:28:43.833 { 00:28:43.833 "name": "pt2", 00:28:43.833 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.833 "is_configured": true, 00:28:43.833 "data_offset": 256, 00:28:43.833 "data_size": 7936 00:28:43.833 } 00:28:43.833 ] 00:28:43.833 }' 00:28:43.833 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.833 06:44:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # 
local raid_bdev_name=raid_bdev1 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:44.402 06:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:44.661 [2024-07-25 06:44:58.047514] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:44.661 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:44.661 "name": "raid_bdev1", 00:28:44.661 "aliases": [ 00:28:44.661 "d18186aa-48dd-4681-8109-238aa419b81c" 00:28:44.661 ], 00:28:44.661 "product_name": "Raid Volume", 00:28:44.661 "block_size": 4096, 00:28:44.661 "num_blocks": 7936, 00:28:44.661 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:44.661 "assigned_rate_limits": { 00:28:44.661 "rw_ios_per_sec": 0, 00:28:44.661 "rw_mbytes_per_sec": 0, 00:28:44.661 "r_mbytes_per_sec": 0, 00:28:44.661 "w_mbytes_per_sec": 0 00:28:44.661 }, 00:28:44.661 "claimed": false, 00:28:44.661 "zoned": false, 00:28:44.661 "supported_io_types": { 00:28:44.661 "read": true, 00:28:44.661 "write": true, 00:28:44.661 "unmap": false, 00:28:44.661 "flush": false, 00:28:44.661 "reset": true, 00:28:44.661 "nvme_admin": false, 00:28:44.661 "nvme_io": false, 00:28:44.661 "nvme_io_md": false, 00:28:44.661 "write_zeroes": true, 00:28:44.661 "zcopy": false, 00:28:44.661 "get_zone_info": false, 00:28:44.661 "zone_management": false, 00:28:44.661 "zone_append": false, 00:28:44.661 "compare": false, 00:28:44.661 "compare_and_write": false, 00:28:44.661 "abort": false, 00:28:44.661 "seek_hole": false, 00:28:44.661 "seek_data": false, 00:28:44.661 "copy": false, 00:28:44.661 "nvme_iov_md": false 00:28:44.661 }, 00:28:44.661 "memory_domains": [ 00:28:44.661 { 00:28:44.661 "dma_device_id": "system", 00:28:44.661 "dma_device_type": 1 00:28:44.661 }, 00:28:44.661 { 00:28:44.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.661 "dma_device_type": 2 00:28:44.661 }, 00:28:44.661 { 00:28:44.661 "dma_device_id": "system", 00:28:44.661 "dma_device_type": 1 00:28:44.661 }, 00:28:44.661 { 00:28:44.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.661 "dma_device_type": 2 00:28:44.661 } 00:28:44.661 ], 00:28:44.661 "driver_specific": { 00:28:44.661 "raid": { 00:28:44.661 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:44.661 "strip_size_kb": 0, 00:28:44.661 "state": "online", 00:28:44.661 "raid_level": "raid1", 00:28:44.661 "superblock": true, 00:28:44.661 "num_base_bdevs": 2, 00:28:44.661 "num_base_bdevs_discovered": 2, 00:28:44.661 "num_base_bdevs_operational": 2, 00:28:44.661 "base_bdevs_list": [ 00:28:44.661 { 00:28:44.661 "name": "pt1", 00:28:44.661 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:44.661 "is_configured": true, 00:28:44.661 "data_offset": 256, 00:28:44.661 "data_size": 7936 00:28:44.661 }, 00:28:44.661 { 00:28:44.661 "name": "pt2", 00:28:44.661 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:44.661 "is_configured": true, 00:28:44.661 "data_offset": 256, 00:28:44.661 
"data_size": 7936 00:28:44.661 } 00:28:44.661 ] 00:28:44.661 } 00:28:44.661 } 00:28:44.661 }' 00:28:44.661 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:44.661 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:44.661 pt2' 00:28:44.661 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:44.661 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:44.661 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:44.920 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:44.920 "name": "pt1", 00:28:44.920 "aliases": [ 00:28:44.920 "00000000-0000-0000-0000-000000000001" 00:28:44.920 ], 00:28:44.920 "product_name": "passthru", 00:28:44.920 "block_size": 4096, 00:28:44.920 "num_blocks": 8192, 00:28:44.920 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:44.920 "assigned_rate_limits": { 00:28:44.921 "rw_ios_per_sec": 0, 00:28:44.921 "rw_mbytes_per_sec": 0, 00:28:44.921 "r_mbytes_per_sec": 0, 00:28:44.921 "w_mbytes_per_sec": 0 00:28:44.921 }, 00:28:44.921 "claimed": true, 00:28:44.921 "claim_type": "exclusive_write", 00:28:44.921 "zoned": false, 00:28:44.921 "supported_io_types": { 00:28:44.921 "read": true, 00:28:44.921 "write": true, 00:28:44.921 "unmap": true, 00:28:44.921 "flush": true, 00:28:44.921 "reset": true, 00:28:44.921 "nvme_admin": false, 00:28:44.921 "nvme_io": false, 00:28:44.921 "nvme_io_md": false, 00:28:44.921 "write_zeroes": true, 00:28:44.921 "zcopy": true, 00:28:44.921 "get_zone_info": false, 00:28:44.921 "zone_management": false, 00:28:44.921 "zone_append": false, 00:28:44.921 "compare": false, 00:28:44.921 "compare_and_write": false, 00:28:44.921 "abort": true, 00:28:44.921 "seek_hole": false, 00:28:44.921 "seek_data": false, 00:28:44.921 "copy": true, 00:28:44.921 "nvme_iov_md": false 00:28:44.921 }, 00:28:44.921 "memory_domains": [ 00:28:44.921 { 00:28:44.921 "dma_device_id": "system", 00:28:44.921 "dma_device_type": 1 00:28:44.921 }, 00:28:44.921 { 00:28:44.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.921 "dma_device_type": 2 00:28:44.921 } 00:28:44.921 ], 00:28:44.921 "driver_specific": { 00:28:44.921 "passthru": { 00:28:44.921 "name": "pt1", 00:28:44.921 "base_bdev_name": "malloc1" 00:28:44.921 } 00:28:44.921 } 00:28:44.921 }' 00:28:44.921 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:44.921 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:44.921 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:44.921 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.921 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null 
]] 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:45.180 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:45.439 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:45.439 "name": "pt2", 00:28:45.439 "aliases": [ 00:28:45.439 "00000000-0000-0000-0000-000000000002" 00:28:45.439 ], 00:28:45.439 "product_name": "passthru", 00:28:45.439 "block_size": 4096, 00:28:45.439 "num_blocks": 8192, 00:28:45.439 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:45.439 "assigned_rate_limits": { 00:28:45.439 "rw_ios_per_sec": 0, 00:28:45.439 "rw_mbytes_per_sec": 0, 00:28:45.439 "r_mbytes_per_sec": 0, 00:28:45.439 "w_mbytes_per_sec": 0 00:28:45.439 }, 00:28:45.439 "claimed": true, 00:28:45.439 "claim_type": "exclusive_write", 00:28:45.439 "zoned": false, 00:28:45.439 "supported_io_types": { 00:28:45.439 "read": true, 00:28:45.439 "write": true, 00:28:45.439 "unmap": true, 00:28:45.439 "flush": true, 00:28:45.439 "reset": true, 00:28:45.439 "nvme_admin": false, 00:28:45.439 "nvme_io": false, 00:28:45.439 "nvme_io_md": false, 00:28:45.439 "write_zeroes": true, 00:28:45.439 "zcopy": true, 00:28:45.439 "get_zone_info": false, 00:28:45.439 "zone_management": false, 00:28:45.439 "zone_append": false, 00:28:45.439 "compare": false, 00:28:45.439 "compare_and_write": false, 00:28:45.439 "abort": true, 00:28:45.439 "seek_hole": false, 00:28:45.439 "seek_data": false, 00:28:45.439 "copy": true, 00:28:45.439 "nvme_iov_md": false 00:28:45.439 }, 00:28:45.439 "memory_domains": [ 00:28:45.439 { 00:28:45.439 "dma_device_id": "system", 00:28:45.439 "dma_device_type": 1 00:28:45.439 }, 00:28:45.439 { 00:28:45.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:45.439 "dma_device_type": 2 00:28:45.439 } 00:28:45.439 ], 00:28:45.439 "driver_specific": { 00:28:45.439 "passthru": { 00:28:45.439 "name": "pt2", 00:28:45.439 "base_bdev_name": "malloc2" 00:28:45.439 } 00:28:45.439 } 00:28:45.439 }' 00:28:45.439 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:45.439 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:45.439 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:45.439 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:45.439 06:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:45.698 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:28:45.957 [2024-07-25 06:44:59.411097] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:45.957 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # '[' d18186aa-48dd-4681-8109-238aa419b81c '!=' d18186aa-48dd-4681-8109-238aa419b81c ']' 00:28:45.957 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:28:45.957 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:45.957 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:45.957 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:46.216 [2024-07-25 06:44:59.643516] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.216 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.476 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:46.476 "name": "raid_bdev1", 00:28:46.476 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:46.476 "strip_size_kb": 0, 00:28:46.476 "state": "online", 00:28:46.476 "raid_level": "raid1", 00:28:46.476 "superblock": true, 00:28:46.476 "num_base_bdevs": 2, 00:28:46.476 "num_base_bdevs_discovered": 1, 00:28:46.476 "num_base_bdevs_operational": 1, 00:28:46.476 "base_bdevs_list": [ 00:28:46.476 { 00:28:46.476 "name": null, 00:28:46.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:46.476 "is_configured": false, 00:28:46.476 "data_offset": 
256, 00:28:46.476 "data_size": 7936 00:28:46.476 }, 00:28:46.476 { 00:28:46.476 "name": "pt2", 00:28:46.476 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:46.476 "is_configured": true, 00:28:46.476 "data_offset": 256, 00:28:46.476 "data_size": 7936 00:28:46.476 } 00:28:46.476 ] 00:28:46.476 }' 00:28:46.476 06:44:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:46.476 06:44:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:47.413 06:45:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:47.413 [2024-07-25 06:45:00.934885] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:47.413 [2024-07-25 06:45:00.934912] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:47.413 [2024-07-25 06:45:00.934971] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:47.413 [2024-07-25 06:45:00.935012] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:47.413 [2024-07-25 06:45:00.935023] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x987700 name raid_bdev1, state offline 00:28:47.413 06:45:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.413 06:45:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:28:47.672 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:28:47.672 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:28:47.672 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:28:47.672 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:28:47.672 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:47.931 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:28:47.931 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:28:47.931 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:28:47.931 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:28:47.931 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@534 -- # i=1 00:28:47.931 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:48.190 [2024-07-25 06:45:01.636697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:48.190 [2024-07-25 06:45:01.636742] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:48.190 [2024-07-25 06:45:01.636760] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e00e0 00:28:48.190 [2024-07-25 06:45:01.636773] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:48.190 [2024-07-25 
06:45:01.638262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:48.190 [2024-07-25 06:45:01.638290] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:48.190 [2024-07-25 06:45:01.638350] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:48.190 [2024-07-25 06:45:01.638374] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:48.190 [2024-07-25 06:45:01.638447] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x988250 00:28:48.190 [2024-07-25 06:45:01.638457] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:48.190 [2024-07-25 06:45:01.638614] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7def00 00:28:48.190 [2024-07-25 06:45:01.638721] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x988250 00:28:48.190 [2024-07-25 06:45:01.638731] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x988250 00:28:48.190 [2024-07-25 06:45:01.638818] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:48.190 pt2 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.190 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.449 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.450 "name": "raid_bdev1", 00:28:48.450 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:48.450 "strip_size_kb": 0, 00:28:48.450 "state": "online", 00:28:48.450 "raid_level": "raid1", 00:28:48.450 "superblock": true, 00:28:48.450 "num_base_bdevs": 2, 00:28:48.450 "num_base_bdevs_discovered": 1, 00:28:48.450 "num_base_bdevs_operational": 1, 00:28:48.450 "base_bdevs_list": [ 00:28:48.450 { 00:28:48.450 "name": null, 00:28:48.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.450 "is_configured": false, 00:28:48.450 "data_offset": 256, 00:28:48.450 "data_size": 7936 00:28:48.450 }, 00:28:48.450 { 00:28:48.450 "name": "pt2", 00:28:48.450 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:48.450 "is_configured": true, 
00:28:48.450 "data_offset": 256, 00:28:48.450 "data_size": 7936 00:28:48.450 } 00:28:48.450 ] 00:28:48.450 }' 00:28:48.450 06:45:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.450 06:45:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:49.018 06:45:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:49.586 [2024-07-25 06:45:02.940118] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:49.586 [2024-07-25 06:45:02.940149] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:49.586 [2024-07-25 06:45:02.940203] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:49.586 [2024-07-25 06:45:02.940245] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:49.586 [2024-07-25 06:45:02.940255] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x988250 name raid_bdev1, state offline 00:28:49.586 06:45:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.586 06:45:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:28:49.846 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:28:49.846 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:28:49.846 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:28:49.846 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:50.411 [2024-07-25 06:45:03.682027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:50.411 [2024-07-25 06:45:03.682073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:50.411 [2024-07-25 06:45:03.682090] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7d5ea0 00:28:50.411 [2024-07-25 06:45:03.682102] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:50.411 [2024-07-25 06:45:03.683579] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:50.411 [2024-07-25 06:45:03.683606] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:50.411 [2024-07-25 06:45:03.683663] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:50.411 [2024-07-25 06:45:03.683686] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:50.411 [2024-07-25 06:45:03.683774] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:50.411 [2024-07-25 06:45:03.683792] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:50.411 [2024-07-25 06:45:03.683804] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d6890 name raid_bdev1, state configuring 00:28:50.411 [2024-07-25 06:45:03.683825] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is 
claimed 00:28:50.411 [2024-07-25 06:45:03.683878] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x987d30 00:28:50.411 [2024-07-25 06:45:03.683887] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:50.411 [2024-07-25 06:45:03.684037] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7def00 00:28:50.411 [2024-07-25 06:45:03.684159] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x987d30 00:28:50.411 [2024-07-25 06:45:03.684169] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x987d30 00:28:50.411 [2024-07-25 06:45:03.684258] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:50.411 pt1 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:50.411 "name": "raid_bdev1", 00:28:50.411 "uuid": "d18186aa-48dd-4681-8109-238aa419b81c", 00:28:50.411 "strip_size_kb": 0, 00:28:50.411 "state": "online", 00:28:50.411 "raid_level": "raid1", 00:28:50.411 "superblock": true, 00:28:50.411 "num_base_bdevs": 2, 00:28:50.411 "num_base_bdevs_discovered": 1, 00:28:50.411 "num_base_bdevs_operational": 1, 00:28:50.411 "base_bdevs_list": [ 00:28:50.411 { 00:28:50.411 "name": null, 00:28:50.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.411 "is_configured": false, 00:28:50.411 "data_offset": 256, 00:28:50.411 "data_size": 7936 00:28:50.411 }, 00:28:50.411 { 00:28:50.411 "name": "pt2", 00:28:50.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:50.411 "is_configured": true, 00:28:50.411 "data_offset": 256, 00:28:50.411 "data_size": 7936 00:28:50.411 } 00:28:50.411 ] 00:28:50.411 }' 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:50.411 06:45:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:50.978 06:45:04 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:50.978 06:45:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:51.236 06:45:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:28:51.236 06:45:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:51.237 06:45:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:28:51.496 [2024-07-25 06:45:04.957598] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # '[' d18186aa-48dd-4681-8109-238aa419b81c '!=' d18186aa-48dd-4681-8109-238aa419b81c ']' 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@578 -- # killprocess 1261461 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # '[' -z 1261461 ']' 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # kill -0 1261461 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # uname 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:51.496 06:45:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1261461 00:28:51.496 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:51.496 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:51.496 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1261461' 00:28:51.496 killing process with pid 1261461 00:28:51.496 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@969 -- # kill 1261461 00:28:51.496 [2024-07-25 06:45:05.032745] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:51.496 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@974 -- # wait 1261461 00:28:51.496 [2024-07-25 06:45:05.032797] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:51.496 [2024-07-25 06:45:05.032839] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:51.496 [2024-07-25 06:45:05.032850] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x987d30 name raid_bdev1, state offline 00:28:51.496 [2024-07-25 06:45:05.048360] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:51.755 06:45:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@580 -- # return 0 00:28:51.755 00:28:51.755 real 0m14.855s 00:28:51.755 user 0m27.338s 00:28:51.755 sys 0m2.817s 00:28:51.755 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:51.755 06:45:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:51.755 ************************************ 00:28:51.755 END TEST raid_superblock_test_4k 00:28:51.755 ************************************ 00:28:51.755 06:45:05 bdev_raid -- 
bdev/bdev_raid.sh@980 -- # '[' true = true ']' 00:28:51.755 06:45:05 bdev_raid -- bdev/bdev_raid.sh@981 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:28:51.755 06:45:05 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:28:51.755 06:45:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:51.755 06:45:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:51.755 ************************************ 00:28:51.755 START TEST raid_rebuild_test_sb_4k 00:28:51.755 ************************************ 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # local verify=true 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # local strip_size 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # local create_arg 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@594 -- # local data_offset 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # raid_pid=1264700 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@613 -- # waitforlisten 1264700 /var/tmp/spdk-raid.sock 00:28:51.755 06:45:05 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1264700 ']' 00:28:51.755 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:51.756 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:51.756 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:51.756 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:51.756 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:51.756 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:51.756 06:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:52.015 [2024-07-25 06:45:05.363066] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:28:52.015 [2024-07-25 06:45:05.363121] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1264700 ] 00:28:52.015 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:52.015 Zero copy mechanism will not be used. 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.0 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.1 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.2 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.3 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.4 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.5 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.6 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:01.7 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.0 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.1 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.2 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.3 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.4 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:28:52.015 EAL: Requested device 0000:3d:02.5 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.6 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3d:02.7 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.0 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.1 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.2 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.3 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.4 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.5 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.6 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:01.7 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.0 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.1 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.2 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.3 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.4 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.5 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.6 cannot be used 00:28:52.015 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:28:52.015 EAL: Requested device 0000:3f:02.7 cannot be used 00:28:52.015 [2024-07-25 06:45:05.497716] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:52.015 [2024-07-25 06:45:05.541267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:52.274 [2024-07-25 06:45:05.604095] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:52.274 [2024-07-25 06:45:05.604130] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:52.842 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:52.842 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:28:52.842 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:28:52.842 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:28:53.101 BaseBdev1_malloc 00:28:53.101 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:53.358 [2024-07-25 06:45:06.695268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:53.358 [2024-07-25 06:45:06.695313] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:53.358 [2024-07-25 06:45:06.695335] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e537b0 00:28:53.358 [2024-07-25 06:45:06.695346] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:53.358 [2024-07-25 06:45:06.696834] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:53.358 [2024-07-25 06:45:06.696865] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:53.358 BaseBdev1 00:28:53.358 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:28:53.358 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:28:53.617 BaseBdev2_malloc 00:28:53.617 06:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:53.617 [2024-07-25 06:45:07.128660] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:53.617 [2024-07-25 06:45:07.128700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:53.617 [2024-07-25 06:45:07.128724] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ca18f0 00:28:53.617 [2024-07-25 06:45:07.128735] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:53.617 [2024-07-25 06:45:07.130041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:53.617 [2024-07-25 06:45:07.130066] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:53.617 BaseBdev2 00:28:53.617 06:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:28:53.875 spare_malloc 00:28:53.875 06:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:54.133 spare_delay 00:28:54.133 06:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:54.391 [2024-07-25 06:45:07.802698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:54.391 [2024-07-25 06:45:07.802741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:54.391 [2024-07-25 06:45:07.802759] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c98c10 00:28:54.391 
[2024-07-25 06:45:07.802770] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:54.391 [2024-07-25 06:45:07.804160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:54.391 [2024-07-25 06:45:07.804188] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:54.391 spare 00:28:54.391 06:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:54.650 [2024-07-25 06:45:08.019293] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:54.650 [2024-07-25 06:45:08.020384] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:54.650 [2024-07-25 06:45:08.020530] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c99d20 00:28:54.650 [2024-07-25 06:45:08.020542] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:54.650 [2024-07-25 06:45:08.020711] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e45e20 00:28:54.650 [2024-07-25 06:45:08.020834] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c99d20 00:28:54.650 [2024-07-25 06:45:08.020844] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c99d20 00:28:54.650 [2024-07-25 06:45:08.020927] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.650 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.651 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.909 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:54.910 "name": "raid_bdev1", 00:28:54.910 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:28:54.910 "strip_size_kb": 0, 00:28:54.910 "state": "online", 00:28:54.910 "raid_level": "raid1", 00:28:54.910 "superblock": true, 00:28:54.910 "num_base_bdevs": 2, 00:28:54.910 "num_base_bdevs_discovered": 2, 00:28:54.910 "num_base_bdevs_operational": 2, 00:28:54.910 
"base_bdevs_list": [ 00:28:54.910 { 00:28:54.910 "name": "BaseBdev1", 00:28:54.910 "uuid": "819f7664-4948-5398-813c-5ffd6a89122a", 00:28:54.910 "is_configured": true, 00:28:54.910 "data_offset": 256, 00:28:54.910 "data_size": 7936 00:28:54.910 }, 00:28:54.910 { 00:28:54.910 "name": "BaseBdev2", 00:28:54.910 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:28:54.910 "is_configured": true, 00:28:54.910 "data_offset": 256, 00:28:54.910 "data_size": 7936 00:28:54.910 } 00:28:54.910 ] 00:28:54.910 }' 00:28:54.910 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:54.910 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:55.497 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:55.497 06:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:28:55.756 [2024-07-25 06:45:09.070269] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:55.756 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:28:55.756 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.756 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:56.015 [2024-07-25 06:45:09.527274] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e45ab0 00:28:56.015 /dev/nbd0 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:56.015 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:56.015 1+0 records in 00:28:56.015 1+0 records out 00:28:56.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241114 s, 17.0 MB/s 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:28:56.330 06:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:56.900 7936+0 records in 00:28:56.900 7936+0 records out 00:28:56.900 32505856 bytes (33 MB, 31 MiB) copied, 0.681113 s, 47.7 MB/s 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:56.900 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:57.158 [2024-07-25 06:45:10.513809] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:57.158 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:57.416 [2024-07-25 06:45:10.738444] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.416 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.675 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.675 "name": "raid_bdev1", 00:28:57.675 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:28:57.675 "strip_size_kb": 0, 00:28:57.675 "state": "online", 00:28:57.675 "raid_level": "raid1", 00:28:57.675 "superblock": true, 00:28:57.675 "num_base_bdevs": 2, 00:28:57.675 "num_base_bdevs_discovered": 1, 00:28:57.675 "num_base_bdevs_operational": 1, 00:28:57.675 "base_bdevs_list": [ 00:28:57.675 { 00:28:57.675 "name": null, 00:28:57.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.675 "is_configured": false, 00:28:57.675 "data_offset": 256, 00:28:57.675 "data_size": 7936 00:28:57.675 }, 00:28:57.675 { 00:28:57.675 "name": "BaseBdev2", 00:28:57.675 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:28:57.675 "is_configured": true, 00:28:57.675 "data_offset": 256, 00:28:57.675 "data_size": 7936 00:28:57.675 } 00:28:57.675 
] 00:28:57.675 }' 00:28:57.675 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.675 06:45:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:58.240 06:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:58.240 [2024-07-25 06:45:11.785216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:58.240 [2024-07-25 06:45:11.789865] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e48bf0 00:28:58.240 [2024-07-25 06:45:11.791867] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:58.497 06:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.432 06:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:59.691 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:59.691 "name": "raid_bdev1", 00:28:59.691 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:28:59.691 "strip_size_kb": 0, 00:28:59.691 "state": "online", 00:28:59.691 "raid_level": "raid1", 00:28:59.691 "superblock": true, 00:28:59.691 "num_base_bdevs": 2, 00:28:59.691 "num_base_bdevs_discovered": 2, 00:28:59.691 "num_base_bdevs_operational": 2, 00:28:59.691 "process": { 00:28:59.691 "type": "rebuild", 00:28:59.691 "target": "spare", 00:28:59.691 "progress": { 00:28:59.691 "blocks": 3072, 00:28:59.691 "percent": 38 00:28:59.691 } 00:28:59.691 }, 00:28:59.691 "base_bdevs_list": [ 00:28:59.691 { 00:28:59.691 "name": "spare", 00:28:59.691 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:28:59.691 "is_configured": true, 00:28:59.691 "data_offset": 256, 00:28:59.691 "data_size": 7936 00:28:59.691 }, 00:28:59.691 { 00:28:59.691 "name": "BaseBdev2", 00:28:59.691 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:28:59.691 "is_configured": true, 00:28:59.691 "data_offset": 256, 00:28:59.691 "data_size": 7936 00:28:59.691 } 00:28:59.691 ] 00:28:59.691 }' 00:28:59.691 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:59.691 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:59.691 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:59.691 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:59.691 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@668 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:59.950 [2024-07-25 06:45:13.330113] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:59.950 [2024-07-25 06:45:13.403545] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:59.950 [2024-07-25 06:45:13.403586] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:59.950 [2024-07-25 06:45:13.403601] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:59.950 [2024-07-25 06:45:13.403608] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.950 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.209 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:00.209 "name": "raid_bdev1", 00:29:00.209 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:00.209 "strip_size_kb": 0, 00:29:00.209 "state": "online", 00:29:00.209 "raid_level": "raid1", 00:29:00.209 "superblock": true, 00:29:00.209 "num_base_bdevs": 2, 00:29:00.209 "num_base_bdevs_discovered": 1, 00:29:00.209 "num_base_bdevs_operational": 1, 00:29:00.209 "base_bdevs_list": [ 00:29:00.209 { 00:29:00.209 "name": null, 00:29:00.209 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:00.209 "is_configured": false, 00:29:00.209 "data_offset": 256, 00:29:00.209 "data_size": 7936 00:29:00.209 }, 00:29:00.209 { 00:29:00.209 "name": "BaseBdev2", 00:29:00.209 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:00.209 "is_configured": true, 00:29:00.209 "data_offset": 256, 00:29:00.209 "data_size": 7936 00:29:00.209 } 00:29:00.209 ] 00:29:00.209 }' 00:29:00.209 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:00.209 06:45:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:00.775 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:00.775 06:45:14 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.775 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:00.775 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:00.775 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.775 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.775 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.033 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:01.033 "name": "raid_bdev1", 00:29:01.033 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:01.033 "strip_size_kb": 0, 00:29:01.033 "state": "online", 00:29:01.033 "raid_level": "raid1", 00:29:01.033 "superblock": true, 00:29:01.033 "num_base_bdevs": 2, 00:29:01.033 "num_base_bdevs_discovered": 1, 00:29:01.033 "num_base_bdevs_operational": 1, 00:29:01.033 "base_bdevs_list": [ 00:29:01.033 { 00:29:01.033 "name": null, 00:29:01.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.033 "is_configured": false, 00:29:01.033 "data_offset": 256, 00:29:01.033 "data_size": 7936 00:29:01.033 }, 00:29:01.033 { 00:29:01.033 "name": "BaseBdev2", 00:29:01.033 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:01.033 "is_configured": true, 00:29:01.033 "data_offset": 256, 00:29:01.033 "data_size": 7936 00:29:01.033 } 00:29:01.033 ] 00:29:01.033 }' 00:29:01.033 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:01.033 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:01.033 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:01.033 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:01.033 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:01.292 [2024-07-25 06:45:14.770771] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:01.292 [2024-07-25 06:45:14.775421] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e48bf0 00:29:01.292 [2024-07-25 06:45:14.776751] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:01.292 06:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@678 -- # sleep 1 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name 
== "raid_bdev1")' 00:29:02.670 06:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:02.670 "name": "raid_bdev1", 00:29:02.670 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:02.670 "strip_size_kb": 0, 00:29:02.670 "state": "online", 00:29:02.670 "raid_level": "raid1", 00:29:02.670 "superblock": true, 00:29:02.670 "num_base_bdevs": 2, 00:29:02.670 "num_base_bdevs_discovered": 2, 00:29:02.670 "num_base_bdevs_operational": 2, 00:29:02.670 "process": { 00:29:02.670 "type": "rebuild", 00:29:02.670 "target": "spare", 00:29:02.670 "progress": { 00:29:02.670 "blocks": 3072, 00:29:02.670 "percent": 38 00:29:02.670 } 00:29:02.670 }, 00:29:02.670 "base_bdevs_list": [ 00:29:02.670 { 00:29:02.670 "name": "spare", 00:29:02.670 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:02.670 "is_configured": true, 00:29:02.670 "data_offset": 256, 00:29:02.670 "data_size": 7936 00:29:02.670 }, 00:29:02.670 { 00:29:02.670 "name": "BaseBdev2", 00:29:02.670 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:02.670 "is_configured": true, 00:29:02.670 "data_offset": 256, 00:29:02.670 "data_size": 7936 00:29:02.670 } 00:29:02.670 ] 00:29:02.670 }' 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:29:02.670 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # local timeout=965 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:02.670 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.670 06:45:16 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:02.929 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:02.929 "name": "raid_bdev1", 00:29:02.929 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:02.929 "strip_size_kb": 0, 00:29:02.929 "state": "online", 00:29:02.929 "raid_level": "raid1", 00:29:02.929 "superblock": true, 00:29:02.929 "num_base_bdevs": 2, 00:29:02.929 "num_base_bdevs_discovered": 2, 00:29:02.929 "num_base_bdevs_operational": 2, 00:29:02.929 "process": { 00:29:02.929 "type": "rebuild", 00:29:02.929 "target": "spare", 00:29:02.929 "progress": { 00:29:02.929 "blocks": 3840, 00:29:02.929 "percent": 48 00:29:02.929 } 00:29:02.929 }, 00:29:02.929 "base_bdevs_list": [ 00:29:02.929 { 00:29:02.929 "name": "spare", 00:29:02.929 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:02.929 "is_configured": true, 00:29:02.929 "data_offset": 256, 00:29:02.929 "data_size": 7936 00:29:02.929 }, 00:29:02.929 { 00:29:02.929 "name": "BaseBdev2", 00:29:02.929 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:02.929 "is_configured": true, 00:29:02.929 "data_offset": 256, 00:29:02.929 "data_size": 7936 00:29:02.929 } 00:29:02.929 ] 00:29:02.929 }' 00:29:02.929 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:02.929 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:02.929 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:02.929 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:02.929 06:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:04.308 "name": "raid_bdev1", 00:29:04.308 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:04.308 "strip_size_kb": 0, 00:29:04.308 "state": "online", 00:29:04.308 "raid_level": "raid1", 00:29:04.308 "superblock": true, 00:29:04.308 "num_base_bdevs": 2, 00:29:04.308 "num_base_bdevs_discovered": 2, 00:29:04.308 "num_base_bdevs_operational": 2, 00:29:04.308 "process": { 00:29:04.308 "type": "rebuild", 00:29:04.308 "target": "spare", 00:29:04.308 "progress": { 00:29:04.308 "blocks": 7168, 00:29:04.308 "percent": 90 00:29:04.308 } 00:29:04.308 }, 00:29:04.308 
"base_bdevs_list": [ 00:29:04.308 { 00:29:04.308 "name": "spare", 00:29:04.308 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:04.308 "is_configured": true, 00:29:04.308 "data_offset": 256, 00:29:04.308 "data_size": 7936 00:29:04.308 }, 00:29:04.308 { 00:29:04.308 "name": "BaseBdev2", 00:29:04.308 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:04.308 "is_configured": true, 00:29:04.308 "data_offset": 256, 00:29:04.308 "data_size": 7936 00:29:04.308 } 00:29:04.308 ] 00:29:04.308 }' 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:04.308 06:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:04.566 [2024-07-25 06:45:17.899168] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:04.566 [2024-07-25 06:45:17.899221] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:04.566 [2024-07-25 06:45:17.899297] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.502 06:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:05.502 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:05.502 "name": "raid_bdev1", 00:29:05.502 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:05.502 "strip_size_kb": 0, 00:29:05.502 "state": "online", 00:29:05.502 "raid_level": "raid1", 00:29:05.502 "superblock": true, 00:29:05.502 "num_base_bdevs": 2, 00:29:05.502 "num_base_bdevs_discovered": 2, 00:29:05.502 "num_base_bdevs_operational": 2, 00:29:05.502 "base_bdevs_list": [ 00:29:05.502 { 00:29:05.502 "name": "spare", 00:29:05.502 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:05.502 "is_configured": true, 00:29:05.502 "data_offset": 256, 00:29:05.502 "data_size": 7936 00:29:05.502 }, 00:29:05.502 { 00:29:05.502 "name": "BaseBdev2", 00:29:05.502 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:05.502 "is_configured": true, 00:29:05.502 "data_offset": 256, 00:29:05.502 "data_size": 7936 00:29:05.502 } 00:29:05.502 ] 00:29:05.502 }' 00:29:05.502 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@724 -- # break 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.761 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:06.019 "name": "raid_bdev1", 00:29:06.019 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:06.019 "strip_size_kb": 0, 00:29:06.019 "state": "online", 00:29:06.019 "raid_level": "raid1", 00:29:06.019 "superblock": true, 00:29:06.019 "num_base_bdevs": 2, 00:29:06.019 "num_base_bdevs_discovered": 2, 00:29:06.019 "num_base_bdevs_operational": 2, 00:29:06.019 "base_bdevs_list": [ 00:29:06.019 { 00:29:06.019 "name": "spare", 00:29:06.019 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:06.019 "is_configured": true, 00:29:06.019 "data_offset": 256, 00:29:06.019 "data_size": 7936 00:29:06.019 }, 00:29:06.019 { 00:29:06.019 "name": "BaseBdev2", 00:29:06.019 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:06.019 "is_configured": true, 00:29:06.019 "data_offset": 256, 00:29:06.019 "data_size": 7936 00:29:06.019 } 00:29:06.019 ] 00:29:06.019 }' 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.019 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.278 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.278 "name": "raid_bdev1", 00:29:06.278 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:06.278 "strip_size_kb": 0, 00:29:06.278 "state": "online", 00:29:06.278 "raid_level": "raid1", 00:29:06.278 "superblock": true, 00:29:06.278 "num_base_bdevs": 2, 00:29:06.278 "num_base_bdevs_discovered": 2, 00:29:06.278 "num_base_bdevs_operational": 2, 00:29:06.278 "base_bdevs_list": [ 00:29:06.278 { 00:29:06.278 "name": "spare", 00:29:06.278 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:06.278 "is_configured": true, 00:29:06.278 "data_offset": 256, 00:29:06.278 "data_size": 7936 00:29:06.278 }, 00:29:06.278 { 00:29:06.278 "name": "BaseBdev2", 00:29:06.278 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:06.278 "is_configured": true, 00:29:06.278 "data_offset": 256, 00:29:06.278 "data_size": 7936 00:29:06.278 } 00:29:06.278 ] 00:29:06.278 }' 00:29:06.278 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.278 06:45:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:06.844 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:07.103 [2024-07-25 06:45:20.433556] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:07.103 [2024-07-25 06:45:20.433581] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:07.103 [2024-07-25 06:45:20.433633] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:07.103 [2024-07-25 06:45:20.433684] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:07.103 [2024-07-25 06:45:20.433695] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c99d20 name raid_bdev1, state offline 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # jq length 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:07.103 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:07.362 /dev/nbd0 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:07.362 1+0 records in 00:29:07.362 1+0 records out 00:29:07.362 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221252 s, 18.5 MB/s 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:07.362 06:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:07.620 /dev/nbd1 00:29:07.620 06:45:21 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:07.620 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:07.620 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:29:07.620 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:29:07.620 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:29:07.620 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:29:07.621 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:29:07.621 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:29:07.621 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:29:07.621 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:29:07.621 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:07.880 1+0 records in 00:29:07.880 1+0 records out 00:29:07.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315519 s, 13.0 MB/s 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:07.880 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:08.140 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:08.399 06:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:08.658 [2024-07-25 06:45:22.144385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:08.658 [2024-07-25 06:45:22.144422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:08.658 [2024-07-25 06:45:22.144442] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c98130 00:29:08.658 [2024-07-25 06:45:22.144453] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:08.658 [2024-07-25 06:45:22.145938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:08.658 [2024-07-25 06:45:22.145965] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:08.658 [2024-07-25 06:45:22.146033] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:08.658 [2024-07-25 06:45:22.146057] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:08.658 [2024-07-25 06:45:22.146155] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:08.658 spare 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=raid_bdev1 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.658 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:08.917 [2024-07-25 06:45:22.246461] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e45df0 00:29:08.917 [2024-07-25 06:45:22.246477] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:08.917 [2024-07-25 06:45:22.246640] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ca25a0 00:29:08.917 [2024-07-25 06:45:22.246769] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e45df0 00:29:08.917 [2024-07-25 06:45:22.246779] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e45df0 00:29:08.917 [2024-07-25 06:45:22.246870] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:08.917 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:08.917 "name": "raid_bdev1", 00:29:08.917 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:08.917 "strip_size_kb": 0, 00:29:08.917 "state": "online", 00:29:08.917 "raid_level": "raid1", 00:29:08.917 "superblock": true, 00:29:08.917 "num_base_bdevs": 2, 00:29:08.917 "num_base_bdevs_discovered": 2, 00:29:08.917 "num_base_bdevs_operational": 2, 00:29:08.917 "base_bdevs_list": [ 00:29:08.917 { 00:29:08.917 "name": "spare", 00:29:08.917 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:08.917 "is_configured": true, 00:29:08.917 "data_offset": 256, 00:29:08.917 "data_size": 7936 00:29:08.917 }, 00:29:08.917 { 00:29:08.917 "name": "BaseBdev2", 00:29:08.917 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:08.917 "is_configured": true, 00:29:08.917 "data_offset": 256, 00:29:08.917 "data_size": 7936 00:29:08.917 } 00:29:08.917 ] 00:29:08.917 }' 00:29:08.917 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:08.917 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:09.485 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:09.485 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:09.485 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:09.485 06:45:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:09.485 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:09.485 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.485 06:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:09.744 "name": "raid_bdev1", 00:29:09.744 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:09.744 "strip_size_kb": 0, 00:29:09.744 "state": "online", 00:29:09.744 "raid_level": "raid1", 00:29:09.744 "superblock": true, 00:29:09.744 "num_base_bdevs": 2, 00:29:09.744 "num_base_bdevs_discovered": 2, 00:29:09.744 "num_base_bdevs_operational": 2, 00:29:09.744 "base_bdevs_list": [ 00:29:09.744 { 00:29:09.744 "name": "spare", 00:29:09.744 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:09.744 "is_configured": true, 00:29:09.744 "data_offset": 256, 00:29:09.744 "data_size": 7936 00:29:09.744 }, 00:29:09.744 { 00:29:09.744 "name": "BaseBdev2", 00:29:09.744 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:09.744 "is_configured": true, 00:29:09.744 "data_offset": 256, 00:29:09.744 "data_size": 7936 00:29:09.744 } 00:29:09.744 ] 00:29:09.744 }' 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.744 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:10.002 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:29:10.002 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:10.279 [2024-07-25 06:45:23.728725] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.279 06:45:23 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.279 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.538 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.538 "name": "raid_bdev1", 00:29:10.538 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:10.538 "strip_size_kb": 0, 00:29:10.538 "state": "online", 00:29:10.538 "raid_level": "raid1", 00:29:10.538 "superblock": true, 00:29:10.538 "num_base_bdevs": 2, 00:29:10.538 "num_base_bdevs_discovered": 1, 00:29:10.538 "num_base_bdevs_operational": 1, 00:29:10.538 "base_bdevs_list": [ 00:29:10.538 { 00:29:10.538 "name": null, 00:29:10.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:10.538 "is_configured": false, 00:29:10.538 "data_offset": 256, 00:29:10.538 "data_size": 7936 00:29:10.538 }, 00:29:10.539 { 00:29:10.539 "name": "BaseBdev2", 00:29:10.539 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:10.539 "is_configured": true, 00:29:10.539 "data_offset": 256, 00:29:10.539 "data_size": 7936 00:29:10.539 } 00:29:10.539 ] 00:29:10.539 }' 00:29:10.539 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.539 06:45:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:11.106 06:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:11.365 [2024-07-25 06:45:24.755459] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:11.365 [2024-07-25 06:45:24.755590] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:11.365 [2024-07-25 06:45:24.755604] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
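[Editorial sketch, not part of the captured log] The check that verify_raid_bdev_process and verify_raid_bdev_state keep repeating in the xtrace above can be condensed as follows. This assumes the same rpc.py path and RPC socket (/var/tmp/spdk-raid.sock) used by this run; the jq filters are the ones visible in the trace, and raid_bdev1 is the array created by this test.

  # Fetch the current view of raid_bdev1 over the test's dedicated RPC socket.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # .process.type is "rebuild" and .process.target is "spare" while a rebuild is in flight;
  # both fall back to "none" when the array is idle.
  echo "$info" | jq -r '.process.type // "none"'
  echo "$info" | jq -r '.process.target // "none"'
  # Fields that verify_raid_bdev_state compares against its expected_state / raid_level /
  # operational-count arguments (online, raid1, discovered and operational member counts).
  echo "$info" | jq -r '.state, .raid_level, .num_base_bdevs_discovered, .num_base_bdevs_operational'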
00:29:11.365 [2024-07-25 06:45:24.755629] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:11.365 [2024-07-25 06:45:24.760175] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e47cf0 00:29:11.365 [2024-07-25 06:45:24.761400] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:11.365 06:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # sleep 1 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.303 06:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.562 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:12.562 "name": "raid_bdev1", 00:29:12.562 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:12.562 "strip_size_kb": 0, 00:29:12.562 "state": "online", 00:29:12.562 "raid_level": "raid1", 00:29:12.562 "superblock": true, 00:29:12.562 "num_base_bdevs": 2, 00:29:12.562 "num_base_bdevs_discovered": 2, 00:29:12.562 "num_base_bdevs_operational": 2, 00:29:12.562 "process": { 00:29:12.562 "type": "rebuild", 00:29:12.562 "target": "spare", 00:29:12.562 "progress": { 00:29:12.562 "blocks": 3072, 00:29:12.562 "percent": 38 00:29:12.562 } 00:29:12.562 }, 00:29:12.562 "base_bdevs_list": [ 00:29:12.562 { 00:29:12.562 "name": "spare", 00:29:12.562 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:12.562 "is_configured": true, 00:29:12.562 "data_offset": 256, 00:29:12.562 "data_size": 7936 00:29:12.562 }, 00:29:12.562 { 00:29:12.562 "name": "BaseBdev2", 00:29:12.562 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:12.562 "is_configured": true, 00:29:12.562 "data_offset": 256, 00:29:12.562 "data_size": 7936 00:29:12.562 } 00:29:12.562 ] 00:29:12.562 }' 00:29:12.562 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:12.562 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:12.562 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:12.562 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:12.562 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:12.821 [2024-07-25 06:45:26.312521] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:12.821 [2024-07-25 06:45:26.373069] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:12.821 [2024-07-25 06:45:26.373110] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:12.821 [2024-07-25 06:45:26.373124] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:12.821 [2024-07-25 06:45:26.373132] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:13.081 "name": "raid_bdev1", 00:29:13.081 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:13.081 "strip_size_kb": 0, 00:29:13.081 "state": "online", 00:29:13.081 "raid_level": "raid1", 00:29:13.081 "superblock": true, 00:29:13.081 "num_base_bdevs": 2, 00:29:13.081 "num_base_bdevs_discovered": 1, 00:29:13.081 "num_base_bdevs_operational": 1, 00:29:13.081 "base_bdevs_list": [ 00:29:13.081 { 00:29:13.081 "name": null, 00:29:13.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:13.081 "is_configured": false, 00:29:13.081 "data_offset": 256, 00:29:13.081 "data_size": 7936 00:29:13.081 }, 00:29:13.081 { 00:29:13.081 "name": "BaseBdev2", 00:29:13.081 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:13.081 "is_configured": true, 00:29:13.081 "data_offset": 256, 00:29:13.081 "data_size": 7936 00:29:13.081 } 00:29:13.081 ] 00:29:13.081 }' 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:13.081 06:45:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:13.649 06:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:13.908 [2024-07-25 06:45:27.407898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:13.908 [2024-07-25 06:45:27.407945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:13.908 [2024-07-25 06:45:27.407967] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c99fa0 00:29:13.908 
[2024-07-25 06:45:27.407979] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:13.908 [2024-07-25 06:45:27.408343] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:13.908 [2024-07-25 06:45:27.408361] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:13.908 [2024-07-25 06:45:27.408434] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:13.908 [2024-07-25 06:45:27.408445] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:13.908 [2024-07-25 06:45:27.408454] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:13.908 [2024-07-25 06:45:27.408472] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:13.908 [2024-07-25 06:45:27.413091] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ca25a0 00:29:13.908 spare 00:29:13.908 [2024-07-25 06:45:27.414327] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:13.908 06:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # sleep 1 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.286 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:15.286 "name": "raid_bdev1", 00:29:15.286 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:15.286 "strip_size_kb": 0, 00:29:15.286 "state": "online", 00:29:15.286 "raid_level": "raid1", 00:29:15.286 "superblock": true, 00:29:15.286 "num_base_bdevs": 2, 00:29:15.286 "num_base_bdevs_discovered": 2, 00:29:15.286 "num_base_bdevs_operational": 2, 00:29:15.286 "process": { 00:29:15.286 "type": "rebuild", 00:29:15.286 "target": "spare", 00:29:15.286 "progress": { 00:29:15.286 "blocks": 3072, 00:29:15.286 "percent": 38 00:29:15.286 } 00:29:15.286 }, 00:29:15.286 "base_bdevs_list": [ 00:29:15.286 { 00:29:15.286 "name": "spare", 00:29:15.286 "uuid": "df25f693-1a76-5c76-803a-a83409c4af84", 00:29:15.286 "is_configured": true, 00:29:15.286 "data_offset": 256, 00:29:15.286 "data_size": 7936 00:29:15.286 }, 00:29:15.286 { 00:29:15.286 "name": "BaseBdev2", 00:29:15.287 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:15.287 "is_configured": true, 00:29:15.287 "data_offset": 256, 00:29:15.287 "data_size": 7936 00:29:15.287 } 00:29:15.287 ] 00:29:15.287 }' 00:29:15.287 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:15.287 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:15.287 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:15.287 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:15.287 06:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:15.545 [2024-07-25 06:45:28.961823] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:15.545 [2024-07-25 06:45:29.025954] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:15.545 [2024-07-25 06:45:29.025993] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:15.545 [2024-07-25 06:45:29.026007] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:15.545 [2024-07-25 06:45:29.026015] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.545 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.804 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:15.804 "name": "raid_bdev1", 00:29:15.804 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:15.804 "strip_size_kb": 0, 00:29:15.804 "state": "online", 00:29:15.804 "raid_level": "raid1", 00:29:15.804 "superblock": true, 00:29:15.804 "num_base_bdevs": 2, 00:29:15.804 "num_base_bdevs_discovered": 1, 00:29:15.804 "num_base_bdevs_operational": 1, 00:29:15.804 "base_bdevs_list": [ 00:29:15.804 { 00:29:15.804 "name": null, 00:29:15.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:15.804 "is_configured": false, 00:29:15.804 "data_offset": 256, 00:29:15.804 "data_size": 7936 00:29:15.804 }, 00:29:15.804 { 00:29:15.804 "name": "BaseBdev2", 00:29:15.804 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:15.804 "is_configured": true, 00:29:15.804 "data_offset": 256, 00:29:15.804 "data_size": 7936 00:29:15.804 } 00:29:15.804 ] 00:29:15.804 }' 
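[Editorial sketch, not part of the captured log] The delete/re-create cycle on the "spare" passthru bdev that drives these rebuild paths reduces to two RPC calls. This is a sketch under the same naming (spare_delay base bdev, spare passthru) and socket as the run above, not a verbatim excerpt of bdev_raid.sh.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # Deleting the passthru pulls the member out of raid_bdev1; done while a rebuild is targeting
  # it, this produces the "Failed to remove target bdev: No such device" path logged above.
  $rpc -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
  # Re-creating it on top of spare_delay lets bdev examine find the RAID superblock again and
  # re-add the member ("Re-adding bdev spare to raid bdev raid_bdev1"), starting a fresh rebuild.
  $rpc -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare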
00:29:15.804 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:15.804 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:16.372 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:16.373 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:16.373 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:16.373 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:16.373 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:16.373 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.373 06:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:16.632 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:16.632 "name": "raid_bdev1", 00:29:16.632 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:16.632 "strip_size_kb": 0, 00:29:16.632 "state": "online", 00:29:16.632 "raid_level": "raid1", 00:29:16.632 "superblock": true, 00:29:16.632 "num_base_bdevs": 2, 00:29:16.632 "num_base_bdevs_discovered": 1, 00:29:16.632 "num_base_bdevs_operational": 1, 00:29:16.632 "base_bdevs_list": [ 00:29:16.632 { 00:29:16.632 "name": null, 00:29:16.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:16.632 "is_configured": false, 00:29:16.632 "data_offset": 256, 00:29:16.632 "data_size": 7936 00:29:16.632 }, 00:29:16.632 { 00:29:16.632 "name": "BaseBdev2", 00:29:16.632 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:16.632 "is_configured": true, 00:29:16.632 "data_offset": 256, 00:29:16.632 "data_size": 7936 00:29:16.632 } 00:29:16.632 ] 00:29:16.632 }' 00:29:16.632 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:16.632 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:16.632 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:16.632 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:16.632 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:16.891 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:17.150 [2024-07-25 06:45:30.590363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:17.150 [2024-07-25 06:45:30.590408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.150 [2024-07-25 06:45:30.590427] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c9c7c0 00:29:17.150 [2024-07-25 06:45:30.590438] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.150 [2024-07-25 06:45:30.590739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:29:17.150 [2024-07-25 06:45:30.590755] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:17.150 [2024-07-25 06:45:30.590811] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:17.150 [2024-07-25 06:45:30.590821] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:17.150 [2024-07-25 06:45:30.590830] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:17.150 BaseBdev1 00:29:17.150 06:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@789 -- # sleep 1 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.086 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.344 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:18.344 "name": "raid_bdev1", 00:29:18.344 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:18.344 "strip_size_kb": 0, 00:29:18.344 "state": "online", 00:29:18.344 "raid_level": "raid1", 00:29:18.344 "superblock": true, 00:29:18.344 "num_base_bdevs": 2, 00:29:18.344 "num_base_bdevs_discovered": 1, 00:29:18.344 "num_base_bdevs_operational": 1, 00:29:18.344 "base_bdevs_list": [ 00:29:18.344 { 00:29:18.344 "name": null, 00:29:18.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:18.344 "is_configured": false, 00:29:18.344 "data_offset": 256, 00:29:18.344 "data_size": 7936 00:29:18.344 }, 00:29:18.344 { 00:29:18.344 "name": "BaseBdev2", 00:29:18.344 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:18.344 "is_configured": true, 00:29:18.344 "data_offset": 256, 00:29:18.344 "data_size": 7936 00:29:18.344 } 00:29:18.344 ] 00:29:18.344 }' 00:29:18.344 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:18.344 06:45:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.911 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.169 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:19.169 "name": "raid_bdev1", 00:29:19.169 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:19.169 "strip_size_kb": 0, 00:29:19.169 "state": "online", 00:29:19.169 "raid_level": "raid1", 00:29:19.169 "superblock": true, 00:29:19.169 "num_base_bdevs": 2, 00:29:19.169 "num_base_bdevs_discovered": 1, 00:29:19.169 "num_base_bdevs_operational": 1, 00:29:19.169 "base_bdevs_list": [ 00:29:19.169 { 00:29:19.169 "name": null, 00:29:19.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.169 "is_configured": false, 00:29:19.169 "data_offset": 256, 00:29:19.169 "data_size": 7936 00:29:19.169 }, 00:29:19.169 { 00:29:19.169 "name": "BaseBdev2", 00:29:19.169 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:19.169 "is_configured": true, 00:29:19.169 "data_offset": 256, 00:29:19.169 "data_size": 7936 00:29:19.169 } 00:29:19.169 ] 00:29:19.169 }' 00:29:19.169 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:19.169 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:19.169 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:19.428 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:19.428 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:19.428 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # local es=0 00:29:19.428 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case 
"$(type -t "$arg")" in 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:19.429 [2024-07-25 06:45:32.940566] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:19.429 [2024-07-25 06:45:32.940677] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:19.429 [2024-07-25 06:45:32.940691] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:19.429 request: 00:29:19.429 { 00:29:19.429 "base_bdev": "BaseBdev1", 00:29:19.429 "raid_bdev": "raid_bdev1", 00:29:19.429 "method": "bdev_raid_add_base_bdev", 00:29:19.429 "req_id": 1 00:29:19.429 } 00:29:19.429 Got JSON-RPC error response 00:29:19.429 response: 00:29:19.429 { 00:29:19.429 "code": -22, 00:29:19.429 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:19.429 } 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # es=1 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:29:19.429 06:45:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@793 -- # sleep 1 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.805 06:45:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.805 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:20.805 "name": "raid_bdev1", 
00:29:20.805 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:20.805 "strip_size_kb": 0, 00:29:20.805 "state": "online", 00:29:20.805 "raid_level": "raid1", 00:29:20.805 "superblock": true, 00:29:20.805 "num_base_bdevs": 2, 00:29:20.805 "num_base_bdevs_discovered": 1, 00:29:20.805 "num_base_bdevs_operational": 1, 00:29:20.805 "base_bdevs_list": [ 00:29:20.805 { 00:29:20.805 "name": null, 00:29:20.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.805 "is_configured": false, 00:29:20.805 "data_offset": 256, 00:29:20.805 "data_size": 7936 00:29:20.805 }, 00:29:20.805 { 00:29:20.805 "name": "BaseBdev2", 00:29:20.805 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:20.805 "is_configured": true, 00:29:20.805 "data_offset": 256, 00:29:20.805 "data_size": 7936 00:29:20.805 } 00:29:20.805 ] 00:29:20.805 }' 00:29:20.805 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:20.805 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.373 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.632 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:21.632 "name": "raid_bdev1", 00:29:21.632 "uuid": "d4a9cd92-f880-439f-84ed-51ebeee6e205", 00:29:21.632 "strip_size_kb": 0, 00:29:21.632 "state": "online", 00:29:21.632 "raid_level": "raid1", 00:29:21.632 "superblock": true, 00:29:21.632 "num_base_bdevs": 2, 00:29:21.632 "num_base_bdevs_discovered": 1, 00:29:21.632 "num_base_bdevs_operational": 1, 00:29:21.632 "base_bdevs_list": [ 00:29:21.632 { 00:29:21.632 "name": null, 00:29:21.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:21.632 "is_configured": false, 00:29:21.632 "data_offset": 256, 00:29:21.632 "data_size": 7936 00:29:21.632 }, 00:29:21.632 { 00:29:21.632 "name": "BaseBdev2", 00:29:21.632 "uuid": "481c2845-95ec-5ee3-b68f-168b020669c5", 00:29:21.632 "is_configured": true, 00:29:21.632 "data_offset": 256, 00:29:21.632 "data_size": 7936 00:29:21.632 } 00:29:21.632 ] 00:29:21.632 }' 00:29:21.632 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:21.632 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:21.632 06:45:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@798 -- # killprocess 1264700 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- 
# '[' -z 1264700 ']' 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1264700 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1264700 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:21.632 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1264700' 00:29:21.633 killing process with pid 1264700 00:29:21.633 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1264700 00:29:21.633 Received shutdown signal, test time was about 60.000000 seconds 00:29:21.633 00:29:21.633 Latency(us) 00:29:21.633 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:21.633 =================================================================================================================== 00:29:21.633 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:21.633 [2024-07-25 06:45:35.085856] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:21.633 [2024-07-25 06:45:35.085937] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:21.633 [2024-07-25 06:45:35.085979] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:21.633 [2024-07-25 06:45:35.085990] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e45df0 name raid_bdev1, state offline 00:29:21.633 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@974 -- # wait 1264700 00:29:21.633 [2024-07-25 06:45:35.109003] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:21.891 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@800 -- # return 0 00:29:21.891 00:29:21.891 real 0m29.983s 00:29:21.891 user 0m46.388s 00:29:21.891 sys 0m4.868s 00:29:21.891 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:21.891 06:45:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:21.891 ************************************ 00:29:21.891 END TEST raid_rebuild_test_sb_4k 00:29:21.891 ************************************ 00:29:21.891 06:45:35 bdev_raid -- bdev/bdev_raid.sh@984 -- # base_malloc_params='-m 32' 00:29:21.891 06:45:35 bdev_raid -- bdev/bdev_raid.sh@985 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:29:21.891 06:45:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:29:21.891 06:45:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:21.891 06:45:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:21.891 ************************************ 00:29:21.891 START TEST raid_state_function_test_sb_md_separate 00:29:21.891 ************************************ 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1270236 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1270236' 00:29:21.891 Process raid pid: 1270236 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1270236 /var/tmp/spdk-raid.sock 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1270236 ']' 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:21.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:21.891 06:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:21.891 [2024-07-25 06:45:35.435536] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:29:21.891 [2024-07-25 06:45:35.435590] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:22.150 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:22.150 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:22.150 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:22.150 [2024-07-25 06:45:35.570943] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:22.150 [2024-07-25 06:45:35.615076] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:22.150 [2024-07-25 06:45:35.674697] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:22.150 [2024-07-25 06:45:35.674727] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:23.086 [2024-07-25 06:45:36.542948] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:23.086 [2024-07-25 06:45:36.542984] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:23.086 [2024-07-25 06:45:36.542993] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:23.086 [2024-07-25 06:45:36.543004] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
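The QAT allocation warnings above appear incidental to this run; the step that matters is that bdev_svc comes up and starts answering RPCs on the test socket. A rough, illustrative way to reproduce that step by hand is sketched below — the polling loop only approximates the test's waitforlisten helper, it is not a copy of it, and it assumes the SPDK workspace as the working directory:

./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
raid_pid=$!
# poll the UNIX-domain RPC socket until the app is ready to accept bdev_* calls
until ./scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done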
doesn't exist now 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.086 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:23.345 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.345 "name": "Existed_Raid", 00:29:23.345 "uuid": "e2a061bf-c61a-41a3-8d74-cda30f0446f6", 00:29:23.345 "strip_size_kb": 0, 00:29:23.345 "state": "configuring", 00:29:23.345 "raid_level": "raid1", 00:29:23.345 "superblock": true, 00:29:23.345 "num_base_bdevs": 2, 00:29:23.345 "num_base_bdevs_discovered": 0, 00:29:23.345 "num_base_bdevs_operational": 2, 00:29:23.345 "base_bdevs_list": [ 00:29:23.345 { 00:29:23.345 "name": "BaseBdev1", 00:29:23.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.345 "is_configured": false, 00:29:23.345 "data_offset": 0, 00:29:23.345 "data_size": 0 00:29:23.345 }, 00:29:23.345 { 00:29:23.345 "name": "BaseBdev2", 00:29:23.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.345 "is_configured": false, 00:29:23.345 "data_offset": 0, 00:29:23.345 "data_size": 0 00:29:23.345 } 00:29:23.345 ] 00:29:23.345 }' 00:29:23.345 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.345 06:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:23.912 06:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:24.173 [2024-07-25 06:45:37.545467] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:24.173 [2024-07-25 06:45:37.545492] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x120a470 name Existed_Raid, state configuring 00:29:24.173 06:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:24.483 [2024-07-25 06:45:37.774092] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:24.483 [2024-07-25 06:45:37.774121] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:24.483 [2024-07-25 06:45:37.774130] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:24.483 [2024-07-25 06:45:37.774146] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:24.483 06:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:29:24.483 [2024-07-25 06:45:38.012742] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:24.483 BaseBdev1 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:24.483 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:24.740 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:24.998 [ 00:29:24.998 { 00:29:24.998 "name": "BaseBdev1", 00:29:24.998 "aliases": [ 00:29:24.998 "2178c676-5811-4988-b226-a7dc9611ee4b" 00:29:24.998 ], 00:29:24.998 "product_name": "Malloc disk", 00:29:24.998 "block_size": 4096, 00:29:24.998 "num_blocks": 8192, 00:29:24.998 "uuid": "2178c676-5811-4988-b226-a7dc9611ee4b", 00:29:24.998 "md_size": 32, 00:29:24.998 "md_interleave": false, 00:29:24.998 "dif_type": 0, 00:29:24.998 "assigned_rate_limits": { 00:29:24.998 "rw_ios_per_sec": 0, 00:29:24.998 "rw_mbytes_per_sec": 0, 00:29:24.998 "r_mbytes_per_sec": 0, 00:29:24.998 "w_mbytes_per_sec": 0 00:29:24.998 }, 00:29:24.998 "claimed": true, 00:29:24.998 "claim_type": "exclusive_write", 00:29:24.998 "zoned": false, 00:29:24.998 "supported_io_types": { 00:29:24.998 "read": true, 00:29:24.998 "write": true, 00:29:24.998 "unmap": true, 00:29:24.998 "flush": true, 00:29:24.998 "reset": true, 00:29:24.998 "nvme_admin": false, 00:29:24.998 "nvme_io": false, 00:29:24.998 "nvme_io_md": false, 00:29:24.998 "write_zeroes": true, 00:29:24.998 "zcopy": true, 00:29:24.998 "get_zone_info": false, 00:29:24.998 "zone_management": false, 00:29:24.998 "zone_append": false, 00:29:24.998 "compare": false, 00:29:24.998 "compare_and_write": false, 00:29:24.998 "abort": 
true, 00:29:24.998 "seek_hole": false, 00:29:24.998 "seek_data": false, 00:29:24.998 "copy": true, 00:29:24.998 "nvme_iov_md": false 00:29:24.998 }, 00:29:24.998 "memory_domains": [ 00:29:24.998 { 00:29:24.998 "dma_device_id": "system", 00:29:24.998 "dma_device_type": 1 00:29:24.998 }, 00:29:24.998 { 00:29:24.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:24.998 "dma_device_type": 2 00:29:24.998 } 00:29:24.998 ], 00:29:24.998 "driver_specific": {} 00:29:24.998 } 00:29:24.998 ] 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.998 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:25.256 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.256 "name": "Existed_Raid", 00:29:25.256 "uuid": "4a4e342d-9e07-4a7f-ac3d-a00309602d24", 00:29:25.256 "strip_size_kb": 0, 00:29:25.256 "state": "configuring", 00:29:25.256 "raid_level": "raid1", 00:29:25.256 "superblock": true, 00:29:25.256 "num_base_bdevs": 2, 00:29:25.256 "num_base_bdevs_discovered": 1, 00:29:25.256 "num_base_bdevs_operational": 2, 00:29:25.256 "base_bdevs_list": [ 00:29:25.256 { 00:29:25.256 "name": "BaseBdev1", 00:29:25.256 "uuid": "2178c676-5811-4988-b226-a7dc9611ee4b", 00:29:25.256 "is_configured": true, 00:29:25.256 "data_offset": 256, 00:29:25.256 "data_size": 7936 00:29:25.256 }, 00:29:25.256 { 00:29:25.256 "name": "BaseBdev2", 00:29:25.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.256 "is_configured": false, 00:29:25.256 "data_offset": 0, 00:29:25.256 "data_size": 0 00:29:25.256 } 00:29:25.256 ] 00:29:25.256 }' 00:29:25.256 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.256 06:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:25.822 06:45:39 
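Everything the xtrace output above drives goes through scripts/rpc.py against the same socket; stripped of the test scaffolding, the create-and-inspect sequence looks roughly like the sketch below. It reuses the parameters visible in the trace; the rpc helper function and the final jq filter on .state are added here purely for illustration and assume the app started as shown earlier:

rpc() { ./scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
# 4 KiB blocks with 32 bytes of separate metadata, as used by the md_separate variants
rpc bdev_malloc_create 32 4096 -m 32 -b BaseBdev1
rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
# with only BaseBdev1 present the array stays in the "configuring" state
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'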
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:26.081 [2024-07-25 06:45:39.440588] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:26.081 [2024-07-25 06:45:39.440622] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1209ce0 name Existed_Raid, state configuring 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:26.081 [2024-07-25 06:45:39.613077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:26.081 [2024-07-25 06:45:39.614427] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:26.081 [2024-07-25 06:45:39.614457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:26.081 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:26.339 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.339 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:26.339 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:26.339 "name": "Existed_Raid", 00:29:26.339 "uuid": "03d27d9e-f84f-4b6e-bf0e-05cf883ac859", 00:29:26.339 "strip_size_kb": 0, 00:29:26.339 "state": "configuring", 00:29:26.339 "raid_level": "raid1", 00:29:26.339 "superblock": true, 00:29:26.339 "num_base_bdevs": 2, 00:29:26.339 "num_base_bdevs_discovered": 1, 00:29:26.339 "num_base_bdevs_operational": 2, 00:29:26.339 "base_bdevs_list": [ 
00:29:26.339 { 00:29:26.339 "name": "BaseBdev1", 00:29:26.339 "uuid": "2178c676-5811-4988-b226-a7dc9611ee4b", 00:29:26.339 "is_configured": true, 00:29:26.339 "data_offset": 256, 00:29:26.339 "data_size": 7936 00:29:26.339 }, 00:29:26.339 { 00:29:26.339 "name": "BaseBdev2", 00:29:26.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.339 "is_configured": false, 00:29:26.339 "data_offset": 0, 00:29:26.339 "data_size": 0 00:29:26.339 } 00:29:26.339 ] 00:29:26.339 }' 00:29:26.339 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:26.339 06:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:26.907 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:29:27.166 [2024-07-25 06:45:40.591383] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:27.166 [2024-07-25 06:45:40.591506] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13a6480 00:29:27.166 [2024-07-25 06:45:40.591518] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:27.166 [2024-07-25 06:45:40.591574] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1209360 00:29:27.166 [2024-07-25 06:45:40.591669] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13a6480 00:29:27.166 [2024-07-25 06:45:40.591678] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13a6480 00:29:27.166 [2024-07-25 06:45:40.591739] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:27.166 BaseBdev2 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:27.166 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:27.424 06:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:27.683 [ 00:29:27.683 { 00:29:27.683 "name": "BaseBdev2", 00:29:27.683 "aliases": [ 00:29:27.683 "4b6c8918-147e-404d-a92a-3d778cbb3199" 00:29:27.683 ], 00:29:27.683 "product_name": "Malloc disk", 00:29:27.683 "block_size": 4096, 00:29:27.683 "num_blocks": 8192, 00:29:27.683 "uuid": "4b6c8918-147e-404d-a92a-3d778cbb3199", 00:29:27.683 "md_size": 32, 00:29:27.683 "md_interleave": false, 00:29:27.683 "dif_type": 0, 00:29:27.683 "assigned_rate_limits": { 00:29:27.683 
"rw_ios_per_sec": 0, 00:29:27.683 "rw_mbytes_per_sec": 0, 00:29:27.683 "r_mbytes_per_sec": 0, 00:29:27.683 "w_mbytes_per_sec": 0 00:29:27.683 }, 00:29:27.683 "claimed": true, 00:29:27.683 "claim_type": "exclusive_write", 00:29:27.683 "zoned": false, 00:29:27.683 "supported_io_types": { 00:29:27.683 "read": true, 00:29:27.683 "write": true, 00:29:27.683 "unmap": true, 00:29:27.683 "flush": true, 00:29:27.683 "reset": true, 00:29:27.683 "nvme_admin": false, 00:29:27.683 "nvme_io": false, 00:29:27.683 "nvme_io_md": false, 00:29:27.683 "write_zeroes": true, 00:29:27.683 "zcopy": true, 00:29:27.683 "get_zone_info": false, 00:29:27.683 "zone_management": false, 00:29:27.683 "zone_append": false, 00:29:27.683 "compare": false, 00:29:27.683 "compare_and_write": false, 00:29:27.683 "abort": true, 00:29:27.683 "seek_hole": false, 00:29:27.683 "seek_data": false, 00:29:27.683 "copy": true, 00:29:27.683 "nvme_iov_md": false 00:29:27.683 }, 00:29:27.683 "memory_domains": [ 00:29:27.683 { 00:29:27.683 "dma_device_id": "system", 00:29:27.683 "dma_device_type": 1 00:29:27.683 }, 00:29:27.683 { 00:29:27.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:27.683 "dma_device_type": 2 00:29:27.683 } 00:29:27.683 ], 00:29:27.683 "driver_specific": {} 00:29:27.683 } 00:29:27.683 ] 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.683 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:27.942 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.942 "name": "Existed_Raid", 00:29:27.942 "uuid": "03d27d9e-f84f-4b6e-bf0e-05cf883ac859", 00:29:27.942 "strip_size_kb": 0, 
00:29:27.942 "state": "online", 00:29:27.942 "raid_level": "raid1", 00:29:27.942 "superblock": true, 00:29:27.942 "num_base_bdevs": 2, 00:29:27.942 "num_base_bdevs_discovered": 2, 00:29:27.942 "num_base_bdevs_operational": 2, 00:29:27.942 "base_bdevs_list": [ 00:29:27.942 { 00:29:27.942 "name": "BaseBdev1", 00:29:27.942 "uuid": "2178c676-5811-4988-b226-a7dc9611ee4b", 00:29:27.942 "is_configured": true, 00:29:27.942 "data_offset": 256, 00:29:27.942 "data_size": 7936 00:29:27.942 }, 00:29:27.942 { 00:29:27.942 "name": "BaseBdev2", 00:29:27.942 "uuid": "4b6c8918-147e-404d-a92a-3d778cbb3199", 00:29:27.942 "is_configured": true, 00:29:27.942 "data_offset": 256, 00:29:27.942 "data_size": 7936 00:29:27.942 } 00:29:27.942 ] 00:29:27.942 }' 00:29:27.942 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.942 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:28.508 06:45:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:28.767 [2024-07-25 06:45:42.067548] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:28.767 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:28.767 "name": "Existed_Raid", 00:29:28.767 "aliases": [ 00:29:28.767 "03d27d9e-f84f-4b6e-bf0e-05cf883ac859" 00:29:28.767 ], 00:29:28.767 "product_name": "Raid Volume", 00:29:28.767 "block_size": 4096, 00:29:28.767 "num_blocks": 7936, 00:29:28.767 "uuid": "03d27d9e-f84f-4b6e-bf0e-05cf883ac859", 00:29:28.767 "md_size": 32, 00:29:28.767 "md_interleave": false, 00:29:28.767 "dif_type": 0, 00:29:28.767 "assigned_rate_limits": { 00:29:28.767 "rw_ios_per_sec": 0, 00:29:28.767 "rw_mbytes_per_sec": 0, 00:29:28.767 "r_mbytes_per_sec": 0, 00:29:28.767 "w_mbytes_per_sec": 0 00:29:28.767 }, 00:29:28.767 "claimed": false, 00:29:28.767 "zoned": false, 00:29:28.767 "supported_io_types": { 00:29:28.767 "read": true, 00:29:28.767 "write": true, 00:29:28.767 "unmap": false, 00:29:28.767 "flush": false, 00:29:28.767 "reset": true, 00:29:28.767 "nvme_admin": false, 00:29:28.767 "nvme_io": false, 00:29:28.767 "nvme_io_md": false, 00:29:28.767 "write_zeroes": true, 00:29:28.767 "zcopy": false, 00:29:28.767 "get_zone_info": false, 00:29:28.767 "zone_management": false, 00:29:28.767 "zone_append": false, 00:29:28.767 "compare": false, 00:29:28.767 "compare_and_write": false, 00:29:28.767 "abort": false, 00:29:28.767 "seek_hole": false, 
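The same query pattern doubles as the property check that follows in the trace: fetch the raid bdev once, then assert on individual fields. A condensed sketch of those assertions, using the values reported above — field names are taken from the JSON as printed, while the rpc helper, the variable names, and the [[ ]] tests merely stand in for the framework's own comparisons:

rpc() { ./scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
raid_info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
[[ $(jq -r .state <<< "$raid_info") == online ]]
[[ $(jq -r .num_base_bdevs_discovered <<< "$raid_info") -eq 2 ]]
bdev_info=$(rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
[[ $(jq .block_size <<< "$bdev_info") -eq 4096 ]]   # 4 KiB data blocks
[[ $(jq .md_size <<< "$bdev_info") -eq 32 ]]        # separate 32-byte metadata area
[[ $(jq .md_interleave <<< "$bdev_info") == false ]]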
00:29:28.767 "seek_data": false, 00:29:28.767 "copy": false, 00:29:28.767 "nvme_iov_md": false 00:29:28.767 }, 00:29:28.767 "memory_domains": [ 00:29:28.767 { 00:29:28.767 "dma_device_id": "system", 00:29:28.767 "dma_device_type": 1 00:29:28.767 }, 00:29:28.767 { 00:29:28.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:28.767 "dma_device_type": 2 00:29:28.767 }, 00:29:28.767 { 00:29:28.767 "dma_device_id": "system", 00:29:28.767 "dma_device_type": 1 00:29:28.767 }, 00:29:28.767 { 00:29:28.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:28.767 "dma_device_type": 2 00:29:28.767 } 00:29:28.767 ], 00:29:28.767 "driver_specific": { 00:29:28.767 "raid": { 00:29:28.767 "uuid": "03d27d9e-f84f-4b6e-bf0e-05cf883ac859", 00:29:28.767 "strip_size_kb": 0, 00:29:28.767 "state": "online", 00:29:28.767 "raid_level": "raid1", 00:29:28.767 "superblock": true, 00:29:28.767 "num_base_bdevs": 2, 00:29:28.767 "num_base_bdevs_discovered": 2, 00:29:28.767 "num_base_bdevs_operational": 2, 00:29:28.767 "base_bdevs_list": [ 00:29:28.767 { 00:29:28.767 "name": "BaseBdev1", 00:29:28.768 "uuid": "2178c676-5811-4988-b226-a7dc9611ee4b", 00:29:28.768 "is_configured": true, 00:29:28.768 "data_offset": 256, 00:29:28.768 "data_size": 7936 00:29:28.768 }, 00:29:28.768 { 00:29:28.768 "name": "BaseBdev2", 00:29:28.768 "uuid": "4b6c8918-147e-404d-a92a-3d778cbb3199", 00:29:28.768 "is_configured": true, 00:29:28.768 "data_offset": 256, 00:29:28.768 "data_size": 7936 00:29:28.768 } 00:29:28.768 ] 00:29:28.768 } 00:29:28.768 } 00:29:28.768 }' 00:29:28.768 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:28.768 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:28.768 BaseBdev2' 00:29:28.768 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:28.768 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:28.768 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:29.027 "name": "BaseBdev1", 00:29:29.027 "aliases": [ 00:29:29.027 "2178c676-5811-4988-b226-a7dc9611ee4b" 00:29:29.027 ], 00:29:29.027 "product_name": "Malloc disk", 00:29:29.027 "block_size": 4096, 00:29:29.027 "num_blocks": 8192, 00:29:29.027 "uuid": "2178c676-5811-4988-b226-a7dc9611ee4b", 00:29:29.027 "md_size": 32, 00:29:29.027 "md_interleave": false, 00:29:29.027 "dif_type": 0, 00:29:29.027 "assigned_rate_limits": { 00:29:29.027 "rw_ios_per_sec": 0, 00:29:29.027 "rw_mbytes_per_sec": 0, 00:29:29.027 "r_mbytes_per_sec": 0, 00:29:29.027 "w_mbytes_per_sec": 0 00:29:29.027 }, 00:29:29.027 "claimed": true, 00:29:29.027 "claim_type": "exclusive_write", 00:29:29.027 "zoned": false, 00:29:29.027 "supported_io_types": { 00:29:29.027 "read": true, 00:29:29.027 "write": true, 00:29:29.027 "unmap": true, 00:29:29.027 "flush": true, 00:29:29.027 "reset": true, 00:29:29.027 "nvme_admin": false, 00:29:29.027 "nvme_io": false, 00:29:29.027 "nvme_io_md": false, 00:29:29.027 "write_zeroes": true, 00:29:29.027 "zcopy": true, 00:29:29.027 "get_zone_info": false, 00:29:29.027 
"zone_management": false, 00:29:29.027 "zone_append": false, 00:29:29.027 "compare": false, 00:29:29.027 "compare_and_write": false, 00:29:29.027 "abort": true, 00:29:29.027 "seek_hole": false, 00:29:29.027 "seek_data": false, 00:29:29.027 "copy": true, 00:29:29.027 "nvme_iov_md": false 00:29:29.027 }, 00:29:29.027 "memory_domains": [ 00:29:29.027 { 00:29:29.027 "dma_device_id": "system", 00:29:29.027 "dma_device_type": 1 00:29:29.027 }, 00:29:29.027 { 00:29:29.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:29.027 "dma_device_type": 2 00:29:29.027 } 00:29:29.027 ], 00:29:29.027 "driver_specific": {} 00:29:29.027 }' 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:29.027 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:29.286 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:29.545 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:29.545 "name": "BaseBdev2", 00:29:29.545 "aliases": [ 00:29:29.545 "4b6c8918-147e-404d-a92a-3d778cbb3199" 00:29:29.545 ], 00:29:29.545 "product_name": "Malloc disk", 00:29:29.545 "block_size": 4096, 00:29:29.545 "num_blocks": 8192, 00:29:29.545 "uuid": "4b6c8918-147e-404d-a92a-3d778cbb3199", 00:29:29.545 "md_size": 32, 00:29:29.545 "md_interleave": false, 00:29:29.545 "dif_type": 0, 00:29:29.545 "assigned_rate_limits": { 00:29:29.545 "rw_ios_per_sec": 0, 00:29:29.545 "rw_mbytes_per_sec": 0, 00:29:29.545 "r_mbytes_per_sec": 0, 00:29:29.545 "w_mbytes_per_sec": 0 00:29:29.545 }, 00:29:29.545 "claimed": true, 00:29:29.545 "claim_type": "exclusive_write", 00:29:29.545 "zoned": false, 00:29:29.545 "supported_io_types": { 00:29:29.545 "read": true, 00:29:29.545 "write": true, 00:29:29.545 "unmap": true, 00:29:29.545 "flush": true, 00:29:29.545 "reset": true, 00:29:29.545 "nvme_admin": false, 00:29:29.545 "nvme_io": false, 
00:29:29.545 "nvme_io_md": false, 00:29:29.545 "write_zeroes": true, 00:29:29.545 "zcopy": true, 00:29:29.545 "get_zone_info": false, 00:29:29.545 "zone_management": false, 00:29:29.545 "zone_append": false, 00:29:29.545 "compare": false, 00:29:29.545 "compare_and_write": false, 00:29:29.545 "abort": true, 00:29:29.545 "seek_hole": false, 00:29:29.545 "seek_data": false, 00:29:29.545 "copy": true, 00:29:29.545 "nvme_iov_md": false 00:29:29.545 }, 00:29:29.545 "memory_domains": [ 00:29:29.545 { 00:29:29.545 "dma_device_id": "system", 00:29:29.545 "dma_device_type": 1 00:29:29.545 }, 00:29:29.545 { 00:29:29.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:29.545 "dma_device_type": 2 00:29:29.545 } 00:29:29.545 ], 00:29:29.545 "driver_specific": {} 00:29:29.545 }' 00:29:29.545 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:29.545 06:45:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:29.545 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:29.545 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:29.545 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:29.804 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:29.804 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:29.804 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:29.805 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:29.805 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:29.805 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:29.805 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:29.805 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:30.062 [2024-07-25 06:45:43.503124] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:30.062 06:45:43 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:30.062 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:30.063 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:30.063 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:30.063 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:30.063 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.063 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:30.321 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:30.321 "name": "Existed_Raid", 00:29:30.321 "uuid": "03d27d9e-f84f-4b6e-bf0e-05cf883ac859", 00:29:30.321 "strip_size_kb": 0, 00:29:30.321 "state": "online", 00:29:30.321 "raid_level": "raid1", 00:29:30.321 "superblock": true, 00:29:30.321 "num_base_bdevs": 2, 00:29:30.321 "num_base_bdevs_discovered": 1, 00:29:30.321 "num_base_bdevs_operational": 1, 00:29:30.321 "base_bdevs_list": [ 00:29:30.321 { 00:29:30.321 "name": null, 00:29:30.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:30.321 "is_configured": false, 00:29:30.321 "data_offset": 256, 00:29:30.321 "data_size": 7936 00:29:30.321 }, 00:29:30.321 { 00:29:30.321 "name": "BaseBdev2", 00:29:30.321 "uuid": "4b6c8918-147e-404d-a92a-3d778cbb3199", 00:29:30.321 "is_configured": true, 00:29:30.321 "data_offset": 256, 00:29:30.321 "data_size": 7936 00:29:30.321 } 00:29:30.321 ] 00:29:30.321 }' 00:29:30.321 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:30.321 06:45:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:30.890 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:30.890 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:30.890 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.890 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:31.149 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:31.149 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:31.149 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:31.409 [2024-07-25 06:45:44.744679] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:31.409 [2024-07-25 06:45:44.744755] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:31.409 [2024-07-25 06:45:44.755612] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:31.409 [2024-07-25 06:45:44.755643] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:31.409 [2024-07-25 06:45:44.755653] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13a6480 name Existed_Raid, state offline 00:29:31.409 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:31.409 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:31.409 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:31.409 06:45:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1270236 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1270236 ']' 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1270236 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1270236 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1270236' 00:29:31.668 killing process with pid 1270236 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1270236 00:29:31.668 [2024-07-25 06:45:45.062841] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:31.668 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 1270236 00:29:31.668 [2024-07-25 06:45:45.063684] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:31.929 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:29:31.929 00:29:31.929 real 0m9.867s 00:29:31.929 user 0m17.472s 00:29:31.929 sys 0m1.938s 00:29:31.929 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:29:31.929 06:45:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:31.929 ************************************ 00:29:31.929 END TEST raid_state_function_test_sb_md_separate 00:29:31.929 ************************************ 00:29:31.929 06:45:45 bdev_raid -- bdev/bdev_raid.sh@986 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:29:31.929 06:45:45 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:29:31.929 06:45:45 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:31.929 06:45:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:31.929 ************************************ 00:29:31.929 START TEST raid_superblock_test_md_separate 00:29:31.929 ************************************ 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@414 -- # local strip_size 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@427 -- # raid_pid=1272085 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@428 -- # waitforlisten 1272085 /var/tmp/spdk-raid.sock 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1272085 ']' 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@836 -- # local max_retries=100 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:31.929 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:31.929 06:45:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:31.929 [2024-07-25 06:45:45.383733] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:29:31.929 [2024-07-25 06:45:45.383788] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1272085 ] 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.1 cannot be used 
00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.929 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:31.929 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.930 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:31.930 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:31.930 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:32.189 [2024-07-25 06:45:45.507775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:32.189 [2024-07-25 06:45:45.552373] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:32.189 [2024-07-25 06:45:45.610719] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:32.189 [2024-07-25 06:45:45.610755] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@864 -- # return 0 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:29:32.758 06:45:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:32.758 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:29:33.017 malloc1 00:29:33.017 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:33.276 [2024-07-25 06:45:46.718324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:33.276 [2024-07-25 06:45:46.718369] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:33.276 [2024-07-25 06:45:46.718389] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1026470 00:29:33.276 [2024-07-25 06:45:46.718401] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:33.276 [2024-07-25 06:45:46.719720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:33.276 [2024-07-25 06:45:46.719746] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:33.276 pt1 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:33.277 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:29:33.536 malloc2 00:29:33.536 06:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:33.795 [2024-07-25 06:45:47.172741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:33.795 [2024-07-25 06:45:47.172783] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:33.795 [2024-07-25 06:45:47.172801] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1167c90 00:29:33.795 [2024-07-25 06:45:47.172813] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:33.795 [2024-07-25 06:45:47.174107] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:33.795 [2024-07-25 06:45:47.174132] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:29:33.795 pt2 00:29:33.795 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:29:33.795 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:33.795 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:34.055 [2024-07-25 06:45:47.397348] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:34.055 [2024-07-25 06:45:47.398506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:34.055 [2024-07-25 06:45:47.398638] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1169540 00:29:34.055 [2024-07-25 06:45:47.398650] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:34.055 [2024-07-25 06:45:47.398713] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1169cb0 00:29:34.055 [2024-07-25 06:45:47.398813] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1169540 00:29:34.055 [2024-07-25 06:45:47.398822] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1169540 00:29:34.055 [2024-07-25 06:45:47.398885] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.055 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:34.314 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:34.314 "name": "raid_bdev1", 00:29:34.314 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:34.314 "strip_size_kb": 0, 00:29:34.314 "state": "online", 00:29:34.314 "raid_level": "raid1", 00:29:34.314 "superblock": true, 00:29:34.314 "num_base_bdevs": 2, 00:29:34.314 "num_base_bdevs_discovered": 2, 00:29:34.314 "num_base_bdevs_operational": 2, 
00:29:34.314 "base_bdevs_list": [ 00:29:34.314 { 00:29:34.314 "name": "pt1", 00:29:34.314 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:34.314 "is_configured": true, 00:29:34.314 "data_offset": 256, 00:29:34.314 "data_size": 7936 00:29:34.314 }, 00:29:34.314 { 00:29:34.314 "name": "pt2", 00:29:34.314 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:34.314 "is_configured": true, 00:29:34.314 "data_offset": 256, 00:29:34.314 "data_size": 7936 00:29:34.314 } 00:29:34.314 ] 00:29:34.314 }' 00:29:34.314 06:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:34.314 06:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:34.883 [2024-07-25 06:45:48.404215] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:34.883 "name": "raid_bdev1", 00:29:34.883 "aliases": [ 00:29:34.883 "e2022ad0-e3c1-4b15-8507-05b0cc5316e0" 00:29:34.883 ], 00:29:34.883 "product_name": "Raid Volume", 00:29:34.883 "block_size": 4096, 00:29:34.883 "num_blocks": 7936, 00:29:34.883 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:34.883 "md_size": 32, 00:29:34.883 "md_interleave": false, 00:29:34.883 "dif_type": 0, 00:29:34.883 "assigned_rate_limits": { 00:29:34.883 "rw_ios_per_sec": 0, 00:29:34.883 "rw_mbytes_per_sec": 0, 00:29:34.883 "r_mbytes_per_sec": 0, 00:29:34.883 "w_mbytes_per_sec": 0 00:29:34.883 }, 00:29:34.883 "claimed": false, 00:29:34.883 "zoned": false, 00:29:34.883 "supported_io_types": { 00:29:34.883 "read": true, 00:29:34.883 "write": true, 00:29:34.883 "unmap": false, 00:29:34.883 "flush": false, 00:29:34.883 "reset": true, 00:29:34.883 "nvme_admin": false, 00:29:34.883 "nvme_io": false, 00:29:34.883 "nvme_io_md": false, 00:29:34.883 "write_zeroes": true, 00:29:34.883 "zcopy": false, 00:29:34.883 "get_zone_info": false, 00:29:34.883 "zone_management": false, 00:29:34.883 "zone_append": false, 00:29:34.883 "compare": false, 00:29:34.883 "compare_and_write": false, 00:29:34.883 "abort": false, 00:29:34.883 "seek_hole": false, 00:29:34.883 "seek_data": false, 00:29:34.883 "copy": false, 00:29:34.883 "nvme_iov_md": false 00:29:34.883 }, 00:29:34.883 "memory_domains": [ 00:29:34.883 { 00:29:34.883 "dma_device_id": "system", 00:29:34.883 "dma_device_type": 1 00:29:34.883 }, 00:29:34.883 { 00:29:34.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:29:34.883 "dma_device_type": 2 00:29:34.883 }, 00:29:34.883 { 00:29:34.883 "dma_device_id": "system", 00:29:34.883 "dma_device_type": 1 00:29:34.883 }, 00:29:34.883 { 00:29:34.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.883 "dma_device_type": 2 00:29:34.883 } 00:29:34.883 ], 00:29:34.883 "driver_specific": { 00:29:34.883 "raid": { 00:29:34.883 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:34.883 "strip_size_kb": 0, 00:29:34.883 "state": "online", 00:29:34.883 "raid_level": "raid1", 00:29:34.883 "superblock": true, 00:29:34.883 "num_base_bdevs": 2, 00:29:34.883 "num_base_bdevs_discovered": 2, 00:29:34.883 "num_base_bdevs_operational": 2, 00:29:34.883 "base_bdevs_list": [ 00:29:34.883 { 00:29:34.883 "name": "pt1", 00:29:34.883 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:34.883 "is_configured": true, 00:29:34.883 "data_offset": 256, 00:29:34.883 "data_size": 7936 00:29:34.883 }, 00:29:34.883 { 00:29:34.883 "name": "pt2", 00:29:34.883 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:34.883 "is_configured": true, 00:29:34.883 "data_offset": 256, 00:29:34.883 "data_size": 7936 00:29:34.883 } 00:29:34.883 ] 00:29:34.883 } 00:29:34.883 } 00:29:34.883 }' 00:29:34.883 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:35.143 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:35.143 pt2' 00:29:35.143 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:35.143 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:35.143 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:35.143 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:35.143 "name": "pt1", 00:29:35.143 "aliases": [ 00:29:35.143 "00000000-0000-0000-0000-000000000001" 00:29:35.143 ], 00:29:35.143 "product_name": "passthru", 00:29:35.143 "block_size": 4096, 00:29:35.143 "num_blocks": 8192, 00:29:35.143 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:35.143 "md_size": 32, 00:29:35.143 "md_interleave": false, 00:29:35.143 "dif_type": 0, 00:29:35.143 "assigned_rate_limits": { 00:29:35.143 "rw_ios_per_sec": 0, 00:29:35.143 "rw_mbytes_per_sec": 0, 00:29:35.143 "r_mbytes_per_sec": 0, 00:29:35.143 "w_mbytes_per_sec": 0 00:29:35.143 }, 00:29:35.143 "claimed": true, 00:29:35.143 "claim_type": "exclusive_write", 00:29:35.143 "zoned": false, 00:29:35.143 "supported_io_types": { 00:29:35.143 "read": true, 00:29:35.143 "write": true, 00:29:35.143 "unmap": true, 00:29:35.143 "flush": true, 00:29:35.143 "reset": true, 00:29:35.143 "nvme_admin": false, 00:29:35.143 "nvme_io": false, 00:29:35.143 "nvme_io_md": false, 00:29:35.143 "write_zeroes": true, 00:29:35.143 "zcopy": true, 00:29:35.143 "get_zone_info": false, 00:29:35.143 "zone_management": false, 00:29:35.143 "zone_append": false, 00:29:35.143 "compare": false, 00:29:35.143 "compare_and_write": false, 00:29:35.143 "abort": true, 00:29:35.143 "seek_hole": false, 00:29:35.143 "seek_data": false, 00:29:35.143 "copy": true, 00:29:35.143 "nvme_iov_md": false 00:29:35.143 }, 00:29:35.143 "memory_domains": [ 00:29:35.143 { 00:29:35.143 "dma_device_id": "system", 00:29:35.143 
"dma_device_type": 1 00:29:35.143 }, 00:29:35.143 { 00:29:35.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.143 "dma_device_type": 2 00:29:35.143 } 00:29:35.143 ], 00:29:35.143 "driver_specific": { 00:29:35.143 "passthru": { 00:29:35.143 "name": "pt1", 00:29:35.143 "base_bdev_name": "malloc1" 00:29:35.143 } 00:29:35.143 } 00:29:35.143 }' 00:29:35.143 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.402 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:35.403 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.662 06:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.662 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:35.662 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:35.662 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:35.662 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:35.921 "name": "pt2", 00:29:35.921 "aliases": [ 00:29:35.921 "00000000-0000-0000-0000-000000000002" 00:29:35.921 ], 00:29:35.921 "product_name": "passthru", 00:29:35.921 "block_size": 4096, 00:29:35.921 "num_blocks": 8192, 00:29:35.921 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:35.921 "md_size": 32, 00:29:35.921 "md_interleave": false, 00:29:35.921 "dif_type": 0, 00:29:35.921 "assigned_rate_limits": { 00:29:35.921 "rw_ios_per_sec": 0, 00:29:35.921 "rw_mbytes_per_sec": 0, 00:29:35.921 "r_mbytes_per_sec": 0, 00:29:35.921 "w_mbytes_per_sec": 0 00:29:35.921 }, 00:29:35.921 "claimed": true, 00:29:35.921 "claim_type": "exclusive_write", 00:29:35.921 "zoned": false, 00:29:35.921 "supported_io_types": { 00:29:35.921 "read": true, 00:29:35.921 "write": true, 00:29:35.921 "unmap": true, 00:29:35.921 "flush": true, 00:29:35.921 "reset": true, 00:29:35.921 "nvme_admin": false, 00:29:35.921 "nvme_io": false, 00:29:35.921 "nvme_io_md": false, 00:29:35.921 "write_zeroes": true, 00:29:35.921 "zcopy": true, 00:29:35.921 "get_zone_info": false, 00:29:35.921 "zone_management": false, 00:29:35.921 "zone_append": false, 00:29:35.921 "compare": false, 00:29:35.921 "compare_and_write": false, 00:29:35.921 "abort": true, 00:29:35.921 "seek_hole": false, 00:29:35.921 "seek_data": false, 00:29:35.921 "copy": true, 
00:29:35.921 "nvme_iov_md": false 00:29:35.921 }, 00:29:35.921 "memory_domains": [ 00:29:35.921 { 00:29:35.921 "dma_device_id": "system", 00:29:35.921 "dma_device_type": 1 00:29:35.921 }, 00:29:35.921 { 00:29:35.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.921 "dma_device_type": 2 00:29:35.921 } 00:29:35.921 ], 00:29:35.921 "driver_specific": { 00:29:35.921 "passthru": { 00:29:35.921 "name": "pt2", 00:29:35.921 "base_bdev_name": "malloc2" 00:29:35.921 } 00:29:35.921 } 00:29:35.921 }' 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.921 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:36.180 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:36.180 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:36.180 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:36.180 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:36.180 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:29:36.180 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:36.439 [2024-07-25 06:45:49.803883] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:36.439 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=e2022ad0-e3c1-4b15-8507-05b0cc5316e0 00:29:36.439 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' -z e2022ad0-e3c1-4b15-8507-05b0cc5316e0 ']' 00:29:36.439 06:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:36.699 [2024-07-25 06:45:50.032237] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:36.699 [2024-07-25 06:45:50.032255] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:36.699 [2024-07-25 06:45:50.032304] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:36.699 [2024-07-25 06:45:50.032354] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:36.699 [2024-07-25 06:45:50.032365] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1169540 name raid_bdev1, state offline 00:29:36.699 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.699 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:29:36.959 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:29:36.959 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:29:36.959 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:29:36.959 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:36.959 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:29:36.960 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:37.255 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:37.255 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:37.514 06:45:50 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:37.773 [2024-07-25 06:45:51.167177] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:37.773 [2024-07-25 06:45:51.168413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:37.773 [2024-07-25 06:45:51.168465] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:37.773 [2024-07-25 06:45:51.168502] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:37.773 [2024-07-25 06:45:51.168519] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:37.773 [2024-07-25 06:45:51.168528] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1024eb0 name raid_bdev1, state configuring 00:29:37.773 request: 00:29:37.773 { 00:29:37.773 "name": "raid_bdev1", 00:29:37.773 "raid_level": "raid1", 00:29:37.773 "base_bdevs": [ 00:29:37.773 "malloc1", 00:29:37.773 "malloc2" 00:29:37.773 ], 00:29:37.773 "superblock": false, 00:29:37.773 "method": "bdev_raid_create", 00:29:37.773 "req_id": 1 00:29:37.773 } 00:29:37.773 Got JSON-RPC error response 00:29:37.773 response: 00:29:37.773 { 00:29:37.773 "code": -17, 00:29:37.773 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:37.773 } 00:29:37.773 06:45:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # es=1 00:29:37.773 06:45:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:29:37.773 06:45:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:29:37.773 06:45:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:29:37.773 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.773 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:29:38.032 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:29:38.032 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:29:38.033 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:38.292 [2024-07-25 06:45:51.628335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:38.292 [2024-07-25 06:45:51.628376] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:38.292 [2024-07-25 06:45:51.628392] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1026ff0 00:29:38.292 [2024-07-25 06:45:51.628404] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:38.292 [2024-07-25 06:45:51.629707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:38.292 [2024-07-25 06:45:51.629732] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:38.292 [2024-07-25 
06:45:51.629773] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:38.292 [2024-07-25 06:45:51.629794] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:38.292 pt1 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.292 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:38.551 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:38.551 "name": "raid_bdev1", 00:29:38.551 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:38.551 "strip_size_kb": 0, 00:29:38.551 "state": "configuring", 00:29:38.551 "raid_level": "raid1", 00:29:38.551 "superblock": true, 00:29:38.551 "num_base_bdevs": 2, 00:29:38.551 "num_base_bdevs_discovered": 1, 00:29:38.551 "num_base_bdevs_operational": 2, 00:29:38.551 "base_bdevs_list": [ 00:29:38.551 { 00:29:38.551 "name": "pt1", 00:29:38.551 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:38.551 "is_configured": true, 00:29:38.551 "data_offset": 256, 00:29:38.551 "data_size": 7936 00:29:38.551 }, 00:29:38.551 { 00:29:38.551 "name": null, 00:29:38.551 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:38.551 "is_configured": false, 00:29:38.551 "data_offset": 256, 00:29:38.551 "data_size": 7936 00:29:38.551 } 00:29:38.551 ] 00:29:38.551 }' 00:29:38.551 06:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:38.551 06:45:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:39.119 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:29:39.119 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:29:39.119 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:29:39.119 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:39.119 [2024-07-25 06:45:52.671162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:39.119 [2024-07-25 06:45:52.671208] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:39.119 [2024-07-25 06:45:52.671226] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1025210 00:29:39.119 [2024-07-25 06:45:52.671238] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:39.119 [2024-07-25 06:45:52.671408] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:39.119 [2024-07-25 06:45:52.671423] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:39.119 [2024-07-25 06:45:52.671461] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:39.119 [2024-07-25 06:45:52.671478] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:39.119 [2024-07-25 06:45:52.671558] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x116b520 00:29:39.119 [2024-07-25 06:45:52.671567] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:39.119 [2024-07-25 06:45:52.671619] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11c2500 00:29:39.119 [2024-07-25 06:45:52.671709] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116b520 00:29:39.119 [2024-07-25 06:45:52.671717] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116b520 00:29:39.119 [2024-07-25 06:45:52.671781] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:39.378 pt2 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.378 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:39.378 "name": "raid_bdev1", 00:29:39.378 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:39.378 "strip_size_kb": 0, 00:29:39.378 "state": "online", 00:29:39.378 "raid_level": "raid1", 00:29:39.378 "superblock": true, 00:29:39.378 "num_base_bdevs": 2, 00:29:39.378 "num_base_bdevs_discovered": 2, 00:29:39.378 "num_base_bdevs_operational": 2, 00:29:39.378 "base_bdevs_list": [ 00:29:39.378 { 00:29:39.378 "name": "pt1", 00:29:39.378 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:39.378 "is_configured": true, 00:29:39.378 "data_offset": 256, 00:29:39.378 "data_size": 7936 00:29:39.378 }, 00:29:39.379 { 00:29:39.379 "name": "pt2", 00:29:39.379 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:39.379 "is_configured": true, 00:29:39.379 "data_offset": 256, 00:29:39.379 "data_size": 7936 00:29:39.379 } 00:29:39.379 ] 00:29:39.379 }' 00:29:39.379 06:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:39.379 06:45:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:40.317 [2024-07-25 06:45:53.666017] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:40.317 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:40.317 "name": "raid_bdev1", 00:29:40.317 "aliases": [ 00:29:40.317 "e2022ad0-e3c1-4b15-8507-05b0cc5316e0" 00:29:40.317 ], 00:29:40.317 "product_name": "Raid Volume", 00:29:40.317 "block_size": 4096, 00:29:40.317 "num_blocks": 7936, 00:29:40.317 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:40.317 "md_size": 32, 00:29:40.318 "md_interleave": false, 00:29:40.318 "dif_type": 0, 00:29:40.318 "assigned_rate_limits": { 00:29:40.318 "rw_ios_per_sec": 0, 00:29:40.318 "rw_mbytes_per_sec": 0, 00:29:40.318 "r_mbytes_per_sec": 0, 00:29:40.318 "w_mbytes_per_sec": 0 00:29:40.318 }, 00:29:40.318 "claimed": false, 00:29:40.318 "zoned": false, 00:29:40.318 "supported_io_types": { 00:29:40.318 "read": true, 00:29:40.318 "write": true, 00:29:40.318 "unmap": false, 00:29:40.318 "flush": false, 00:29:40.318 "reset": true, 00:29:40.318 "nvme_admin": false, 00:29:40.318 "nvme_io": false, 00:29:40.318 "nvme_io_md": false, 00:29:40.318 "write_zeroes": true, 00:29:40.318 "zcopy": false, 00:29:40.318 "get_zone_info": 
false, 00:29:40.318 "zone_management": false, 00:29:40.318 "zone_append": false, 00:29:40.318 "compare": false, 00:29:40.318 "compare_and_write": false, 00:29:40.318 "abort": false, 00:29:40.318 "seek_hole": false, 00:29:40.318 "seek_data": false, 00:29:40.318 "copy": false, 00:29:40.318 "nvme_iov_md": false 00:29:40.318 }, 00:29:40.318 "memory_domains": [ 00:29:40.318 { 00:29:40.318 "dma_device_id": "system", 00:29:40.318 "dma_device_type": 1 00:29:40.318 }, 00:29:40.318 { 00:29:40.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.318 "dma_device_type": 2 00:29:40.318 }, 00:29:40.318 { 00:29:40.318 "dma_device_id": "system", 00:29:40.318 "dma_device_type": 1 00:29:40.318 }, 00:29:40.318 { 00:29:40.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.318 "dma_device_type": 2 00:29:40.318 } 00:29:40.318 ], 00:29:40.318 "driver_specific": { 00:29:40.318 "raid": { 00:29:40.318 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:40.318 "strip_size_kb": 0, 00:29:40.318 "state": "online", 00:29:40.318 "raid_level": "raid1", 00:29:40.318 "superblock": true, 00:29:40.318 "num_base_bdevs": 2, 00:29:40.318 "num_base_bdevs_discovered": 2, 00:29:40.318 "num_base_bdevs_operational": 2, 00:29:40.318 "base_bdevs_list": [ 00:29:40.318 { 00:29:40.318 "name": "pt1", 00:29:40.318 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:40.318 "is_configured": true, 00:29:40.318 "data_offset": 256, 00:29:40.318 "data_size": 7936 00:29:40.318 }, 00:29:40.318 { 00:29:40.318 "name": "pt2", 00:29:40.318 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:40.318 "is_configured": true, 00:29:40.318 "data_offset": 256, 00:29:40.318 "data_size": 7936 00:29:40.318 } 00:29:40.318 ] 00:29:40.318 } 00:29:40.318 } 00:29:40.318 }' 00:29:40.318 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:40.318 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:40.318 pt2' 00:29:40.318 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:40.318 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:40.318 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:40.576 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:40.577 "name": "pt1", 00:29:40.577 "aliases": [ 00:29:40.577 "00000000-0000-0000-0000-000000000001" 00:29:40.577 ], 00:29:40.577 "product_name": "passthru", 00:29:40.577 "block_size": 4096, 00:29:40.577 "num_blocks": 8192, 00:29:40.577 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:40.577 "md_size": 32, 00:29:40.577 "md_interleave": false, 00:29:40.577 "dif_type": 0, 00:29:40.577 "assigned_rate_limits": { 00:29:40.577 "rw_ios_per_sec": 0, 00:29:40.577 "rw_mbytes_per_sec": 0, 00:29:40.577 "r_mbytes_per_sec": 0, 00:29:40.577 "w_mbytes_per_sec": 0 00:29:40.577 }, 00:29:40.577 "claimed": true, 00:29:40.577 "claim_type": "exclusive_write", 00:29:40.577 "zoned": false, 00:29:40.577 "supported_io_types": { 00:29:40.577 "read": true, 00:29:40.577 "write": true, 00:29:40.577 "unmap": true, 00:29:40.577 "flush": true, 00:29:40.577 "reset": true, 00:29:40.577 "nvme_admin": false, 00:29:40.577 "nvme_io": false, 00:29:40.577 "nvme_io_md": 
false, 00:29:40.577 "write_zeroes": true, 00:29:40.577 "zcopy": true, 00:29:40.577 "get_zone_info": false, 00:29:40.577 "zone_management": false, 00:29:40.577 "zone_append": false, 00:29:40.577 "compare": false, 00:29:40.577 "compare_and_write": false, 00:29:40.577 "abort": true, 00:29:40.577 "seek_hole": false, 00:29:40.577 "seek_data": false, 00:29:40.577 "copy": true, 00:29:40.577 "nvme_iov_md": false 00:29:40.577 }, 00:29:40.577 "memory_domains": [ 00:29:40.577 { 00:29:40.577 "dma_device_id": "system", 00:29:40.577 "dma_device_type": 1 00:29:40.577 }, 00:29:40.577 { 00:29:40.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.577 "dma_device_type": 2 00:29:40.577 } 00:29:40.577 ], 00:29:40.577 "driver_specific": { 00:29:40.577 "passthru": { 00:29:40.577 "name": "pt1", 00:29:40.577 "base_bdev_name": "malloc1" 00:29:40.577 } 00:29:40.577 } 00:29:40.577 }' 00:29:40.577 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:40.577 06:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:40.577 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:40.577 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:40.577 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:40.577 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:40.577 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:40.835 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:41.094 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:41.094 "name": "pt2", 00:29:41.094 "aliases": [ 00:29:41.094 "00000000-0000-0000-0000-000000000002" 00:29:41.094 ], 00:29:41.094 "product_name": "passthru", 00:29:41.094 "block_size": 4096, 00:29:41.094 "num_blocks": 8192, 00:29:41.094 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:41.094 "md_size": 32, 00:29:41.094 "md_interleave": false, 00:29:41.094 "dif_type": 0, 00:29:41.094 "assigned_rate_limits": { 00:29:41.094 "rw_ios_per_sec": 0, 00:29:41.094 "rw_mbytes_per_sec": 0, 00:29:41.094 "r_mbytes_per_sec": 0, 00:29:41.094 "w_mbytes_per_sec": 0 00:29:41.094 }, 00:29:41.094 "claimed": true, 00:29:41.094 "claim_type": "exclusive_write", 00:29:41.094 "zoned": false, 00:29:41.094 "supported_io_types": { 00:29:41.094 "read": true, 00:29:41.094 "write": true, 00:29:41.094 "unmap": true, 
00:29:41.094 "flush": true, 00:29:41.094 "reset": true, 00:29:41.094 "nvme_admin": false, 00:29:41.094 "nvme_io": false, 00:29:41.094 "nvme_io_md": false, 00:29:41.094 "write_zeroes": true, 00:29:41.094 "zcopy": true, 00:29:41.094 "get_zone_info": false, 00:29:41.094 "zone_management": false, 00:29:41.094 "zone_append": false, 00:29:41.094 "compare": false, 00:29:41.094 "compare_and_write": false, 00:29:41.094 "abort": true, 00:29:41.094 "seek_hole": false, 00:29:41.094 "seek_data": false, 00:29:41.094 "copy": true, 00:29:41.094 "nvme_iov_md": false 00:29:41.094 }, 00:29:41.094 "memory_domains": [ 00:29:41.094 { 00:29:41.094 "dma_device_id": "system", 00:29:41.094 "dma_device_type": 1 00:29:41.094 }, 00:29:41.094 { 00:29:41.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.094 "dma_device_type": 2 00:29:41.094 } 00:29:41.094 ], 00:29:41.094 "driver_specific": { 00:29:41.094 "passthru": { 00:29:41.094 "name": "pt2", 00:29:41.094 "base_bdev_name": "malloc2" 00:29:41.094 } 00:29:41.094 } 00:29:41.094 }' 00:29:41.094 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.094 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.094 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:41.094 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:41.353 06:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:29:41.611 [2024-07-25 06:45:55.065692] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:41.611 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # '[' e2022ad0-e3c1-4b15-8507-05b0cc5316e0 '!=' e2022ad0-e3c1-4b15-8507-05b0cc5316e0 ']' 00:29:41.611 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:29:41.611 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:41.611 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:29:41.611 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:41.870 [2024-07-25 06:45:55.290071] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:41.870 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:41.871 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.129 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.129 "name": "raid_bdev1", 00:29:42.129 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:42.129 "strip_size_kb": 0, 00:29:42.129 "state": "online", 00:29:42.129 "raid_level": "raid1", 00:29:42.129 "superblock": true, 00:29:42.129 "num_base_bdevs": 2, 00:29:42.129 "num_base_bdevs_discovered": 1, 00:29:42.129 "num_base_bdevs_operational": 1, 00:29:42.129 "base_bdevs_list": [ 00:29:42.129 { 00:29:42.129 "name": null, 00:29:42.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.129 "is_configured": false, 00:29:42.129 "data_offset": 256, 00:29:42.129 "data_size": 7936 00:29:42.129 }, 00:29:42.129 { 00:29:42.129 "name": "pt2", 00:29:42.129 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:42.129 "is_configured": true, 00:29:42.129 "data_offset": 256, 00:29:42.129 "data_size": 7936 00:29:42.129 } 00:29:42.129 ] 00:29:42.129 }' 00:29:42.129 06:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.129 06:45:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:42.696 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:42.954 [2024-07-25 06:45:56.304726] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:42.954 [2024-07-25 06:45:56.304748] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:42.954 [2024-07-25 06:45:56.304796] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:42.954 [2024-07-25 06:45:56.304837] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:29:42.954 [2024-07-25 06:45:56.304848] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116b520 name raid_bdev1, state offline 00:29:42.954 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.954 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@534 -- # i=1 00:29:43.213 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:43.471 [2024-07-25 06:45:56.882223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:43.471 [2024-07-25 06:45:56.882263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:43.471 [2024-07-25 06:45:56.882278] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116b100 00:29:43.471 [2024-07-25 06:45:56.882294] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:43.471 [2024-07-25 06:45:56.883612] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:43.471 [2024-07-25 06:45:56.883636] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:43.472 [2024-07-25 06:45:56.883677] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:43.472 [2024-07-25 06:45:56.883698] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:43.472 [2024-07-25 06:45:56.883764] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x116bc60 00:29:43.472 [2024-07-25 06:45:56.883773] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:43.472 [2024-07-25 06:45:56.883825] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1169cb0 00:29:43.472 [2024-07-25 06:45:56.883913] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116bc60 00:29:43.472 [2024-07-25 06:45:56.883922] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116bc60 
00:29:43.472 [2024-07-25 06:45:56.883982] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:43.472 pt2 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.472 06:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:43.731 06:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:43.731 "name": "raid_bdev1", 00:29:43.731 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:43.731 "strip_size_kb": 0, 00:29:43.731 "state": "online", 00:29:43.731 "raid_level": "raid1", 00:29:43.731 "superblock": true, 00:29:43.731 "num_base_bdevs": 2, 00:29:43.731 "num_base_bdevs_discovered": 1, 00:29:43.731 "num_base_bdevs_operational": 1, 00:29:43.731 "base_bdevs_list": [ 00:29:43.731 { 00:29:43.731 "name": null, 00:29:43.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:43.731 "is_configured": false, 00:29:43.731 "data_offset": 256, 00:29:43.731 "data_size": 7936 00:29:43.731 }, 00:29:43.731 { 00:29:43.731 "name": "pt2", 00:29:43.731 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:43.731 "is_configured": true, 00:29:43.731 "data_offset": 256, 00:29:43.731 "data_size": 7936 00:29:43.731 } 00:29:43.731 ] 00:29:43.731 }' 00:29:43.731 06:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:43.731 06:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:44.299 06:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:44.557 [2024-07-25 06:45:57.872810] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:44.557 [2024-07-25 06:45:57.872834] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:44.557 [2024-07-25 06:45:57.872884] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:44.557 [2024-07-25 06:45:57.872930] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:44.557 [2024-07-25 06:45:57.872941] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116bc60 name raid_bdev1, state offline 00:29:44.557 06:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.557 06:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:29:44.557 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:29:44.557 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:29:44.557 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:29:44.557 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:44.816 [2024-07-25 06:45:58.317962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:44.816 [2024-07-25 06:45:58.318007] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:44.816 [2024-07-25 06:45:58.318024] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116c270 00:29:44.816 [2024-07-25 06:45:58.318036] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:44.816 [2024-07-25 06:45:58.319367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:44.816 [2024-07-25 06:45:58.319392] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:44.816 [2024-07-25 06:45:58.319434] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:44.816 [2024-07-25 06:45:58.319456] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:44.816 [2024-07-25 06:45:58.319538] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:44.816 [2024-07-25 06:45:58.319549] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:44.816 [2024-07-25 06:45:58.319564] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116d220 name raid_bdev1, state configuring 00:29:44.816 [2024-07-25 06:45:58.319584] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:44.816 [2024-07-25 06:45:58.319630] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x116c870 00:29:44.816 [2024-07-25 06:45:58.319639] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:44.816 [2024-07-25 06:45:58.319693] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1169cb0 00:29:44.816 [2024-07-25 06:45:58.319781] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116c870 00:29:44.816 [2024-07-25 06:45:58.319790] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116c870 00:29:44.816 [2024-07-25 06:45:58.319855] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:44.816 pt1 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # 
'[' 2 -gt 2 ']' 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.816 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.076 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.076 "name": "raid_bdev1", 00:29:45.076 "uuid": "e2022ad0-e3c1-4b15-8507-05b0cc5316e0", 00:29:45.076 "strip_size_kb": 0, 00:29:45.076 "state": "online", 00:29:45.076 "raid_level": "raid1", 00:29:45.076 "superblock": true, 00:29:45.076 "num_base_bdevs": 2, 00:29:45.076 "num_base_bdevs_discovered": 1, 00:29:45.076 "num_base_bdevs_operational": 1, 00:29:45.076 "base_bdevs_list": [ 00:29:45.076 { 00:29:45.076 "name": null, 00:29:45.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.076 "is_configured": false, 00:29:45.076 "data_offset": 256, 00:29:45.076 "data_size": 7936 00:29:45.076 }, 00:29:45.076 { 00:29:45.076 "name": "pt2", 00:29:45.076 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:45.076 "is_configured": true, 00:29:45.076 "data_offset": 256, 00:29:45.076 "data_size": 7936 00:29:45.076 } 00:29:45.076 ] 00:29:45.076 }' 00:29:45.076 06:45:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.076 06:45:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:45.643 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:45.643 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:45.902 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:29:45.902 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:45.902 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 
-- # jq -r '.[] | .uuid' 00:29:46.161 [2024-07-25 06:45:59.561588] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # '[' e2022ad0-e3c1-4b15-8507-05b0cc5316e0 '!=' e2022ad0-e3c1-4b15-8507-05b0cc5316e0 ']' 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@578 -- # killprocess 1272085 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1272085 ']' 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # kill -0 1272085 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # uname 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1272085 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1272085' 00:29:46.161 killing process with pid 1272085 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@969 -- # kill 1272085 00:29:46.161 [2024-07-25 06:45:59.641144] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:46.161 [2024-07-25 06:45:59.641194] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:46.161 [2024-07-25 06:45:59.641234] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:46.161 [2024-07-25 06:45:59.641249] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116c870 name raid_bdev1, state offline 00:29:46.161 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@974 -- # wait 1272085 00:29:46.161 [2024-07-25 06:45:59.660033] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:46.420 06:45:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@580 -- # return 0 00:29:46.420 00:29:46.420 real 0m14.510s 00:29:46.420 user 0m26.191s 00:29:46.420 sys 0m2.777s 00:29:46.420 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:46.420 06:45:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:46.420 ************************************ 00:29:46.420 END TEST raid_superblock_test_md_separate 00:29:46.420 ************************************ 00:29:46.420 06:45:59 bdev_raid -- bdev/bdev_raid.sh@987 -- # '[' true = true ']' 00:29:46.420 06:45:59 bdev_raid -- bdev/bdev_raid.sh@988 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:29:46.420 06:45:59 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:29:46.420 06:45:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:46.420 06:45:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:46.420 ************************************ 00:29:46.420 START TEST raid_rebuild_test_sb_md_separate 00:29:46.420 
************************************ 00:29:46.420 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:29:46.420 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # local verify=true 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # local strip_size 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # local create_arg 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@594 -- # local data_offset 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # raid_pid=1274781 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # waitforlisten 1274781 /var/tmp/spdk-raid.sock 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@831 -- # '[' -z 1274781 ']' 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:46.421 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:46.421 06:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:46.421 [2024-07-25 06:45:59.973329] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:29:46.421 [2024-07-25 06:45:59.973386] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1274781 ] 00:29:46.421 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:46.421 Zero copy mechanism will not be used. 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.0 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.1 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.2 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.3 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.4 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.5 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.6 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:01.7 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.0 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.1 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.2 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.3 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.4 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.5 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.6 cannot be used 00:29:46.680 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3d:02.7 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.0 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.1 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.2 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.3 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.4 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.5 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.6 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:01.7 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.0 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.1 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.2 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.3 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.4 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.5 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.6 cannot be used 00:29:46.680 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:29:46.680 EAL: Requested device 0000:3f:02.7 cannot be used 00:29:46.680 [2024-07-25 06:46:00.109105] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:46.681 [2024-07-25 06:46:00.152819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:46.681 [2024-07-25 06:46:00.207007] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:46.681 [2024-07-25 06:46:00.207039] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:47.616 06:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:47.616 06:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:29:47.616 06:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:29:47.616 06:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:29:47.616 BaseBdev1_malloc 00:29:47.616 06:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:47.875 [2024-07-25 06:46:01.252756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:47.875 [2024-07-25 06:46:01.252797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:47.875 [2024-07-25 06:46:01.252818] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1611eb0 00:29:47.875 [2024-07-25 06:46:01.252830] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:47.875 [2024-07-25 06:46:01.254053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:47.875 [2024-07-25 06:46:01.254078] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:47.875 BaseBdev1 00:29:47.875 06:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:29:47.875 06:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:29:48.133 BaseBdev2_malloc 00:29:48.134 06:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:48.392 [2024-07-25 06:46:01.699211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:48.392 [2024-07-25 06:46:01.699250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:48.392 [2024-07-25 06:46:01.699272] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1612a30 00:29:48.392 [2024-07-25 06:46:01.699284] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:48.392 [2024-07-25 06:46:01.700563] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:48.392 [2024-07-25 06:46:01.700588] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:48.392 BaseBdev2 00:29:48.392 06:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:29:48.392 spare_malloc 00:29:48.392 06:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:48.651 spare_delay 00:29:48.651 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:48.910 [2024-07-25 06:46:02.361796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:48.910 [2024-07-25 06:46:02.361836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:48.910 [2024-07-25 06:46:02.361858] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1756390 00:29:48.910 [2024-07-25 06:46:02.361870] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:29:48.910 [2024-07-25 06:46:02.363101] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:48.910 [2024-07-25 06:46:02.363127] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:48.910 spare 00:29:48.910 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:49.169 [2024-07-25 06:46:02.590424] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:49.169 [2024-07-25 06:46:02.591555] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:49.169 [2024-07-25 06:46:02.591704] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1758600 00:29:49.169 [2024-07-25 06:46:02.591717] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:49.169 [2024-07-25 06:46:02.591782] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1610c10 00:29:49.169 [2024-07-25 06:46:02.591880] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1758600 00:29:49.169 [2024-07-25 06:46:02.591889] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1758600 00:29:49.169 [2024-07-25 06:46:02.591950] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.169 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:49.428 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:49.428 "name": "raid_bdev1", 00:29:49.428 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:49.428 "strip_size_kb": 0, 00:29:49.428 "state": "online", 00:29:49.428 "raid_level": "raid1", 00:29:49.428 "superblock": true, 00:29:49.428 "num_base_bdevs": 2, 00:29:49.428 "num_base_bdevs_discovered": 2, 00:29:49.428 
"num_base_bdevs_operational": 2, 00:29:49.428 "base_bdevs_list": [ 00:29:49.428 { 00:29:49.428 "name": "BaseBdev1", 00:29:49.428 "uuid": "56906bb1-c3d2-5048-b7fe-fb066e065d63", 00:29:49.428 "is_configured": true, 00:29:49.428 "data_offset": 256, 00:29:49.428 "data_size": 7936 00:29:49.428 }, 00:29:49.428 { 00:29:49.428 "name": "BaseBdev2", 00:29:49.428 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:49.428 "is_configured": true, 00:29:49.428 "data_offset": 256, 00:29:49.428 "data_size": 7936 00:29:49.428 } 00:29:49.428 ] 00:29:49.428 }' 00:29:49.428 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:49.428 06:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:50.011 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:50.011 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:29:50.278 [2024-07-25 06:46:03.601299] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:50.278 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:29:50.278 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.278 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:50.583 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:50.584 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:50.584 06:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:29:50.584 [2024-07-25 06:46:04.062313] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16118f0 
00:29:50.584 /dev/nbd0 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:50.584 1+0 records in 00:29:50.584 1+0 records out 00:29:50.584 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243354 s, 16.8 MB/s 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:29:50.584 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:29:51.519 7936+0 records in 00:29:51.519 7936+0 records out 00:29:51.519 32505856 bytes (33 MB, 31 MiB) copied, 0.713539 s, 45.6 MB/s 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@51 -- # local i 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:51.519 06:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:51.779 [2024-07-25 06:46:05.089099] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:51.779 [2024-07-25 06:46:05.293655] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.779 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:52.037 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.037 "name": "raid_bdev1", 00:29:52.037 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:52.037 "strip_size_kb": 0, 00:29:52.037 "state": "online", 
00:29:52.037 "raid_level": "raid1", 00:29:52.037 "superblock": true, 00:29:52.037 "num_base_bdevs": 2, 00:29:52.037 "num_base_bdevs_discovered": 1, 00:29:52.037 "num_base_bdevs_operational": 1, 00:29:52.037 "base_bdevs_list": [ 00:29:52.037 { 00:29:52.037 "name": null, 00:29:52.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:52.037 "is_configured": false, 00:29:52.037 "data_offset": 256, 00:29:52.037 "data_size": 7936 00:29:52.037 }, 00:29:52.037 { 00:29:52.037 "name": "BaseBdev2", 00:29:52.037 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:52.037 "is_configured": true, 00:29:52.037 "data_offset": 256, 00:29:52.037 "data_size": 7936 00:29:52.037 } 00:29:52.037 ] 00:29:52.037 }' 00:29:52.038 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.038 06:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:52.605 06:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:52.864 [2024-07-25 06:46:06.316348] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:52.864 [2024-07-25 06:46:06.318477] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1610bd0 00:29:52.864 [2024-07-25 06:46:06.320558] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:52.864 06:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.800 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:54.058 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:54.058 "name": "raid_bdev1", 00:29:54.058 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:54.058 "strip_size_kb": 0, 00:29:54.058 "state": "online", 00:29:54.058 "raid_level": "raid1", 00:29:54.058 "superblock": true, 00:29:54.058 "num_base_bdevs": 2, 00:29:54.058 "num_base_bdevs_discovered": 2, 00:29:54.058 "num_base_bdevs_operational": 2, 00:29:54.058 "process": { 00:29:54.058 "type": "rebuild", 00:29:54.058 "target": "spare", 00:29:54.059 "progress": { 00:29:54.059 "blocks": 3072, 00:29:54.059 "percent": 38 00:29:54.059 } 00:29:54.059 }, 00:29:54.059 "base_bdevs_list": [ 00:29:54.059 { 00:29:54.059 "name": "spare", 00:29:54.059 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:29:54.059 "is_configured": true, 00:29:54.059 "data_offset": 256, 00:29:54.059 "data_size": 7936 00:29:54.059 }, 00:29:54.059 { 
00:29:54.059 "name": "BaseBdev2", 00:29:54.059 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:54.059 "is_configured": true, 00:29:54.059 "data_offset": 256, 00:29:54.059 "data_size": 7936 00:29:54.059 } 00:29:54.059 ] 00:29:54.059 }' 00:29:54.059 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:54.317 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:54.317 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:54.317 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:54.317 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:54.317 [2024-07-25 06:46:07.869188] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:54.576 [2024-07-25 06:46:07.932364] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:54.576 [2024-07-25 06:46:07.932404] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:54.576 [2024-07-25 06:46:07.932418] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:54.576 [2024-07-25 06:46:07.932426] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.576 06:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:54.835 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:54.835 "name": "raid_bdev1", 00:29:54.835 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:54.835 "strip_size_kb": 0, 00:29:54.835 "state": "online", 00:29:54.835 "raid_level": "raid1", 00:29:54.835 "superblock": true, 00:29:54.835 "num_base_bdevs": 2, 00:29:54.835 
"num_base_bdevs_discovered": 1, 00:29:54.835 "num_base_bdevs_operational": 1, 00:29:54.835 "base_bdevs_list": [ 00:29:54.835 { 00:29:54.835 "name": null, 00:29:54.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:54.835 "is_configured": false, 00:29:54.835 "data_offset": 256, 00:29:54.835 "data_size": 7936 00:29:54.835 }, 00:29:54.835 { 00:29:54.835 "name": "BaseBdev2", 00:29:54.835 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:54.835 "is_configured": true, 00:29:54.835 "data_offset": 256, 00:29:54.835 "data_size": 7936 00:29:54.835 } 00:29:54.835 ] 00:29:54.835 }' 00:29:54.835 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:54.835 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:55.403 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:55.663 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:55.663 "name": "raid_bdev1", 00:29:55.663 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:55.663 "strip_size_kb": 0, 00:29:55.663 "state": "online", 00:29:55.663 "raid_level": "raid1", 00:29:55.663 "superblock": true, 00:29:55.663 "num_base_bdevs": 2, 00:29:55.663 "num_base_bdevs_discovered": 1, 00:29:55.663 "num_base_bdevs_operational": 1, 00:29:55.663 "base_bdevs_list": [ 00:29:55.663 { 00:29:55.663 "name": null, 00:29:55.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:55.663 "is_configured": false, 00:29:55.663 "data_offset": 256, 00:29:55.663 "data_size": 7936 00:29:55.663 }, 00:29:55.663 { 00:29:55.663 "name": "BaseBdev2", 00:29:55.663 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:55.663 "is_configured": true, 00:29:55.663 "data_offset": 256, 00:29:55.663 "data_size": 7936 00:29:55.663 } 00:29:55.663 ] 00:29:55.663 }' 00:29:55.663 06:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:55.663 06:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:55.663 06:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:55.663 06:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:55.663 06:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:55.922 [2024-07-25 06:46:09.282773] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:55.922 [2024-07-25 06:46:09.284889] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16118f0 00:29:55.922 [2024-07-25 06:46:09.286230] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:55.922 06:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@678 -- # sleep 1 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:56.859 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:57.119 "name": "raid_bdev1", 00:29:57.119 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:57.119 "strip_size_kb": 0, 00:29:57.119 "state": "online", 00:29:57.119 "raid_level": "raid1", 00:29:57.119 "superblock": true, 00:29:57.119 "num_base_bdevs": 2, 00:29:57.119 "num_base_bdevs_discovered": 2, 00:29:57.119 "num_base_bdevs_operational": 2, 00:29:57.119 "process": { 00:29:57.119 "type": "rebuild", 00:29:57.119 "target": "spare", 00:29:57.119 "progress": { 00:29:57.119 "blocks": 3072, 00:29:57.119 "percent": 38 00:29:57.119 } 00:29:57.119 }, 00:29:57.119 "base_bdevs_list": [ 00:29:57.119 { 00:29:57.119 "name": "spare", 00:29:57.119 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:29:57.119 "is_configured": true, 00:29:57.119 "data_offset": 256, 00:29:57.119 "data_size": 7936 00:29:57.119 }, 00:29:57.119 { 00:29:57.119 "name": "BaseBdev2", 00:29:57.119 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:57.119 "is_configured": true, 00:29:57.119 "data_offset": 256, 00:29:57.119 "data_size": 7936 00:29:57.119 } 00:29:57.119 ] 00:29:57.119 }' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:29:57.119 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # local 
num_base_bdevs_operational=2 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # local timeout=1019 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:57.119 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:57.378 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:57.378 "name": "raid_bdev1", 00:29:57.378 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:57.378 "strip_size_kb": 0, 00:29:57.378 "state": "online", 00:29:57.378 "raid_level": "raid1", 00:29:57.378 "superblock": true, 00:29:57.378 "num_base_bdevs": 2, 00:29:57.378 "num_base_bdevs_discovered": 2, 00:29:57.378 "num_base_bdevs_operational": 2, 00:29:57.378 "process": { 00:29:57.378 "type": "rebuild", 00:29:57.378 "target": "spare", 00:29:57.378 "progress": { 00:29:57.378 "blocks": 3840, 00:29:57.378 "percent": 48 00:29:57.378 } 00:29:57.378 }, 00:29:57.378 "base_bdevs_list": [ 00:29:57.378 { 00:29:57.378 "name": "spare", 00:29:57.378 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:29:57.378 "is_configured": true, 00:29:57.378 "data_offset": 256, 00:29:57.378 "data_size": 7936 00:29:57.378 }, 00:29:57.378 { 00:29:57.378 "name": "BaseBdev2", 00:29:57.378 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:57.378 "is_configured": true, 00:29:57.378 "data_offset": 256, 00:29:57.378 "data_size": 7936 00:29:57.378 } 00:29:57.378 ] 00:29:57.378 }' 00:29:57.378 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:57.378 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:57.378 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:57.637 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:57.637 06:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:58.574 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:58.574 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:58.575 06:46:11 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:58.575 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:58.575 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:58.575 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:58.575 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:58.575 06:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:58.834 06:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:58.834 "name": "raid_bdev1", 00:29:58.834 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:29:58.834 "strip_size_kb": 0, 00:29:58.834 "state": "online", 00:29:58.834 "raid_level": "raid1", 00:29:58.834 "superblock": true, 00:29:58.834 "num_base_bdevs": 2, 00:29:58.834 "num_base_bdevs_discovered": 2, 00:29:58.834 "num_base_bdevs_operational": 2, 00:29:58.834 "process": { 00:29:58.834 "type": "rebuild", 00:29:58.834 "target": "spare", 00:29:58.834 "progress": { 00:29:58.834 "blocks": 7168, 00:29:58.834 "percent": 90 00:29:58.834 } 00:29:58.834 }, 00:29:58.834 "base_bdevs_list": [ 00:29:58.834 { 00:29:58.834 "name": "spare", 00:29:58.834 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:29:58.834 "is_configured": true, 00:29:58.834 "data_offset": 256, 00:29:58.834 "data_size": 7936 00:29:58.834 }, 00:29:58.834 { 00:29:58.834 "name": "BaseBdev2", 00:29:58.834 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:29:58.834 "is_configured": true, 00:29:58.834 "data_offset": 256, 00:29:58.834 "data_size": 7936 00:29:58.834 } 00:29:58.834 ] 00:29:58.834 }' 00:29:58.834 06:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:58.834 06:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:58.834 06:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:58.834 06:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:58.834 06:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:59.093 [2024-07-25 06:46:12.409033] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:59.093 [2024-07-25 06:46:12.409089] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:59.093 [2024-07-25 06:46:12.409169] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:00.033 06:46:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:00.033 "name": "raid_bdev1", 00:30:00.033 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:00.033 "strip_size_kb": 0, 00:30:00.033 "state": "online", 00:30:00.033 "raid_level": "raid1", 00:30:00.033 "superblock": true, 00:30:00.033 "num_base_bdevs": 2, 00:30:00.033 "num_base_bdevs_discovered": 2, 00:30:00.033 "num_base_bdevs_operational": 2, 00:30:00.033 "base_bdevs_list": [ 00:30:00.033 { 00:30:00.033 "name": "spare", 00:30:00.033 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:00.033 "is_configured": true, 00:30:00.033 "data_offset": 256, 00:30:00.033 "data_size": 7936 00:30:00.033 }, 00:30:00.033 { 00:30:00.033 "name": "BaseBdev2", 00:30:00.033 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:00.033 "is_configured": true, 00:30:00.033 "data_offset": 256, 00:30:00.033 "data_size": 7936 00:30:00.033 } 00:30:00.033 ] 00:30:00.033 }' 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:30:00.033 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@724 -- # break 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:00.292 "name": "raid_bdev1", 00:30:00.292 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:00.292 "strip_size_kb": 0, 00:30:00.292 "state": "online", 00:30:00.292 "raid_level": "raid1", 00:30:00.292 "superblock": true, 00:30:00.292 "num_base_bdevs": 2, 00:30:00.292 "num_base_bdevs_discovered": 2, 00:30:00.292 
"num_base_bdevs_operational": 2, 00:30:00.292 "base_bdevs_list": [ 00:30:00.292 { 00:30:00.292 "name": "spare", 00:30:00.292 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:00.292 "is_configured": true, 00:30:00.292 "data_offset": 256, 00:30:00.292 "data_size": 7936 00:30:00.292 }, 00:30:00.292 { 00:30:00.292 "name": "BaseBdev2", 00:30:00.292 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:00.292 "is_configured": true, 00:30:00.292 "data_offset": 256, 00:30:00.292 "data_size": 7936 00:30:00.292 } 00:30:00.292 ] 00:30:00.292 }' 00:30:00.292 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.551 06:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:00.810 06:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:00.810 "name": "raid_bdev1", 00:30:00.810 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:00.810 "strip_size_kb": 0, 00:30:00.810 "state": "online", 00:30:00.810 "raid_level": "raid1", 00:30:00.810 "superblock": true, 00:30:00.810 "num_base_bdevs": 2, 00:30:00.810 "num_base_bdevs_discovered": 2, 00:30:00.810 "num_base_bdevs_operational": 2, 00:30:00.810 "base_bdevs_list": [ 00:30:00.810 { 00:30:00.810 "name": "spare", 00:30:00.810 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:00.810 "is_configured": true, 00:30:00.810 "data_offset": 256, 00:30:00.810 "data_size": 7936 00:30:00.810 }, 00:30:00.810 { 00:30:00.810 "name": "BaseBdev2", 00:30:00.810 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:00.810 "is_configured": true, 00:30:00.810 "data_offset": 256, 00:30:00.810 "data_size": 7936 00:30:00.810 
} 00:30:00.810 ] 00:30:00.810 }' 00:30:00.810 06:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:00.810 06:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:01.377 06:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:01.377 [2024-07-25 06:46:14.867723] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:01.377 [2024-07-25 06:46:14.867747] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:01.378 [2024-07-25 06:46:14.867803] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:01.378 [2024-07-25 06:46:14.867855] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:01.378 [2024-07-25 06:46:14.867866] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1758600 name raid_bdev1, state offline 00:30:01.378 06:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # jq length 00:30:01.378 06:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:01.637 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:30:01.896 /dev/nbd0 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@869 -- # local i 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:01.896 1+0 records in 00:30:01.896 1+0 records out 00:30:01.896 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224365 s, 18.3 MB/s 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:01.896 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:30:02.155 /dev/nbd1 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:02.156 1+0 records in 00:30:02.156 1+0 records out 00:30:02.156 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291319 s, 14.1 MB/s 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:02.156 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:02.415 06:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:30:02.674 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 
-- # basename /dev/nbd1 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:30:02.675 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:03.001 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:03.282 [2024-07-25 06:46:16.639375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:03.282 [2024-07-25 06:46:16.639420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:03.282 [2024-07-25 06:46:16.639440] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17565c0 00:30:03.282 [2024-07-25 06:46:16.639452] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:03.282 [2024-07-25 06:46:16.640918] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:03.282 [2024-07-25 06:46:16.640944] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:03.282 [2024-07-25 06:46:16.640996] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:03.282 [2024-07-25 06:46:16.641018] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:03.282 [2024-07-25 06:46:16.641104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:03.282 spare 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.282 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.282 [2024-07-25 06:46:16.741417] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17ac2f0 00:30:03.282 [2024-07-25 06:46:16.741432] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:30:03.282 [2024-07-25 06:46:16.741496] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1759f90 00:30:03.282 [2024-07-25 06:46:16.741604] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17ac2f0 00:30:03.282 [2024-07-25 06:46:16.741613] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17ac2f0 00:30:03.282 [2024-07-25 06:46:16.741681] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:03.542 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:03.542 "name": "raid_bdev1", 00:30:03.542 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:03.542 "strip_size_kb": 0, 00:30:03.542 "state": "online", 00:30:03.542 "raid_level": "raid1", 00:30:03.542 "superblock": true, 00:30:03.542 "num_base_bdevs": 2, 00:30:03.542 "num_base_bdevs_discovered": 2, 00:30:03.542 "num_base_bdevs_operational": 2, 00:30:03.542 "base_bdevs_list": [ 00:30:03.542 { 00:30:03.542 "name": "spare", 00:30:03.542 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:03.542 "is_configured": true, 00:30:03.542 "data_offset": 256, 00:30:03.542 "data_size": 7936 00:30:03.542 }, 00:30:03.542 { 00:30:03.542 "name": "BaseBdev2", 00:30:03.542 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:03.542 "is_configured": true, 00:30:03.542 "data_offset": 256, 00:30:03.542 "data_size": 7936 00:30:03.542 } 00:30:03.542 ] 00:30:03.542 }' 00:30:03.542 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:03.542 06:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.111 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.370 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:30:04.370 "name": "raid_bdev1", 00:30:04.370 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:04.370 "strip_size_kb": 0, 00:30:04.370 "state": "online", 00:30:04.370 "raid_level": "raid1", 00:30:04.370 "superblock": true, 00:30:04.370 "num_base_bdevs": 2, 00:30:04.370 "num_base_bdevs_discovered": 2, 00:30:04.370 "num_base_bdevs_operational": 2, 00:30:04.370 "base_bdevs_list": [ 00:30:04.370 { 00:30:04.370 "name": "spare", 00:30:04.370 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:04.370 "is_configured": true, 00:30:04.370 "data_offset": 256, 00:30:04.370 "data_size": 7936 00:30:04.370 }, 00:30:04.370 { 00:30:04.370 "name": "BaseBdev2", 00:30:04.370 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:04.370 "is_configured": true, 00:30:04.370 "data_offset": 256, 00:30:04.370 "data_size": 7936 00:30:04.370 } 00:30:04.370 ] 00:30:04.370 }' 00:30:04.370 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:04.370 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:04.370 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:04.370 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:04.370 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.371 06:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:04.630 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:30:04.630 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:04.889 [2024-07-25 06:46:18.219645] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:30:04.889 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:05.149 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:05.149 "name": "raid_bdev1", 00:30:05.149 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:05.149 "strip_size_kb": 0, 00:30:05.149 "state": "online", 00:30:05.149 "raid_level": "raid1", 00:30:05.149 "superblock": true, 00:30:05.149 "num_base_bdevs": 2, 00:30:05.149 "num_base_bdevs_discovered": 1, 00:30:05.149 "num_base_bdevs_operational": 1, 00:30:05.149 "base_bdevs_list": [ 00:30:05.149 { 00:30:05.149 "name": null, 00:30:05.149 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:05.149 "is_configured": false, 00:30:05.149 "data_offset": 256, 00:30:05.149 "data_size": 7936 00:30:05.149 }, 00:30:05.149 { 00:30:05.149 "name": "BaseBdev2", 00:30:05.149 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:05.149 "is_configured": true, 00:30:05.149 "data_offset": 256, 00:30:05.149 "data_size": 7936 00:30:05.149 } 00:30:05.149 ] 00:30:05.149 }' 00:30:05.149 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:05.149 06:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:05.718 06:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:05.718 [2024-07-25 06:46:19.250362] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:05.718 [2024-07-25 06:46:19.250496] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:05.718 [2024-07-25 06:46:19.250511] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
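The trace above reduces to a short remove/re-add cycle driven entirely through rpc.py against the test-local /var/tmp/spdk-raid.sock socket: the spare base bdev is dropped, raid_bdev1 keeps running degraded with one operational member, and re-adding the spare makes the examine path compare superblock sequence numbers (4 on spare vs 5 on the raid bdev) and schedule a rebuild. A condensed sketch of that sequence, assuming the same socket and bdev names as in this run ($rpc and $sock are shorthand introduced here, not variables from the test script):

sock=/var/tmp/spdk-raid.sock
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
# drop the spare from the array; raid_bdev1 stays online but degraded
"$rpc" -s "$sock" bdev_raid_remove_base_bdev spare
# spot-check the degraded state (expect 1 operational base bdev)
"$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_operational'
# re-add the same bdev; its older superblock seq_number triggers a rebuild onto it
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare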
00:30:05.718 [2024-07-25 06:46:19.250534] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:05.718 [2024-07-25 06:46:19.252536] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16130c0 00:30:05.718 [2024-07-25 06:46:19.254551] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:05.718 06:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # sleep 1 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:07.162 "name": "raid_bdev1", 00:30:07.162 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:07.162 "strip_size_kb": 0, 00:30:07.162 "state": "online", 00:30:07.162 "raid_level": "raid1", 00:30:07.162 "superblock": true, 00:30:07.162 "num_base_bdevs": 2, 00:30:07.162 "num_base_bdevs_discovered": 2, 00:30:07.162 "num_base_bdevs_operational": 2, 00:30:07.162 "process": { 00:30:07.162 "type": "rebuild", 00:30:07.162 "target": "spare", 00:30:07.162 "progress": { 00:30:07.162 "blocks": 3072, 00:30:07.162 "percent": 38 00:30:07.162 } 00:30:07.162 }, 00:30:07.162 "base_bdevs_list": [ 00:30:07.162 { 00:30:07.162 "name": "spare", 00:30:07.162 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:07.162 "is_configured": true, 00:30:07.162 "data_offset": 256, 00:30:07.162 "data_size": 7936 00:30:07.162 }, 00:30:07.162 { 00:30:07.162 "name": "BaseBdev2", 00:30:07.162 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:07.162 "is_configured": true, 00:30:07.162 "data_offset": 256, 00:30:07.162 "data_size": 7936 00:30:07.162 } 00:30:07.162 ] 00:30:07.162 }' 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:07.162 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:07.421 [2024-07-25 06:46:20.808119] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:07.421 [2024-07-25 06:46:20.866366] bdev_raid.c:2541:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:07.421 [2024-07-25 06:46:20.866414] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:07.421 [2024-07-25 06:46:20.866428] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:07.421 [2024-07-25 06:46:20.866435] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:07.421 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:07.421 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:07.422 06:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:07.681 06:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:07.681 "name": "raid_bdev1", 00:30:07.681 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:07.681 "strip_size_kb": 0, 00:30:07.681 "state": "online", 00:30:07.681 "raid_level": "raid1", 00:30:07.681 "superblock": true, 00:30:07.681 "num_base_bdevs": 2, 00:30:07.681 "num_base_bdevs_discovered": 1, 00:30:07.681 "num_base_bdevs_operational": 1, 00:30:07.681 "base_bdevs_list": [ 00:30:07.681 { 00:30:07.681 "name": null, 00:30:07.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:07.681 "is_configured": false, 00:30:07.681 "data_offset": 256, 00:30:07.681 "data_size": 7936 00:30:07.681 }, 00:30:07.681 { 00:30:07.681 "name": "BaseBdev2", 00:30:07.681 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:07.681 "is_configured": true, 00:30:07.681 "data_offset": 256, 00:30:07.681 "data_size": 7936 00:30:07.681 } 00:30:07.681 ] 00:30:07.681 }' 00:30:07.681 06:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:07.681 06:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:08.248 06:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:08.507 [2024-07-25 06:46:21.859975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
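Each verify_raid_bdev_process pass in this trace follows the same polling pattern: dump all raid bdevs over the RPC socket, isolate raid_bdev1 with jq, and compare the optional process fields, which fall back to "none" once the rebuild record disappears. A minimal sketch of that check, reusing the $rpc/$sock shorthand from the sketch above (an assumption, not part of the original script):

info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
# while the rebuild is running both fields report the active process and its target
[[ $(jq -r '.process.type // "none"' <<< "$info") == rebuild ]]
[[ $(jq -r '.process.target // "none"' <<< "$info") == spare ]]
# once the rebuild finishes, both expressions return "none" and the wait loop breaks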
00:30:08.507 [2024-07-25 06:46:21.860023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:08.507 [2024-07-25 06:46:21.860045] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17ac670 00:30:08.507 [2024-07-25 06:46:21.860057] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:08.507 [2024-07-25 06:46:21.860259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:08.507 [2024-07-25 06:46:21.860274] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:08.507 [2024-07-25 06:46:21.860327] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:08.507 [2024-07-25 06:46:21.860337] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:08.507 [2024-07-25 06:46:21.860346] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:08.507 [2024-07-25 06:46:21.860363] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:08.507 [2024-07-25 06:46:21.862410] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16130c0 00:30:08.507 [2024-07-25 06:46:21.863756] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:08.508 spare 00:30:08.508 06:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # sleep 1 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.445 06:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.704 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:09.704 "name": "raid_bdev1", 00:30:09.704 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:09.704 "strip_size_kb": 0, 00:30:09.704 "state": "online", 00:30:09.704 "raid_level": "raid1", 00:30:09.704 "superblock": true, 00:30:09.704 "num_base_bdevs": 2, 00:30:09.704 "num_base_bdevs_discovered": 2, 00:30:09.704 "num_base_bdevs_operational": 2, 00:30:09.704 "process": { 00:30:09.704 "type": "rebuild", 00:30:09.704 "target": "spare", 00:30:09.704 "progress": { 00:30:09.704 "blocks": 3072, 00:30:09.704 "percent": 38 00:30:09.704 } 00:30:09.704 }, 00:30:09.704 "base_bdevs_list": [ 00:30:09.704 { 00:30:09.704 "name": "spare", 00:30:09.704 "uuid": "0d7efba9-53a4-5e39-9eed-8e1101d6873d", 00:30:09.704 "is_configured": true, 00:30:09.704 "data_offset": 256, 00:30:09.704 "data_size": 7936 00:30:09.704 }, 00:30:09.704 { 00:30:09.704 "name": "BaseBdev2", 00:30:09.704 "uuid": 
"64e59770-df61-5d21-883f-7af67483f9a7", 00:30:09.704 "is_configured": true, 00:30:09.704 "data_offset": 256, 00:30:09.704 "data_size": 7936 00:30:09.704 } 00:30:09.704 ] 00:30:09.704 }' 00:30:09.704 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:09.704 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:09.704 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:09.704 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:09.704 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:09.964 [2024-07-25 06:46:23.425320] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:09.964 [2024-07-25 06:46:23.475569] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:09.964 [2024-07-25 06:46:23.475608] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:09.964 [2024-07-25 06:46:23.475622] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:09.964 [2024-07-25 06:46:23.475630] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.964 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:10.223 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:10.223 "name": "raid_bdev1", 00:30:10.223 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:10.223 "strip_size_kb": 0, 00:30:10.223 "state": "online", 00:30:10.223 "raid_level": "raid1", 00:30:10.223 "superblock": true, 00:30:10.223 "num_base_bdevs": 2, 00:30:10.223 "num_base_bdevs_discovered": 1, 00:30:10.223 
"num_base_bdevs_operational": 1, 00:30:10.223 "base_bdevs_list": [ 00:30:10.223 { 00:30:10.223 "name": null, 00:30:10.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:10.223 "is_configured": false, 00:30:10.223 "data_offset": 256, 00:30:10.223 "data_size": 7936 00:30:10.223 }, 00:30:10.223 { 00:30:10.223 "name": "BaseBdev2", 00:30:10.223 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:10.223 "is_configured": true, 00:30:10.223 "data_offset": 256, 00:30:10.223 "data_size": 7936 00:30:10.223 } 00:30:10.223 ] 00:30:10.223 }' 00:30:10.223 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:10.223 06:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.790 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.048 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:11.048 "name": "raid_bdev1", 00:30:11.048 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:11.048 "strip_size_kb": 0, 00:30:11.048 "state": "online", 00:30:11.048 "raid_level": "raid1", 00:30:11.048 "superblock": true, 00:30:11.048 "num_base_bdevs": 2, 00:30:11.048 "num_base_bdevs_discovered": 1, 00:30:11.048 "num_base_bdevs_operational": 1, 00:30:11.048 "base_bdevs_list": [ 00:30:11.048 { 00:30:11.048 "name": null, 00:30:11.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:11.048 "is_configured": false, 00:30:11.048 "data_offset": 256, 00:30:11.048 "data_size": 7936 00:30:11.048 }, 00:30:11.048 { 00:30:11.048 "name": "BaseBdev2", 00:30:11.048 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:11.048 "is_configured": true, 00:30:11.048 "data_offset": 256, 00:30:11.048 "data_size": 7936 00:30:11.048 } 00:30:11.048 ] 00:30:11.048 }' 00:30:11.048 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:11.048 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:11.048 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:11.307 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:11.307 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:11.566 06:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@788 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:11.566 [2024-07-25 06:46:25.074126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:11.566 [2024-07-25 06:46:25.074176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:11.566 [2024-07-25 06:46:25.074195] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1759160 00:30:11.566 [2024-07-25 06:46:25.074206] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:11.566 [2024-07-25 06:46:25.074373] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:11.566 [2024-07-25 06:46:25.074387] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:11.566 [2024-07-25 06:46:25.074426] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:11.566 [2024-07-25 06:46:25.074437] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:11.566 [2024-07-25 06:46:25.074446] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:11.566 BaseBdev1 00:30:11.566 06:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@789 -- # sleep 1 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.940 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.940 "name": "raid_bdev1", 00:30:12.940 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:12.940 "strip_size_kb": 0, 00:30:12.940 "state": "online", 00:30:12.940 "raid_level": "raid1", 00:30:12.940 "superblock": true, 00:30:12.940 "num_base_bdevs": 2, 00:30:12.940 "num_base_bdevs_discovered": 1, 00:30:12.940 "num_base_bdevs_operational": 1, 00:30:12.940 "base_bdevs_list": [ 00:30:12.940 { 
00:30:12.940 "name": null, 00:30:12.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:12.940 "is_configured": false, 00:30:12.940 "data_offset": 256, 00:30:12.940 "data_size": 7936 00:30:12.940 }, 00:30:12.940 { 00:30:12.940 "name": "BaseBdev2", 00:30:12.940 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:12.940 "is_configured": true, 00:30:12.940 "data_offset": 256, 00:30:12.940 "data_size": 7936 00:30:12.940 } 00:30:12.940 ] 00:30:12.940 }' 00:30:12.941 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.941 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.507 06:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:13.766 "name": "raid_bdev1", 00:30:13.766 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:13.766 "strip_size_kb": 0, 00:30:13.766 "state": "online", 00:30:13.766 "raid_level": "raid1", 00:30:13.766 "superblock": true, 00:30:13.766 "num_base_bdevs": 2, 00:30:13.766 "num_base_bdevs_discovered": 1, 00:30:13.766 "num_base_bdevs_operational": 1, 00:30:13.766 "base_bdevs_list": [ 00:30:13.766 { 00:30:13.766 "name": null, 00:30:13.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:13.766 "is_configured": false, 00:30:13.766 "data_offset": 256, 00:30:13.766 "data_size": 7936 00:30:13.766 }, 00:30:13.766 { 00:30:13.766 "name": "BaseBdev2", 00:30:13.766 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:13.766 "is_configured": true, 00:30:13.766 "data_offset": 256, 00:30:13.766 "data_size": 7936 00:30:13.766 } 00:30:13.766 ] 00:30:13.766 }' 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:13.766 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:14.024 [2024-07-25 06:46:27.412290] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:14.024 [2024-07-25 06:46:27.412399] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:14.024 [2024-07-25 06:46:27.412413] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:14.024 request: 00:30:14.024 { 00:30:14.024 "base_bdev": "BaseBdev1", 00:30:14.024 "raid_bdev": "raid_bdev1", 00:30:14.024 "method": "bdev_raid_add_base_bdev", 00:30:14.024 "req_id": 1 00:30:14.024 } 00:30:14.024 Got JSON-RPC error response 00:30:14.024 response: 00:30:14.024 { 00:30:14.024 "code": -22, 00:30:14.024 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:14.024 } 00:30:14.024 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # es=1 00:30:14.024 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:30:14.024 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:30:14.024 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:30:14.024 06:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@793 -- # sleep 1 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:14.960 06:46:28 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:14.960 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.218 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:15.218 "name": "raid_bdev1", 00:30:15.218 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:15.218 "strip_size_kb": 0, 00:30:15.218 "state": "online", 00:30:15.218 "raid_level": "raid1", 00:30:15.218 "superblock": true, 00:30:15.218 "num_base_bdevs": 2, 00:30:15.218 "num_base_bdevs_discovered": 1, 00:30:15.218 "num_base_bdevs_operational": 1, 00:30:15.218 "base_bdevs_list": [ 00:30:15.218 { 00:30:15.218 "name": null, 00:30:15.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:15.218 "is_configured": false, 00:30:15.218 "data_offset": 256, 00:30:15.218 "data_size": 7936 00:30:15.218 }, 00:30:15.218 { 00:30:15.218 "name": "BaseBdev2", 00:30:15.218 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:15.218 "is_configured": true, 00:30:15.218 "data_offset": 256, 00:30:15.218 "data_size": 7936 00:30:15.218 } 00:30:15.218 ] 00:30:15.218 }' 00:30:15.218 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:15.218 06:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:15.785 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:15.786 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:15.786 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:15.786 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:15.786 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:15.786 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:15.786 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:16.043 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:16.043 "name": "raid_bdev1", 00:30:16.043 "uuid": "eff45904-05bb-4572-bda7-e1d5fd4f047f", 00:30:16.043 "strip_size_kb": 0, 
00:30:16.043 "state": "online", 00:30:16.043 "raid_level": "raid1", 00:30:16.043 "superblock": true, 00:30:16.043 "num_base_bdevs": 2, 00:30:16.043 "num_base_bdevs_discovered": 1, 00:30:16.043 "num_base_bdevs_operational": 1, 00:30:16.043 "base_bdevs_list": [ 00:30:16.043 { 00:30:16.043 "name": null, 00:30:16.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:16.043 "is_configured": false, 00:30:16.043 "data_offset": 256, 00:30:16.043 "data_size": 7936 00:30:16.043 }, 00:30:16.043 { 00:30:16.043 "name": "BaseBdev2", 00:30:16.044 "uuid": "64e59770-df61-5d21-883f-7af67483f9a7", 00:30:16.044 "is_configured": true, 00:30:16.044 "data_offset": 256, 00:30:16.044 "data_size": 7936 00:30:16.044 } 00:30:16.044 ] 00:30:16.044 }' 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@798 -- # killprocess 1274781 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1274781 ']' 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1274781 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:16.044 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1274781 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1274781' 00:30:16.301 killing process with pid 1274781 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1274781 00:30:16.301 Received shutdown signal, test time was about 60.000000 seconds 00:30:16.301 00:30:16.301 Latency(us) 00:30:16.301 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:16.301 =================================================================================================================== 00:30:16.301 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:16.301 [2024-07-25 06:46:29.626077] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:16.301 [2024-07-25 06:46:29.626167] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:16.301 [2024-07-25 06:46:29.626208] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:16.301 [2024-07-25 06:46:29.626219] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17ac2f0 name raid_bdev1, state offline 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 
1274781 00:30:16.301 [2024-07-25 06:46:29.653425] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@800 -- # return 0 00:30:16.301 00:30:16.301 real 0m29.918s 00:30:16.301 user 0m46.171s 00:30:16.301 sys 0m4.919s 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:16.301 06:46:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:16.301 ************************************ 00:30:16.301 END TEST raid_rebuild_test_sb_md_separate 00:30:16.301 ************************************ 00:30:16.559 06:46:29 bdev_raid -- bdev/bdev_raid.sh@991 -- # base_malloc_params='-m 32 -i' 00:30:16.559 06:46:29 bdev_raid -- bdev/bdev_raid.sh@992 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:30:16.559 06:46:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:30:16.559 06:46:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:16.559 06:46:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:16.559 ************************************ 00:30:16.559 START TEST raid_state_function_test_sb_md_interleaved 00:30:16.559 ************************************ 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:30:16.559 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:30:16.560 06:46:29 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1280202 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1280202' 00:30:16.560 Process raid pid: 1280202 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1280202 /var/tmp/spdk-raid.sock 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1280202 ']' 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:16.560 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:16.560 06:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:16.560 [2024-07-25 06:46:29.978119] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
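The trace above (bdev_raid.sh@243-@246) starts the bdev_svc JSON-RPC target on /var/tmp/spdk-raid.sock and then blocks in waitforlisten until the socket answers. A minimal stand-alone sketch of that setup, in bash as the tests themselves use; the polling loop is an assumption standing in for the waitforlisten helper, whose internals are not shown in this log:

    # sketch only: launch the RPC target used by the raid tests and wait for it
    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path as it appears in the log
    sock=/var/tmp/spdk-raid.sock
    "$rootdir"/test/app/bdev_svc/bdev_svc -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!
    # stand-in for waitforlisten: poll until any RPC succeeds on the socket
    until "$rootdir"/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done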
00:30:16.560 [2024-07-25 06:46:29.978178] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.0 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.1 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.2 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.3 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.4 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.5 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.6 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:01.7 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.0 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.1 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.2 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.3 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.4 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.5 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.6 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3d:02.7 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.0 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.1 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.2 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.3 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.4 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.5 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.6 cannot be used 00:30:16.560 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:01.7 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.0 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.1 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.2 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.3 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.4 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.5 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.6 cannot be used 00:30:16.560 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:16.560 EAL: Requested device 0000:3f:02.7 cannot be used 00:30:16.819 [2024-07-25 06:46:30.116702] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:16.819 [2024-07-25 06:46:30.161452] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:16.819 [2024-07-25 06:46:30.219361] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:16.819 [2024-07-25 06:46:30.219398] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:17.386 06:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:17.386 06:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:30:17.386 06:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:17.644 [2024-07-25 06:46:31.087945] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:17.644 [2024-07-25 06:46:31.087981] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:17.644 [2024-07-25 06:46:31.087990] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:17.644 [2024-07-25 06:46:31.088001] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:30:17.644 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:17.645 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:17.645 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:17.645 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:17.645 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:17.645 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:17.904 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:17.904 "name": "Existed_Raid", 00:30:17.904 "uuid": "0811766c-dc33-48a3-93d5-1ee7047954ce", 00:30:17.904 "strip_size_kb": 0, 00:30:17.904 "state": "configuring", 00:30:17.904 "raid_level": "raid1", 00:30:17.904 "superblock": true, 00:30:17.904 "num_base_bdevs": 2, 00:30:17.904 "num_base_bdevs_discovered": 0, 00:30:17.904 "num_base_bdevs_operational": 2, 00:30:17.904 "base_bdevs_list": [ 00:30:17.904 { 00:30:17.904 "name": "BaseBdev1", 00:30:17.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:17.904 "is_configured": false, 00:30:17.904 "data_offset": 0, 00:30:17.904 "data_size": 0 00:30:17.904 }, 00:30:17.904 { 00:30:17.904 "name": "BaseBdev2", 00:30:17.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:17.904 "is_configured": false, 00:30:17.904 "data_offset": 0, 00:30:17.904 "data_size": 0 00:30:17.904 } 00:30:17.904 ] 00:30:17.904 }' 00:30:17.904 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:17.904 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:18.471 06:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:18.730 [2024-07-25 06:46:32.122533] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:18.730 [2024-07-25 06:46:32.122558] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c4470 name Existed_Raid, state configuring 00:30:18.730 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:18.989 [2024-07-25 06:46:32.351137] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:18.989 [2024-07-25 06:46:32.351166] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:18.989 [2024-07-25 06:46:32.351175] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:18.989 [2024-07-25 06:46:32.351185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:18.989 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:30:19.305 [2024-07-25 06:46:32.585198] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:19.305 BaseBdev1 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:19.305 06:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:19.564 [ 00:30:19.564 { 00:30:19.564 "name": "BaseBdev1", 00:30:19.564 "aliases": [ 00:30:19.564 "e47306ad-58de-463c-88bf-0e24cd760af9" 00:30:19.564 ], 00:30:19.564 "product_name": "Malloc disk", 00:30:19.564 "block_size": 4128, 00:30:19.564 "num_blocks": 8192, 00:30:19.564 "uuid": "e47306ad-58de-463c-88bf-0e24cd760af9", 00:30:19.564 "md_size": 32, 00:30:19.564 "md_interleave": true, 00:30:19.564 "dif_type": 0, 00:30:19.564 "assigned_rate_limits": { 00:30:19.564 "rw_ios_per_sec": 0, 00:30:19.564 "rw_mbytes_per_sec": 0, 00:30:19.564 "r_mbytes_per_sec": 0, 00:30:19.564 "w_mbytes_per_sec": 0 00:30:19.564 }, 00:30:19.564 "claimed": true, 00:30:19.564 "claim_type": "exclusive_write", 00:30:19.564 "zoned": false, 00:30:19.564 "supported_io_types": { 00:30:19.564 "read": true, 00:30:19.564 "write": true, 00:30:19.564 "unmap": true, 00:30:19.564 "flush": true, 00:30:19.564 "reset": true, 00:30:19.564 "nvme_admin": false, 00:30:19.564 "nvme_io": false, 00:30:19.564 "nvme_io_md": false, 00:30:19.564 "write_zeroes": true, 00:30:19.564 "zcopy": true, 00:30:19.564 "get_zone_info": false, 00:30:19.564 "zone_management": false, 00:30:19.564 "zone_append": false, 00:30:19.564 "compare": false, 00:30:19.564 "compare_and_write": false, 00:30:19.564 "abort": true, 00:30:19.564 "seek_hole": false, 00:30:19.564 "seek_data": false, 00:30:19.564 "copy": true, 00:30:19.564 "nvme_iov_md": false 00:30:19.564 }, 00:30:19.564 "memory_domains": [ 00:30:19.564 { 00:30:19.564 "dma_device_id": "system", 00:30:19.564 "dma_device_type": 1 00:30:19.564 }, 00:30:19.564 { 00:30:19.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:19.564 "dma_device_type": 2 00:30:19.564 } 00:30:19.564 ], 00:30:19.564 "driver_specific": {} 00:30:19.564 } 00:30:19.564 ] 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring 
raid1 0 2 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:19.564 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:19.822 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:19.822 "name": "Existed_Raid", 00:30:19.822 "uuid": "236f9867-013c-4e7f-9050-5fb0a3de6988", 00:30:19.822 "strip_size_kb": 0, 00:30:19.822 "state": "configuring", 00:30:19.822 "raid_level": "raid1", 00:30:19.822 "superblock": true, 00:30:19.822 "num_base_bdevs": 2, 00:30:19.822 "num_base_bdevs_discovered": 1, 00:30:19.822 "num_base_bdevs_operational": 2, 00:30:19.822 "base_bdevs_list": [ 00:30:19.822 { 00:30:19.822 "name": "BaseBdev1", 00:30:19.822 "uuid": "e47306ad-58de-463c-88bf-0e24cd760af9", 00:30:19.822 "is_configured": true, 00:30:19.822 "data_offset": 256, 00:30:19.822 "data_size": 7936 00:30:19.822 }, 00:30:19.822 { 00:30:19.822 "name": "BaseBdev2", 00:30:19.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:19.822 "is_configured": false, 00:30:19.822 "data_offset": 0, 00:30:19.822 "data_size": 0 00:30:19.822 } 00:30:19.822 ] 00:30:19.822 }' 00:30:19.822 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:19.822 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:20.388 06:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:20.645 [2024-07-25 06:46:34.077148] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:20.645 [2024-07-25 06:46:34.077182] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c3ce0 name Existed_Raid, state configuring 00:30:20.645 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 
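Around bdev_raid.sh@257-@264 the test creates BaseBdev1 as a malloc bdev with 32-byte interleaved metadata and then (re)creates Existed_Raid as a raid1 volume with an on-disk superblock; with only one of two base bdevs present, the raid stays in the "configuring" state checked by verify_raid_bdev_state. A condensed sketch of that RPC sequence, using the socket, names, and parameters shown in the trace (the jq filter mirrors the one the helper uses):

    rpc="$rootdir/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # 32 MiB malloc bdev, 4096-byte blocks, 32-byte interleaved metadata (-m 32 -i)
    $rpc bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
    # raid1 with superblock (-s); BaseBdev2 does not exist yet, so the raid is "configuring"
    $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
    # creating BaseBdev2 the same way (as the trace does later) brings the raid online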
00:30:20.903 [2024-07-25 06:46:34.301762] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:20.903 [2024-07-25 06:46:34.303084] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:20.903 [2024-07-25 06:46:34.303113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:20.903 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:20.904 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:20.904 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.161 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:21.161 "name": "Existed_Raid", 00:30:21.161 "uuid": "c48bbcf2-9a63-49b3-9d78-7082f70300a5", 00:30:21.161 "strip_size_kb": 0, 00:30:21.161 "state": "configuring", 00:30:21.161 "raid_level": "raid1", 00:30:21.161 "superblock": true, 00:30:21.161 "num_base_bdevs": 2, 00:30:21.161 "num_base_bdevs_discovered": 1, 00:30:21.161 "num_base_bdevs_operational": 2, 00:30:21.161 "base_bdevs_list": [ 00:30:21.161 { 00:30:21.161 "name": "BaseBdev1", 00:30:21.161 "uuid": "e47306ad-58de-463c-88bf-0e24cd760af9", 00:30:21.161 "is_configured": true, 00:30:21.161 "data_offset": 256, 00:30:21.161 "data_size": 7936 00:30:21.161 }, 00:30:21.161 { 00:30:21.161 "name": "BaseBdev2", 00:30:21.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.161 "is_configured": false, 00:30:21.161 "data_offset": 0, 00:30:21.161 "data_size": 0 00:30:21.161 } 00:30:21.161 ] 00:30:21.161 }' 00:30:21.161 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:21.161 06:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@10 -- # set +x 00:30:21.727 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:30:21.985 [2024-07-25 06:46:35.359767] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:21.985 [2024-07-25 06:46:35.359885] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13c3930 00:30:21.985 [2024-07-25 06:46:35.359897] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:21.986 [2024-07-25 06:46:35.359955] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1560360 00:30:21.986 [2024-07-25 06:46:35.360028] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13c3930 00:30:21.986 [2024-07-25 06:46:35.360037] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13c3930 00:30:21.986 [2024-07-25 06:46:35.360086] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:21.986 BaseBdev2 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:21.986 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:22.244 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:22.503 [ 00:30:22.503 { 00:30:22.503 "name": "BaseBdev2", 00:30:22.503 "aliases": [ 00:30:22.503 "24420c4a-6086-4e7e-9b38-0f19d9e051d1" 00:30:22.503 ], 00:30:22.503 "product_name": "Malloc disk", 00:30:22.503 "block_size": 4128, 00:30:22.503 "num_blocks": 8192, 00:30:22.503 "uuid": "24420c4a-6086-4e7e-9b38-0f19d9e051d1", 00:30:22.503 "md_size": 32, 00:30:22.503 "md_interleave": true, 00:30:22.503 "dif_type": 0, 00:30:22.503 "assigned_rate_limits": { 00:30:22.503 "rw_ios_per_sec": 0, 00:30:22.503 "rw_mbytes_per_sec": 0, 00:30:22.503 "r_mbytes_per_sec": 0, 00:30:22.503 "w_mbytes_per_sec": 0 00:30:22.503 }, 00:30:22.503 "claimed": true, 00:30:22.503 "claim_type": "exclusive_write", 00:30:22.503 "zoned": false, 00:30:22.503 "supported_io_types": { 00:30:22.503 "read": true, 00:30:22.503 "write": true, 00:30:22.503 "unmap": true, 00:30:22.503 "flush": true, 00:30:22.503 "reset": true, 00:30:22.503 "nvme_admin": false, 00:30:22.503 "nvme_io": false, 00:30:22.503 "nvme_io_md": false, 00:30:22.503 "write_zeroes": true, 00:30:22.503 "zcopy": true, 00:30:22.503 "get_zone_info": false, 00:30:22.503 
"zone_management": false, 00:30:22.503 "zone_append": false, 00:30:22.503 "compare": false, 00:30:22.503 "compare_and_write": false, 00:30:22.503 "abort": true, 00:30:22.503 "seek_hole": false, 00:30:22.503 "seek_data": false, 00:30:22.503 "copy": true, 00:30:22.503 "nvme_iov_md": false 00:30:22.503 }, 00:30:22.503 "memory_domains": [ 00:30:22.503 { 00:30:22.503 "dma_device_id": "system", 00:30:22.503 "dma_device_type": 1 00:30:22.503 }, 00:30:22.503 { 00:30:22.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:22.503 "dma_device_type": 2 00:30:22.503 } 00:30:22.503 ], 00:30:22.503 "driver_specific": {} 00:30:22.503 } 00:30:22.503 ] 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:22.503 06:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:22.762 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:22.762 "name": "Existed_Raid", 00:30:22.762 "uuid": "c48bbcf2-9a63-49b3-9d78-7082f70300a5", 00:30:22.762 "strip_size_kb": 0, 00:30:22.762 "state": "online", 00:30:22.762 "raid_level": "raid1", 00:30:22.762 "superblock": true, 00:30:22.762 "num_base_bdevs": 2, 00:30:22.762 "num_base_bdevs_discovered": 2, 00:30:22.762 "num_base_bdevs_operational": 2, 00:30:22.762 "base_bdevs_list": [ 00:30:22.762 { 00:30:22.762 "name": "BaseBdev1", 00:30:22.762 "uuid": "e47306ad-58de-463c-88bf-0e24cd760af9", 00:30:22.762 "is_configured": true, 00:30:22.762 "data_offset": 256, 00:30:22.762 "data_size": 7936 00:30:22.762 }, 00:30:22.762 { 00:30:22.762 "name": "BaseBdev2", 00:30:22.762 "uuid": "24420c4a-6086-4e7e-9b38-0f19d9e051d1", 
00:30:22.762 "is_configured": true, 00:30:22.762 "data_offset": 256, 00:30:22.762 "data_size": 7936 00:30:22.762 } 00:30:22.762 ] 00:30:22.762 }' 00:30:22.762 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:22.762 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:23.329 [2024-07-25 06:46:36.860001] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:23.329 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:23.329 "name": "Existed_Raid", 00:30:23.329 "aliases": [ 00:30:23.329 "c48bbcf2-9a63-49b3-9d78-7082f70300a5" 00:30:23.329 ], 00:30:23.330 "product_name": "Raid Volume", 00:30:23.330 "block_size": 4128, 00:30:23.330 "num_blocks": 7936, 00:30:23.330 "uuid": "c48bbcf2-9a63-49b3-9d78-7082f70300a5", 00:30:23.330 "md_size": 32, 00:30:23.330 "md_interleave": true, 00:30:23.330 "dif_type": 0, 00:30:23.330 "assigned_rate_limits": { 00:30:23.330 "rw_ios_per_sec": 0, 00:30:23.330 "rw_mbytes_per_sec": 0, 00:30:23.330 "r_mbytes_per_sec": 0, 00:30:23.330 "w_mbytes_per_sec": 0 00:30:23.330 }, 00:30:23.330 "claimed": false, 00:30:23.330 "zoned": false, 00:30:23.330 "supported_io_types": { 00:30:23.330 "read": true, 00:30:23.330 "write": true, 00:30:23.330 "unmap": false, 00:30:23.330 "flush": false, 00:30:23.330 "reset": true, 00:30:23.330 "nvme_admin": false, 00:30:23.330 "nvme_io": false, 00:30:23.330 "nvme_io_md": false, 00:30:23.330 "write_zeroes": true, 00:30:23.330 "zcopy": false, 00:30:23.330 "get_zone_info": false, 00:30:23.330 "zone_management": false, 00:30:23.330 "zone_append": false, 00:30:23.330 "compare": false, 00:30:23.330 "compare_and_write": false, 00:30:23.330 "abort": false, 00:30:23.330 "seek_hole": false, 00:30:23.330 "seek_data": false, 00:30:23.330 "copy": false, 00:30:23.330 "nvme_iov_md": false 00:30:23.330 }, 00:30:23.330 "memory_domains": [ 00:30:23.330 { 00:30:23.330 "dma_device_id": "system", 00:30:23.330 "dma_device_type": 1 00:30:23.330 }, 00:30:23.330 { 00:30:23.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:23.330 "dma_device_type": 2 00:30:23.330 }, 00:30:23.330 { 00:30:23.330 "dma_device_id": "system", 00:30:23.330 "dma_device_type": 1 00:30:23.330 }, 00:30:23.330 { 00:30:23.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:23.330 
"dma_device_type": 2 00:30:23.330 } 00:30:23.330 ], 00:30:23.330 "driver_specific": { 00:30:23.330 "raid": { 00:30:23.330 "uuid": "c48bbcf2-9a63-49b3-9d78-7082f70300a5", 00:30:23.330 "strip_size_kb": 0, 00:30:23.330 "state": "online", 00:30:23.330 "raid_level": "raid1", 00:30:23.330 "superblock": true, 00:30:23.330 "num_base_bdevs": 2, 00:30:23.330 "num_base_bdevs_discovered": 2, 00:30:23.330 "num_base_bdevs_operational": 2, 00:30:23.330 "base_bdevs_list": [ 00:30:23.330 { 00:30:23.330 "name": "BaseBdev1", 00:30:23.330 "uuid": "e47306ad-58de-463c-88bf-0e24cd760af9", 00:30:23.330 "is_configured": true, 00:30:23.330 "data_offset": 256, 00:30:23.330 "data_size": 7936 00:30:23.330 }, 00:30:23.330 { 00:30:23.330 "name": "BaseBdev2", 00:30:23.330 "uuid": "24420c4a-6086-4e7e-9b38-0f19d9e051d1", 00:30:23.330 "is_configured": true, 00:30:23.330 "data_offset": 256, 00:30:23.330 "data_size": 7936 00:30:23.330 } 00:30:23.330 ] 00:30:23.330 } 00:30:23.330 } 00:30:23.330 }' 00:30:23.330 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:23.589 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:30:23.589 BaseBdev2' 00:30:23.589 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:23.589 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:30:23.589 06:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:23.589 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:23.589 "name": "BaseBdev1", 00:30:23.589 "aliases": [ 00:30:23.589 "e47306ad-58de-463c-88bf-0e24cd760af9" 00:30:23.589 ], 00:30:23.589 "product_name": "Malloc disk", 00:30:23.589 "block_size": 4128, 00:30:23.589 "num_blocks": 8192, 00:30:23.589 "uuid": "e47306ad-58de-463c-88bf-0e24cd760af9", 00:30:23.589 "md_size": 32, 00:30:23.589 "md_interleave": true, 00:30:23.589 "dif_type": 0, 00:30:23.589 "assigned_rate_limits": { 00:30:23.589 "rw_ios_per_sec": 0, 00:30:23.589 "rw_mbytes_per_sec": 0, 00:30:23.589 "r_mbytes_per_sec": 0, 00:30:23.589 "w_mbytes_per_sec": 0 00:30:23.589 }, 00:30:23.589 "claimed": true, 00:30:23.589 "claim_type": "exclusive_write", 00:30:23.589 "zoned": false, 00:30:23.589 "supported_io_types": { 00:30:23.589 "read": true, 00:30:23.589 "write": true, 00:30:23.589 "unmap": true, 00:30:23.589 "flush": true, 00:30:23.589 "reset": true, 00:30:23.589 "nvme_admin": false, 00:30:23.589 "nvme_io": false, 00:30:23.589 "nvme_io_md": false, 00:30:23.589 "write_zeroes": true, 00:30:23.589 "zcopy": true, 00:30:23.589 "get_zone_info": false, 00:30:23.589 "zone_management": false, 00:30:23.589 "zone_append": false, 00:30:23.589 "compare": false, 00:30:23.589 "compare_and_write": false, 00:30:23.589 "abort": true, 00:30:23.589 "seek_hole": false, 00:30:23.589 "seek_data": false, 00:30:23.589 "copy": true, 00:30:23.589 "nvme_iov_md": false 00:30:23.589 }, 00:30:23.589 "memory_domains": [ 00:30:23.589 { 00:30:23.589 "dma_device_id": "system", 00:30:23.589 "dma_device_type": 1 00:30:23.589 }, 00:30:23.589 { 00:30:23.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:23.589 "dma_device_type": 2 
00:30:23.589 } 00:30:23.589 ], 00:30:23.589 "driver_specific": {} 00:30:23.589 }' 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:23.847 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:24.106 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:24.364 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:24.364 "name": "BaseBdev2", 00:30:24.364 "aliases": [ 00:30:24.364 "24420c4a-6086-4e7e-9b38-0f19d9e051d1" 00:30:24.364 ], 00:30:24.364 "product_name": "Malloc disk", 00:30:24.364 "block_size": 4128, 00:30:24.364 "num_blocks": 8192, 00:30:24.364 "uuid": "24420c4a-6086-4e7e-9b38-0f19d9e051d1", 00:30:24.364 "md_size": 32, 00:30:24.364 "md_interleave": true, 00:30:24.364 "dif_type": 0, 00:30:24.364 "assigned_rate_limits": { 00:30:24.364 "rw_ios_per_sec": 0, 00:30:24.364 "rw_mbytes_per_sec": 0, 00:30:24.364 "r_mbytes_per_sec": 0, 00:30:24.364 "w_mbytes_per_sec": 0 00:30:24.365 }, 00:30:24.365 "claimed": true, 00:30:24.365 "claim_type": "exclusive_write", 00:30:24.365 "zoned": false, 00:30:24.365 "supported_io_types": { 00:30:24.365 "read": true, 00:30:24.365 "write": true, 00:30:24.365 "unmap": true, 00:30:24.365 "flush": true, 00:30:24.365 "reset": true, 00:30:24.365 "nvme_admin": false, 00:30:24.365 "nvme_io": false, 00:30:24.365 "nvme_io_md": false, 00:30:24.365 "write_zeroes": true, 00:30:24.365 "zcopy": true, 00:30:24.365 "get_zone_info": false, 00:30:24.365 "zone_management": false, 00:30:24.365 "zone_append": false, 00:30:24.365 "compare": false, 00:30:24.365 "compare_and_write": false, 00:30:24.365 "abort": true, 00:30:24.365 "seek_hole": false, 00:30:24.365 "seek_data": false, 00:30:24.365 "copy": true, 00:30:24.365 "nvme_iov_md": false 00:30:24.365 }, 00:30:24.365 "memory_domains": [ 00:30:24.365 { 
00:30:24.365 "dma_device_id": "system", 00:30:24.365 "dma_device_type": 1 00:30:24.365 }, 00:30:24.365 { 00:30:24.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:24.365 "dma_device_type": 2 00:30:24.365 } 00:30:24.365 ], 00:30:24.365 "driver_specific": {} 00:30:24.365 }' 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:24.365 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:24.624 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:24.624 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:24.624 06:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:24.624 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:24.624 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:24.624 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:24.884 [2024-07-25 06:46:38.255483] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:24.884 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.143 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:25.143 "name": "Existed_Raid", 00:30:25.143 "uuid": "c48bbcf2-9a63-49b3-9d78-7082f70300a5", 00:30:25.143 "strip_size_kb": 0, 00:30:25.143 "state": "online", 00:30:25.143 "raid_level": "raid1", 00:30:25.143 "superblock": true, 00:30:25.143 "num_base_bdevs": 2, 00:30:25.143 "num_base_bdevs_discovered": 1, 00:30:25.143 "num_base_bdevs_operational": 1, 00:30:25.143 "base_bdevs_list": [ 00:30:25.143 { 00:30:25.143 "name": null, 00:30:25.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:25.143 "is_configured": false, 00:30:25.143 "data_offset": 256, 00:30:25.143 "data_size": 7936 00:30:25.143 }, 00:30:25.143 { 00:30:25.143 "name": "BaseBdev2", 00:30:25.143 "uuid": "24420c4a-6086-4e7e-9b38-0f19d9e051d1", 00:30:25.143 "is_configured": true, 00:30:25.143 "data_offset": 256, 00:30:25.143 "data_size": 7936 00:30:25.143 } 00:30:25.143 ] 00:30:25.143 }' 00:30:25.143 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:25.143 06:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:25.711 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:30:25.971 [2024-07-25 06:46:39.431692] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:25.971 [2024-07-25 06:46:39.431772] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:25.971 [2024-07-25 06:46:39.442565] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:25.971 [2024-07-25 06:46:39.442592] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:25.971 [2024-07-25 06:46:39.442602] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c3930 name Existed_Raid, state offline 00:30:25.971 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:25.971 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:25.971 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.971 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1280202 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1280202 ']' 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1280202 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1280202 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1280202' 00:30:26.231 killing process with pid 1280202 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1280202 00:30:26.231 [2024-07-25 06:46:39.755007] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:26.231 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1280202 00:30:26.231 [2024-07-25 06:46:39.755861] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:26.490 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:30:26.490 00:30:26.490 real 0m10.020s 00:30:26.490 user 0m17.746s 00:30:26.490 sys 0m1.931s 00:30:26.490 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:26.490 06:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:26.490 ************************************ 00:30:26.490 END TEST raid_state_function_test_sb_md_interleaved 00:30:26.490 ************************************ 
00:30:26.490 06:46:39 bdev_raid -- bdev/bdev_raid.sh@993 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:30:26.490 06:46:39 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:30:26.490 06:46:39 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:26.490 06:46:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:26.490 ************************************ 00:30:26.490 START TEST raid_superblock_test_md_interleaved 00:30:26.490 ************************************ 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:30:26.490 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@414 -- # local strip_size 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@427 -- # raid_pid=1282084 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@428 -- # waitforlisten 1282084 /var/tmp/spdk-raid.sock 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1282084 ']' 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:30:26.491 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:26.491 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:26.750 [2024-07-25 06:46:40.084463] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:30:26.750 [2024-07-25 06:46:40.084521] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1282084 ] 00:30:26.750 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.750 EAL: Requested device 0000:3d:01.0 cannot be used 00:30:26.750 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.750 EAL: Requested device 0000:3d:01.1 cannot be used 00:30:26.750 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.750 EAL: Requested device 0000:3d:01.2 cannot be used 00:30:26.750 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.750 EAL: Requested device 0000:3d:01.3 cannot be used 00:30:26.750 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.750 EAL: Requested device 0000:3d:01.4 cannot be used 00:30:26.750 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:01.5 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:01.6 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:01.7 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.0 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.1 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.2 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.3 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.4 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.5 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.6 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3d:02.7 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.0 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.1 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.2 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 
0000:3f:01.3 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.4 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.5 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.6 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:01.7 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.0 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.1 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.2 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.3 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.4 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.5 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.6 cannot be used 00:30:26.751 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:26.751 EAL: Requested device 0000:3f:02.7 cannot be used 00:30:26.751 [2024-07-25 06:46:40.218925] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:26.751 [2024-07-25 06:46:40.262075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:27.010 [2024-07-25 06:46:40.322184] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:27.010 [2024-07-25 06:46:40.322217] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:27.579 06:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:30:27.838 malloc1 00:30:27.838 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:28.098 [2024-07-25 06:46:41.433900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:28.098 [2024-07-25 06:46:41.433943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:28.098 [2024-07-25 06:46:41.433963] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe32510 00:30:28.098 [2024-07-25 06:46:41.433974] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:28.098 [2024-07-25 06:46:41.435318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:28.098 [2024-07-25 06:46:41.435343] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:28.098 pt1 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:28.098 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:30:28.357 malloc2 00:30:28.357 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:28.357 [2024-07-25 06:46:41.883616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:28.357 [2024-07-25 06:46:41.883655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:28.357 [2024-07-25 06:46:41.883673] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc0790 00:30:28.357 [2024-07-25 06:46:41.883684] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:28.357 [2024-07-25 06:46:41.884929] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:28.357 [2024-07-25 06:46:41.884955] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:28.357 pt2 00:30:28.357 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:30:28.357 
06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:30:28.357 06:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:30:28.616 [2024-07-25 06:46:42.108238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:28.616 [2024-07-25 06:46:42.109461] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:28.616 [2024-07-25 06:46:42.109602] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc21c0 00:30:28.616 [2024-07-25 06:46:42.109614] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:28.616 [2024-07-25 06:46:42.109679] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe31030 00:30:28.616 [2024-07-25 06:46:42.109755] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc21c0 00:30:28.616 [2024-07-25 06:46:42.109764] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc21c0 00:30:28.616 [2024-07-25 06:46:42.109816] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:28.617 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:28.876 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:28.876 "name": "raid_bdev1", 00:30:28.876 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:28.876 "strip_size_kb": 0, 00:30:28.876 "state": "online", 00:30:28.876 "raid_level": "raid1", 00:30:28.876 "superblock": true, 00:30:28.876 "num_base_bdevs": 2, 00:30:28.876 "num_base_bdevs_discovered": 2, 00:30:28.876 "num_base_bdevs_operational": 2, 00:30:28.876 "base_bdevs_list": [ 00:30:28.876 { 00:30:28.876 "name": "pt1", 00:30:28.876 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:30:28.876 "is_configured": true, 00:30:28.876 "data_offset": 256, 00:30:28.876 "data_size": 7936 00:30:28.876 }, 00:30:28.876 { 00:30:28.876 "name": "pt2", 00:30:28.876 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:28.876 "is_configured": true, 00:30:28.876 "data_offset": 256, 00:30:28.876 "data_size": 7936 00:30:28.876 } 00:30:28.876 ] 00:30:28.876 }' 00:30:28.876 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:28.876 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:29.443 06:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:29.702 [2024-07-25 06:46:43.095045] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:29.702 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:29.702 "name": "raid_bdev1", 00:30:29.702 "aliases": [ 00:30:29.702 "e3fe5a0c-c73e-488c-a32e-5cff2194924e" 00:30:29.702 ], 00:30:29.702 "product_name": "Raid Volume", 00:30:29.702 "block_size": 4128, 00:30:29.702 "num_blocks": 7936, 00:30:29.702 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:29.702 "md_size": 32, 00:30:29.702 "md_interleave": true, 00:30:29.702 "dif_type": 0, 00:30:29.702 "assigned_rate_limits": { 00:30:29.702 "rw_ios_per_sec": 0, 00:30:29.702 "rw_mbytes_per_sec": 0, 00:30:29.702 "r_mbytes_per_sec": 0, 00:30:29.702 "w_mbytes_per_sec": 0 00:30:29.702 }, 00:30:29.702 "claimed": false, 00:30:29.702 "zoned": false, 00:30:29.702 "supported_io_types": { 00:30:29.702 "read": true, 00:30:29.702 "write": true, 00:30:29.702 "unmap": false, 00:30:29.702 "flush": false, 00:30:29.702 "reset": true, 00:30:29.702 "nvme_admin": false, 00:30:29.702 "nvme_io": false, 00:30:29.702 "nvme_io_md": false, 00:30:29.702 "write_zeroes": true, 00:30:29.702 "zcopy": false, 00:30:29.702 "get_zone_info": false, 00:30:29.702 "zone_management": false, 00:30:29.702 "zone_append": false, 00:30:29.702 "compare": false, 00:30:29.702 "compare_and_write": false, 00:30:29.702 "abort": false, 00:30:29.702 "seek_hole": false, 00:30:29.702 "seek_data": false, 00:30:29.702 "copy": false, 00:30:29.702 "nvme_iov_md": false 00:30:29.702 }, 00:30:29.702 "memory_domains": [ 00:30:29.702 { 00:30:29.702 "dma_device_id": "system", 00:30:29.702 "dma_device_type": 1 00:30:29.702 }, 00:30:29.702 { 00:30:29.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.702 "dma_device_type": 2 00:30:29.702 }, 00:30:29.702 { 
00:30:29.702 "dma_device_id": "system", 00:30:29.702 "dma_device_type": 1 00:30:29.702 }, 00:30:29.702 { 00:30:29.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.702 "dma_device_type": 2 00:30:29.702 } 00:30:29.702 ], 00:30:29.702 "driver_specific": { 00:30:29.702 "raid": { 00:30:29.702 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:29.702 "strip_size_kb": 0, 00:30:29.702 "state": "online", 00:30:29.702 "raid_level": "raid1", 00:30:29.702 "superblock": true, 00:30:29.702 "num_base_bdevs": 2, 00:30:29.702 "num_base_bdevs_discovered": 2, 00:30:29.702 "num_base_bdevs_operational": 2, 00:30:29.702 "base_bdevs_list": [ 00:30:29.702 { 00:30:29.702 "name": "pt1", 00:30:29.702 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:29.702 "is_configured": true, 00:30:29.702 "data_offset": 256, 00:30:29.702 "data_size": 7936 00:30:29.702 }, 00:30:29.702 { 00:30:29.702 "name": "pt2", 00:30:29.702 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:29.702 "is_configured": true, 00:30:29.702 "data_offset": 256, 00:30:29.702 "data_size": 7936 00:30:29.702 } 00:30:29.702 ] 00:30:29.702 } 00:30:29.702 } 00:30:29.702 }' 00:30:29.702 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:29.702 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:29.702 pt2' 00:30:29.702 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:29.702 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:29.702 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:29.960 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:29.960 "name": "pt1", 00:30:29.960 "aliases": [ 00:30:29.960 "00000000-0000-0000-0000-000000000001" 00:30:29.960 ], 00:30:29.960 "product_name": "passthru", 00:30:29.960 "block_size": 4128, 00:30:29.960 "num_blocks": 8192, 00:30:29.960 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:29.960 "md_size": 32, 00:30:29.960 "md_interleave": true, 00:30:29.960 "dif_type": 0, 00:30:29.960 "assigned_rate_limits": { 00:30:29.960 "rw_ios_per_sec": 0, 00:30:29.960 "rw_mbytes_per_sec": 0, 00:30:29.960 "r_mbytes_per_sec": 0, 00:30:29.960 "w_mbytes_per_sec": 0 00:30:29.960 }, 00:30:29.960 "claimed": true, 00:30:29.960 "claim_type": "exclusive_write", 00:30:29.960 "zoned": false, 00:30:29.960 "supported_io_types": { 00:30:29.960 "read": true, 00:30:29.960 "write": true, 00:30:29.960 "unmap": true, 00:30:29.960 "flush": true, 00:30:29.960 "reset": true, 00:30:29.960 "nvme_admin": false, 00:30:29.960 "nvme_io": false, 00:30:29.960 "nvme_io_md": false, 00:30:29.960 "write_zeroes": true, 00:30:29.960 "zcopy": true, 00:30:29.960 "get_zone_info": false, 00:30:29.960 "zone_management": false, 00:30:29.960 "zone_append": false, 00:30:29.960 "compare": false, 00:30:29.960 "compare_and_write": false, 00:30:29.960 "abort": true, 00:30:29.960 "seek_hole": false, 00:30:29.960 "seek_data": false, 00:30:29.960 "copy": true, 00:30:29.960 "nvme_iov_md": false 00:30:29.960 }, 00:30:29.960 "memory_domains": [ 00:30:29.960 { 00:30:29.960 "dma_device_id": "system", 00:30:29.960 "dma_device_type": 1 00:30:29.960 }, 00:30:29.960 { 
00:30:29.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.960 "dma_device_type": 2 00:30:29.960 } 00:30:29.960 ], 00:30:29.960 "driver_specific": { 00:30:29.960 "passthru": { 00:30:29.960 "name": "pt1", 00:30:29.960 "base_bdev_name": "malloc1" 00:30:29.960 } 00:30:29.960 } 00:30:29.960 }' 00:30:29.960 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:29.960 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:29.960 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:29.960 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:30.218 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:30.477 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:30.477 "name": "pt2", 00:30:30.477 "aliases": [ 00:30:30.477 "00000000-0000-0000-0000-000000000002" 00:30:30.477 ], 00:30:30.477 "product_name": "passthru", 00:30:30.477 "block_size": 4128, 00:30:30.477 "num_blocks": 8192, 00:30:30.477 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:30.477 "md_size": 32, 00:30:30.477 "md_interleave": true, 00:30:30.477 "dif_type": 0, 00:30:30.477 "assigned_rate_limits": { 00:30:30.477 "rw_ios_per_sec": 0, 00:30:30.477 "rw_mbytes_per_sec": 0, 00:30:30.477 "r_mbytes_per_sec": 0, 00:30:30.477 "w_mbytes_per_sec": 0 00:30:30.477 }, 00:30:30.477 "claimed": true, 00:30:30.477 "claim_type": "exclusive_write", 00:30:30.477 "zoned": false, 00:30:30.477 "supported_io_types": { 00:30:30.477 "read": true, 00:30:30.477 "write": true, 00:30:30.477 "unmap": true, 00:30:30.477 "flush": true, 00:30:30.477 "reset": true, 00:30:30.477 "nvme_admin": false, 00:30:30.477 "nvme_io": false, 00:30:30.477 "nvme_io_md": false, 00:30:30.477 "write_zeroes": true, 00:30:30.477 "zcopy": true, 00:30:30.477 "get_zone_info": false, 00:30:30.477 "zone_management": false, 00:30:30.477 "zone_append": false, 00:30:30.477 "compare": false, 00:30:30.477 "compare_and_write": false, 00:30:30.477 "abort": true, 00:30:30.477 "seek_hole": false, 00:30:30.477 "seek_data": false, 00:30:30.477 "copy": true, 00:30:30.477 
"nvme_iov_md": false 00:30:30.477 }, 00:30:30.477 "memory_domains": [ 00:30:30.477 { 00:30:30.477 "dma_device_id": "system", 00:30:30.477 "dma_device_type": 1 00:30:30.477 }, 00:30:30.477 { 00:30:30.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:30.477 "dma_device_type": 2 00:30:30.477 } 00:30:30.477 ], 00:30:30.477 "driver_specific": { 00:30:30.477 "passthru": { 00:30:30.477 "name": "pt2", 00:30:30.477 "base_bdev_name": "malloc2" 00:30:30.477 } 00:30:30.477 } 00:30:30.477 }' 00:30:30.477 06:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.477 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:30.736 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:30:31.341 [2024-07-25 06:46:44.739368] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:31.341 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=e3fe5a0c-c73e-488c-a32e-5cff2194924e 00:30:31.341 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' -z e3fe5a0c-c73e-488c-a32e-5cff2194924e ']' 00:30:31.341 06:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:31.600 [2024-07-25 06:46:44.983763] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:31.600 [2024-07-25 06:46:44.983781] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:31.600 [2024-07-25 06:46:44.983830] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:31.600 [2024-07-25 06:46:44.983880] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:31.600 [2024-07-25 06:46:44.983891] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc21c0 name raid_bdev1, state offline 00:30:31.600 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:31.600 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:30:31.859 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:30:31.859 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:30:31.859 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:30:31.860 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:32.119 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:30:32.119 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:32.378 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:30:32.378 06:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:32.637 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:32.896 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:32.896 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:32.896 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:32.896 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:32.896 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:32.896 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:33.155 [2024-07-25 06:46:46.676350] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:30:33.155 [2024-07-25 06:46:46.677576] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:30:33.155 [2024-07-25 06:46:46.677628] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:30:33.155 [2024-07-25 06:46:46.677664] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:30:33.155 [2024-07-25 06:46:46.677680] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:33.155 [2024-07-25 06:46:46.677689] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe31930 name raid_bdev1, state configuring 00:30:33.155 request: 00:30:33.155 { 00:30:33.155 "name": "raid_bdev1", 00:30:33.155 "raid_level": "raid1", 00:30:33.155 "base_bdevs": [ 00:30:33.155 "malloc1", 00:30:33.155 "malloc2" 00:30:33.155 ], 00:30:33.155 "superblock": false, 00:30:33.155 "method": "bdev_raid_create", 00:30:33.155 "req_id": 1 00:30:33.155 } 00:30:33.155 Got JSON-RPC error response 00:30:33.155 response: 00:30:33.155 { 00:30:33.155 "code": -17, 00:30:33.155 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:30:33.155 } 00:30:33.155 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:30:33.155 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:30:33.155 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:30:33.155 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:30:33.414 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:33.414 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:30:33.414 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:30:33.414 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:30:33.414 06:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:33.673 [2024-07-25 06:46:47.125481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:33.673 [2024-07-25 06:46:47.125524] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:33.673 [2024-07-25 06:46:47.125542] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe31220 00:30:33.673 [2024-07-25 06:46:47.125553] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:33.673 [2024-07-25 06:46:47.126831] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:30:33.673 [2024-07-25 06:46:47.126856] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:33.673 [2024-07-25 06:46:47.126898] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:33.673 [2024-07-25 06:46:47.126919] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:33.673 pt1 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:33.673 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:33.931 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:33.931 "name": "raid_bdev1", 00:30:33.931 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:33.931 "strip_size_kb": 0, 00:30:33.931 "state": "configuring", 00:30:33.931 "raid_level": "raid1", 00:30:33.931 "superblock": true, 00:30:33.931 "num_base_bdevs": 2, 00:30:33.931 "num_base_bdevs_discovered": 1, 00:30:33.931 "num_base_bdevs_operational": 2, 00:30:33.931 "base_bdevs_list": [ 00:30:33.931 { 00:30:33.931 "name": "pt1", 00:30:33.931 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:33.931 "is_configured": true, 00:30:33.931 "data_offset": 256, 00:30:33.931 "data_size": 7936 00:30:33.931 }, 00:30:33.931 { 00:30:33.931 "name": null, 00:30:33.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:33.931 "is_configured": false, 00:30:33.931 "data_offset": 256, 00:30:33.931 "data_size": 7936 00:30:33.931 } 00:30:33.931 ] 00:30:33.931 }' 00:30:33.931 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:33.931 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:34.496 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:30:34.496 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:30:34.496 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < 
num_base_bdevs )) 00:30:34.496 06:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:34.754 [2024-07-25 06:46:48.124126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:34.754 [2024-07-25 06:46:48.124174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:34.754 [2024-07-25 06:46:48.124191] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc1010 00:30:34.754 [2024-07-25 06:46:48.124202] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:34.754 [2024-07-25 06:46:48.124347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:34.754 [2024-07-25 06:46:48.124361] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:34.754 [2024-07-25 06:46:48.124402] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:34.754 [2024-07-25 06:46:48.124418] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:34.754 [2024-07-25 06:46:48.124493] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc41a0 00:30:34.754 [2024-07-25 06:46:48.124503] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:34.754 [2024-07-25 06:46:48.124553] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc5e60 00:30:34.754 [2024-07-25 06:46:48.124622] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc41a0 00:30:34.754 [2024-07-25 06:46:48.124630] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc41a0 00:30:34.754 [2024-07-25 06:46:48.124682] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:34.754 pt2 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:34.754 06:46:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:34.754 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:35.012 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:35.012 "name": "raid_bdev1", 00:30:35.012 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:35.012 "strip_size_kb": 0, 00:30:35.012 "state": "online", 00:30:35.012 "raid_level": "raid1", 00:30:35.012 "superblock": true, 00:30:35.012 "num_base_bdevs": 2, 00:30:35.012 "num_base_bdevs_discovered": 2, 00:30:35.012 "num_base_bdevs_operational": 2, 00:30:35.012 "base_bdevs_list": [ 00:30:35.012 { 00:30:35.012 "name": "pt1", 00:30:35.012 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:35.012 "is_configured": true, 00:30:35.012 "data_offset": 256, 00:30:35.012 "data_size": 7936 00:30:35.012 }, 00:30:35.012 { 00:30:35.012 "name": "pt2", 00:30:35.012 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:35.012 "is_configured": true, 00:30:35.012 "data_offset": 256, 00:30:35.012 "data_size": 7936 00:30:35.012 } 00:30:35.012 ] 00:30:35.012 }' 00:30:35.012 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:35.012 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:35.577 06:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:36.143 [2024-07-25 06:46:49.455865] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:36.143 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:36.143 "name": "raid_bdev1", 00:30:36.143 "aliases": [ 00:30:36.143 "e3fe5a0c-c73e-488c-a32e-5cff2194924e" 00:30:36.143 ], 00:30:36.143 "product_name": "Raid Volume", 00:30:36.143 "block_size": 4128, 00:30:36.143 "num_blocks": 7936, 00:30:36.143 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:36.143 "md_size": 32, 00:30:36.143 "md_interleave": true, 00:30:36.143 "dif_type": 0, 00:30:36.143 "assigned_rate_limits": { 00:30:36.143 "rw_ios_per_sec": 0, 00:30:36.143 "rw_mbytes_per_sec": 0, 00:30:36.143 "r_mbytes_per_sec": 0, 00:30:36.143 "w_mbytes_per_sec": 0 00:30:36.143 }, 00:30:36.143 "claimed": false, 00:30:36.143 "zoned": false, 00:30:36.143 "supported_io_types": { 00:30:36.143 "read": true, 00:30:36.143 
"write": true, 00:30:36.143 "unmap": false, 00:30:36.143 "flush": false, 00:30:36.144 "reset": true, 00:30:36.144 "nvme_admin": false, 00:30:36.144 "nvme_io": false, 00:30:36.144 "nvme_io_md": false, 00:30:36.144 "write_zeroes": true, 00:30:36.144 "zcopy": false, 00:30:36.144 "get_zone_info": false, 00:30:36.144 "zone_management": false, 00:30:36.144 "zone_append": false, 00:30:36.144 "compare": false, 00:30:36.144 "compare_and_write": false, 00:30:36.144 "abort": false, 00:30:36.144 "seek_hole": false, 00:30:36.144 "seek_data": false, 00:30:36.144 "copy": false, 00:30:36.144 "nvme_iov_md": false 00:30:36.144 }, 00:30:36.144 "memory_domains": [ 00:30:36.144 { 00:30:36.144 "dma_device_id": "system", 00:30:36.144 "dma_device_type": 1 00:30:36.144 }, 00:30:36.144 { 00:30:36.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.144 "dma_device_type": 2 00:30:36.144 }, 00:30:36.144 { 00:30:36.144 "dma_device_id": "system", 00:30:36.144 "dma_device_type": 1 00:30:36.144 }, 00:30:36.144 { 00:30:36.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.144 "dma_device_type": 2 00:30:36.144 } 00:30:36.144 ], 00:30:36.144 "driver_specific": { 00:30:36.144 "raid": { 00:30:36.144 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:36.144 "strip_size_kb": 0, 00:30:36.144 "state": "online", 00:30:36.144 "raid_level": "raid1", 00:30:36.144 "superblock": true, 00:30:36.144 "num_base_bdevs": 2, 00:30:36.144 "num_base_bdevs_discovered": 2, 00:30:36.144 "num_base_bdevs_operational": 2, 00:30:36.144 "base_bdevs_list": [ 00:30:36.144 { 00:30:36.144 "name": "pt1", 00:30:36.144 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:36.144 "is_configured": true, 00:30:36.144 "data_offset": 256, 00:30:36.144 "data_size": 7936 00:30:36.144 }, 00:30:36.144 { 00:30:36.144 "name": "pt2", 00:30:36.144 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:36.144 "is_configured": true, 00:30:36.144 "data_offset": 256, 00:30:36.144 "data_size": 7936 00:30:36.144 } 00:30:36.144 ] 00:30:36.144 } 00:30:36.144 } 00:30:36.144 }' 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:36.144 pt2' 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:36.144 "name": "pt1", 00:30:36.144 "aliases": [ 00:30:36.144 "00000000-0000-0000-0000-000000000001" 00:30:36.144 ], 00:30:36.144 "product_name": "passthru", 00:30:36.144 "block_size": 4128, 00:30:36.144 "num_blocks": 8192, 00:30:36.144 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:36.144 "md_size": 32, 00:30:36.144 "md_interleave": true, 00:30:36.144 "dif_type": 0, 00:30:36.144 "assigned_rate_limits": { 00:30:36.144 "rw_ios_per_sec": 0, 00:30:36.144 "rw_mbytes_per_sec": 0, 00:30:36.144 "r_mbytes_per_sec": 0, 00:30:36.144 "w_mbytes_per_sec": 0 00:30:36.144 }, 00:30:36.144 "claimed": true, 00:30:36.144 "claim_type": 
"exclusive_write", 00:30:36.144 "zoned": false, 00:30:36.144 "supported_io_types": { 00:30:36.144 "read": true, 00:30:36.144 "write": true, 00:30:36.144 "unmap": true, 00:30:36.144 "flush": true, 00:30:36.144 "reset": true, 00:30:36.144 "nvme_admin": false, 00:30:36.144 "nvme_io": false, 00:30:36.144 "nvme_io_md": false, 00:30:36.144 "write_zeroes": true, 00:30:36.144 "zcopy": true, 00:30:36.144 "get_zone_info": false, 00:30:36.144 "zone_management": false, 00:30:36.144 "zone_append": false, 00:30:36.144 "compare": false, 00:30:36.144 "compare_and_write": false, 00:30:36.144 "abort": true, 00:30:36.144 "seek_hole": false, 00:30:36.144 "seek_data": false, 00:30:36.144 "copy": true, 00:30:36.144 "nvme_iov_md": false 00:30:36.144 }, 00:30:36.144 "memory_domains": [ 00:30:36.144 { 00:30:36.144 "dma_device_id": "system", 00:30:36.144 "dma_device_type": 1 00:30:36.144 }, 00:30:36.144 { 00:30:36.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.144 "dma_device_type": 2 00:30:36.144 } 00:30:36.144 ], 00:30:36.144 "driver_specific": { 00:30:36.144 "passthru": { 00:30:36.144 "name": "pt1", 00:30:36.144 "base_bdev_name": "malloc1" 00:30:36.144 } 00:30:36.144 } 00:30:36.144 }' 00:30:36.144 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.402 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.402 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:36.402 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:36.402 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:36.402 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:36.402 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:36.660 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:36.660 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:36.660 06:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:36.660 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:36.660 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:36.660 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:36.660 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:36.660 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:36.918 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:36.918 "name": "pt2", 00:30:36.918 "aliases": [ 00:30:36.918 "00000000-0000-0000-0000-000000000002" 00:30:36.918 ], 00:30:36.918 "product_name": "passthru", 00:30:36.918 "block_size": 4128, 00:30:36.918 "num_blocks": 8192, 00:30:36.918 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:36.918 "md_size": 32, 00:30:36.918 "md_interleave": true, 00:30:36.918 "dif_type": 0, 00:30:36.918 "assigned_rate_limits": { 00:30:36.918 
"rw_ios_per_sec": 0, 00:30:36.918 "rw_mbytes_per_sec": 0, 00:30:36.918 "r_mbytes_per_sec": 0, 00:30:36.918 "w_mbytes_per_sec": 0 00:30:36.918 }, 00:30:36.918 "claimed": true, 00:30:36.918 "claim_type": "exclusive_write", 00:30:36.918 "zoned": false, 00:30:36.918 "supported_io_types": { 00:30:36.918 "read": true, 00:30:36.918 "write": true, 00:30:36.918 "unmap": true, 00:30:36.918 "flush": true, 00:30:36.918 "reset": true, 00:30:36.918 "nvme_admin": false, 00:30:36.918 "nvme_io": false, 00:30:36.918 "nvme_io_md": false, 00:30:36.919 "write_zeroes": true, 00:30:36.919 "zcopy": true, 00:30:36.919 "get_zone_info": false, 00:30:36.919 "zone_management": false, 00:30:36.919 "zone_append": false, 00:30:36.919 "compare": false, 00:30:36.919 "compare_and_write": false, 00:30:36.919 "abort": true, 00:30:36.919 "seek_hole": false, 00:30:36.919 "seek_data": false, 00:30:36.919 "copy": true, 00:30:36.919 "nvme_iov_md": false 00:30:36.919 }, 00:30:36.919 "memory_domains": [ 00:30:36.919 { 00:30:36.919 "dma_device_id": "system", 00:30:36.919 "dma_device_type": 1 00:30:36.919 }, 00:30:36.919 { 00:30:36.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.919 "dma_device_type": 2 00:30:36.919 } 00:30:36.919 ], 00:30:36.919 "driver_specific": { 00:30:36.919 "passthru": { 00:30:36.919 "name": "pt2", 00:30:36.919 "base_bdev_name": "malloc2" 00:30:36.919 } 00:30:36.919 } 00:30:36.919 }' 00:30:36.919 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.919 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.919 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:36.919 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.177 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.435 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:37.436 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:37.436 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:30:37.436 [2024-07-25 06:46:50.955812] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:37.436 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # '[' e3fe5a0c-c73e-488c-a32e-5cff2194924e '!=' e3fe5a0c-c73e-488c-a32e-5cff2194924e ']' 00:30:37.436 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:30:37.436 06:46:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:37.436 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:37.436 06:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:37.694 [2024-07-25 06:46:51.180195] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:37.694 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.261 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:38.261 "name": "raid_bdev1", 00:30:38.261 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:38.261 "strip_size_kb": 0, 00:30:38.261 "state": "online", 00:30:38.261 "raid_level": "raid1", 00:30:38.261 "superblock": true, 00:30:38.261 "num_base_bdevs": 2, 00:30:38.261 "num_base_bdevs_discovered": 1, 00:30:38.261 "num_base_bdevs_operational": 1, 00:30:38.261 "base_bdevs_list": [ 00:30:38.261 { 00:30:38.261 "name": null, 00:30:38.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:38.261 "is_configured": false, 00:30:38.261 "data_offset": 256, 00:30:38.261 "data_size": 7936 00:30:38.261 }, 00:30:38.261 { 00:30:38.261 "name": "pt2", 00:30:38.261 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:38.261 "is_configured": true, 00:30:38.261 "data_offset": 256, 00:30:38.261 "data_size": 7936 00:30:38.261 } 00:30:38.261 ] 00:30:38.261 }' 00:30:38.262 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:38.262 06:46:51 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:38.829 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:39.088 [2024-07-25 
06:46:52.459527] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:39.088 [2024-07-25 06:46:52.459552] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:39.088 [2024-07-25 06:46:52.459601] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:39.088 [2024-07-25 06:46:52.459639] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:39.088 [2024-07-25 06:46:52.459650] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc41a0 name raid_bdev1, state offline 00:30:39.088 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.088 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:30:39.346 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:30:39.346 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:30:39.346 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:30:39.346 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:30:39.346 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:39.605 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:30:39.605 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:30:39.605 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:30:39.605 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:30:39.605 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@534 -- # i=1 00:30:39.605 06:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:39.863 [2024-07-25 06:46:53.409982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:39.863 [2024-07-25 06:46:53.410028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:39.863 [2024-07-25 06:46:53.410045] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc5990 00:30:39.863 [2024-07-25 06:46:53.410056] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:39.863 [2024-07-25 06:46:53.411364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:39.863 [2024-07-25 06:46:53.411388] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:39.863 [2024-07-25 06:46:53.411433] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:39.863 [2024-07-25 06:46:53.411455] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:39.863 [2024-07-25 06:46:53.411516] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc4740 
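[Illustrative aside, not part of the captured output] The trace above re-creates pt2 on top of malloc2, and raid examine finds the on-disk superblock and reassembles raid_bdev1 in a degraded, single-base-bdev online state. The same reassembly can be confirmed by hand over the test's RPC socket using only calls that already appear in this trace (socket path, rpc.py location, bdev names and UUID are all taken from the log; the extra jq field selection is an illustrative addition):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Re-register pt2; examine finds the raid1 superblock and brings raid_bdev1 back online.
    $rpc_py bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    # For the degraded array, expect state "online" and num_base_bdevs_discovered == 1.
    $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state, .num_base_bdevs_discovered'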
00:30:39.863 [2024-07-25 06:46:53.411525] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:39.864 [2024-07-25 06:46:53.411581] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc3120 00:30:39.864 [2024-07-25 06:46:53.411648] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc4740 00:30:39.864 [2024-07-25 06:46:53.411657] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc4740 00:30:39.864 [2024-07-25 06:46:53.411706] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:39.864 pt2 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:40.121 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:40.121 "name": "raid_bdev1", 00:30:40.121 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:40.121 "strip_size_kb": 0, 00:30:40.121 "state": "online", 00:30:40.121 "raid_level": "raid1", 00:30:40.121 "superblock": true, 00:30:40.121 "num_base_bdevs": 2, 00:30:40.121 "num_base_bdevs_discovered": 1, 00:30:40.122 "num_base_bdevs_operational": 1, 00:30:40.122 "base_bdevs_list": [ 00:30:40.122 { 00:30:40.122 "name": null, 00:30:40.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.122 "is_configured": false, 00:30:40.122 "data_offset": 256, 00:30:40.122 "data_size": 7936 00:30:40.122 }, 00:30:40.122 { 00:30:40.122 "name": "pt2", 00:30:40.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:40.122 "is_configured": true, 00:30:40.122 "data_offset": 256, 00:30:40.122 "data_size": 7936 00:30:40.122 } 00:30:40.122 ] 00:30:40.122 }' 00:30:40.122 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:40.122 06:46:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:40.686 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:40.945 [2024-07-25 06:46:54.424633] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:40.945 [2024-07-25 06:46:54.424655] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:40.945 [2024-07-25 06:46:54.424703] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:40.945 [2024-07-25 06:46:54.424742] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:40.945 [2024-07-25 06:46:54.424752] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc4740 name raid_bdev1, state offline 00:30:40.945 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:40.945 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:30:41.203 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:30:41.203 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:30:41.203 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:30:41.203 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:41.462 [2024-07-25 06:46:54.881815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:41.462 [2024-07-25 06:46:54.881855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:41.462 [2024-07-25 06:46:54.881872] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc5bc0 00:30:41.462 [2024-07-25 06:46:54.881883] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:41.462 [2024-07-25 06:46:54.883194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:41.462 [2024-07-25 06:46:54.883219] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:41.462 [2024-07-25 06:46:54.883262] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:41.462 [2024-07-25 06:46:54.883284] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:41.462 [2024-07-25 06:46:54.883358] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:30:41.462 [2024-07-25 06:46:54.883369] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:41.462 [2024-07-25 06:46:54.883382] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc5eb0 name raid_bdev1, state configuring 00:30:41.462 [2024-07-25 06:46:54.883402] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:41.462 [2024-07-25 06:46:54.883451] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc5eb0 00:30:41.462 [2024-07-25 06:46:54.883461] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:41.462 [2024-07-25 06:46:54.883513] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc4710 00:30:41.462 [2024-07-25 06:46:54.883578] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc5eb0 00:30:41.462 [2024-07-25 06:46:54.883586] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc5eb0 00:30:41.462 [2024-07-25 06:46:54.883639] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:41.462 pt1 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:41.462 06:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:41.720 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:41.720 "name": "raid_bdev1", 00:30:41.720 "uuid": "e3fe5a0c-c73e-488c-a32e-5cff2194924e", 00:30:41.720 "strip_size_kb": 0, 00:30:41.721 "state": "online", 00:30:41.721 "raid_level": "raid1", 00:30:41.721 "superblock": true, 00:30:41.721 "num_base_bdevs": 2, 00:30:41.721 "num_base_bdevs_discovered": 1, 00:30:41.721 "num_base_bdevs_operational": 1, 00:30:41.721 "base_bdevs_list": [ 00:30:41.721 { 00:30:41.721 "name": null, 00:30:41.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:41.721 "is_configured": false, 00:30:41.721 "data_offset": 256, 00:30:41.721 "data_size": 7936 00:30:41.721 }, 00:30:41.721 { 00:30:41.721 "name": "pt2", 00:30:41.721 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:41.721 "is_configured": true, 00:30:41.721 "data_offset": 256, 00:30:41.721 "data_size": 7936 00:30:41.721 } 00:30:41.721 ] 00:30:41.721 }' 00:30:41.721 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:41.721 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:42.287 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:30:42.287 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:30:42.546 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:30:42.546 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:30:42.546 06:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:42.804 [2024-07-25 06:46:56.149371] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # '[' e3fe5a0c-c73e-488c-a32e-5cff2194924e '!=' e3fe5a0c-c73e-488c-a32e-5cff2194924e ']' 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@578 -- # killprocess 1282084 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1282084 ']' 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1282084 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1282084 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1282084' 00:30:42.804 killing process with pid 1282084 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@969 -- # kill 1282084 00:30:42.804 [2024-07-25 06:46:56.222355] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:42.804 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@974 -- # wait 1282084 00:30:42.804 [2024-07-25 06:46:56.222404] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:42.804 [2024-07-25 06:46:56.222445] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:42.804 [2024-07-25 06:46:56.222455] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc5eb0 name raid_bdev1, state offline 00:30:42.804 [2024-07-25 06:46:56.238540] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:43.062 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@580 -- # return 0 00:30:43.062 00:30:43.062 real 0m16.390s 00:30:43.062 user 0m29.900s 00:30:43.062 sys 0m2.870s 00:30:43.062 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:43.062 06:46:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:43.062 ************************************ 00:30:43.062 END TEST 
raid_superblock_test_md_interleaved 00:30:43.062 ************************************ 00:30:43.062 06:46:56 bdev_raid -- bdev/bdev_raid.sh@994 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:30:43.062 06:46:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:30:43.062 06:46:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:43.062 06:46:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:43.062 ************************************ 00:30:43.062 START TEST raid_rebuild_test_sb_md_interleaved 00:30:43.062 ************************************ 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false false 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # local verify=false 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # local strip_size 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # local create_arg 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@594 -- # local data_offset 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:30:43.062 06:46:56 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # raid_pid=1285044 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # waitforlisten 1285044 /var/tmp/spdk-raid.sock 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1285044 ']' 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:43.062 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:43.063 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:43.063 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:43.063 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:43.063 06:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:30:43.063 [2024-07-25 06:46:56.553079] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:30:43.063 [2024-07-25 06:46:56.553146] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1285044 ] 00:30:43.063 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:43.063 Zero copy mechanism will not be used. 
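[Illustrative sketch, not part of the captured output] The rebuild test drives this bdevperf instance over its RPC socket. Assuming the bdevperf process launched above is already listening on /var/tmp/spdk-raid.sock, the base bdevs and the raid1 array it exercises can be created with the same RPC calls that appear later in this trace (all paths, flags, and bdev names below are taken from the log):

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # 32 MiB malloc bdevs with 4096-byte blocks and 32-byte interleaved metadata (-m 32 -i),
    # each wrapped in a passthru bdev, as in the traced bdev_malloc_create/bdev_passthru_create calls.
    $rpc_py bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc
    $rpc_py bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    $rpc_py bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc
    $rpc_py bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
    # raid1 with an on-disk superblock (-s), matching the traced bdev_raid_create call.
    $rpc_py bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
    # The verify_raid_bdev_state checks later in the trace read this JSON back.
    $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'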
00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.0 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.1 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.2 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.3 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.4 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.5 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.6 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:01.7 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:02.0 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:02.1 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:02.2 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:02.3 cannot be used 00:30:43.321 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.321 EAL: Requested device 0000:3d:02.4 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3d:02.5 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3d:02.6 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3d:02.7 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.0 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.1 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.2 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.3 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.4 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.5 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.6 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:01.7 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.0 cannot be used 00:30:43.322 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.1 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.2 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.3 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.4 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.5 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.6 cannot be used 00:30:43.322 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:30:43.322 EAL: Requested device 0000:3f:02.7 cannot be used 00:30:43.322 [2024-07-25 06:46:56.687320] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:43.322 [2024-07-25 06:46:56.732214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:43.322 [2024-07-25 06:46:56.789992] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:43.322 [2024-07-25 06:46:56.790025] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:43.888 06:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:43.888 06:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:30:43.888 06:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:30:43.888 06:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:30:44.454 BaseBdev1_malloc 00:30:44.454 06:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:45.062 [2024-07-25 06:46:58.430151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:45.062 [2024-07-25 06:46:58.430197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:45.062 [2024-07-25 06:46:58.430221] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1297f50 00:30:45.062 [2024-07-25 06:46:58.430232] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:45.062 [2024-07-25 06:46:58.431581] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:45.062 [2024-07-25 06:46:58.431607] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:45.062 BaseBdev1 00:30:45.062 06:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:30:45.062 06:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:30:45.320 BaseBdev2_malloc 00:30:45.320 06:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:30:45.579 [2024-07-25 06:46:58.904101] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:30:45.579 [2024-07-25 06:46:58.904154] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:45.579 [2024-07-25 06:46:58.904177] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14261d0 00:30:45.579 [2024-07-25 06:46:58.904188] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:45.579 [2024-07-25 06:46:58.905488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:45.579 [2024-07-25 06:46:58.905513] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:45.579 BaseBdev2 00:30:45.579 06:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:30:45.837 spare_malloc 00:30:45.837 06:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:30:45.837 spare_delay 00:30:45.837 06:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:46.404 [2024-07-25 06:46:59.847056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:46.404 [2024-07-25 06:46:59.847097] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:46.404 [2024-07-25 06:46:59.847124] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1426d50 00:30:46.404 [2024-07-25 06:46:59.847135] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:46.404 [2024-07-25 06:46:59.848381] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:46.404 [2024-07-25 06:46:59.848407] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:46.404 spare 00:30:46.404 06:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:30:46.662 [2024-07-25 06:47:00.075677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:46.662 [2024-07-25 06:47:00.076848] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:46.662 [2024-07-25 06:47:00.076999] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x142a2b0 00:30:46.662 [2024-07-25 06:47:00.077012] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:46.662 [2024-07-25 06:47:00.077077] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128ecd0 00:30:46.662 [2024-07-25 06:47:00.077161] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x142a2b0 00:30:46.662 [2024-07-25 06:47:00.077170] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
raid_bdev1, raid_bdev 0x142a2b0 00:30:46.662 [2024-07-25 06:47:00.077222] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:46.662 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:46.921 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:46.921 "name": "raid_bdev1", 00:30:46.921 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:46.921 "strip_size_kb": 0, 00:30:46.921 "state": "online", 00:30:46.921 "raid_level": "raid1", 00:30:46.921 "superblock": true, 00:30:46.921 "num_base_bdevs": 2, 00:30:46.921 "num_base_bdevs_discovered": 2, 00:30:46.921 "num_base_bdevs_operational": 2, 00:30:46.921 "base_bdevs_list": [ 00:30:46.921 { 00:30:46.921 "name": "BaseBdev1", 00:30:46.921 "uuid": "dea0ca46-279d-5466-9c19-3793f87c238b", 00:30:46.921 "is_configured": true, 00:30:46.921 "data_offset": 256, 00:30:46.921 "data_size": 7936 00:30:46.921 }, 00:30:46.921 { 00:30:46.921 "name": "BaseBdev2", 00:30:46.921 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:46.921 "is_configured": true, 00:30:46.921 "data_offset": 256, 00:30:46.921 "data_size": 7936 00:30:46.921 } 00:30:46.921 ] 00:30:46.921 }' 00:30:46.921 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:46.921 06:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:47.855 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:47.855 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:30:47.855 [2024-07-25 06:47:01.395350] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:48.113 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:30:48.113 
06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.113 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:30:48.113 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:30:48.113 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:30:48.114 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # '[' false = true ']' 00:30:48.114 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:48.373 [2024-07-25 06:47:01.856319] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.373 06:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:48.632 06:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:48.632 "name": "raid_bdev1", 00:30:48.632 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:48.632 "strip_size_kb": 0, 00:30:48.632 "state": "online", 00:30:48.632 "raid_level": "raid1", 00:30:48.632 "superblock": true, 00:30:48.632 "num_base_bdevs": 2, 00:30:48.632 "num_base_bdevs_discovered": 1, 00:30:48.632 "num_base_bdevs_operational": 1, 00:30:48.632 "base_bdevs_list": [ 00:30:48.632 { 00:30:48.632 "name": null, 00:30:48.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:48.632 "is_configured": false, 00:30:48.632 "data_offset": 256, 00:30:48.632 "data_size": 7936 00:30:48.632 }, 00:30:48.632 { 00:30:48.632 "name": "BaseBdev2", 00:30:48.632 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:48.632 "is_configured": true, 00:30:48.632 "data_offset": 256, 00:30:48.632 
"data_size": 7936 00:30:48.632 } 00:30:48.632 ] 00:30:48.632 }' 00:30:48.632 06:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:48.632 06:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:49.199 06:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:49.457 [2024-07-25 06:47:02.871067] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:49.458 [2024-07-25 06:47:02.874459] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128f300 00:30:49.458 [2024-07-25 06:47:02.876576] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:49.458 06:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:50.391 06:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:50.649 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:50.649 "name": "raid_bdev1", 00:30:50.649 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:50.649 "strip_size_kb": 0, 00:30:50.649 "state": "online", 00:30:50.649 "raid_level": "raid1", 00:30:50.649 "superblock": true, 00:30:50.649 "num_base_bdevs": 2, 00:30:50.649 "num_base_bdevs_discovered": 2, 00:30:50.649 "num_base_bdevs_operational": 2, 00:30:50.649 "process": { 00:30:50.649 "type": "rebuild", 00:30:50.649 "target": "spare", 00:30:50.649 "progress": { 00:30:50.649 "blocks": 2816, 00:30:50.649 "percent": 35 00:30:50.649 } 00:30:50.649 }, 00:30:50.649 "base_bdevs_list": [ 00:30:50.649 { 00:30:50.649 "name": "spare", 00:30:50.649 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:50.649 "is_configured": true, 00:30:50.649 "data_offset": 256, 00:30:50.649 "data_size": 7936 00:30:50.649 }, 00:30:50.649 { 00:30:50.649 "name": "BaseBdev2", 00:30:50.649 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:50.649 "is_configured": true, 00:30:50.649 "data_offset": 256, 00:30:50.649 "data_size": 7936 00:30:50.649 } 00:30:50.649 ] 00:30:50.649 }' 00:30:50.649 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:50.649 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:50.649 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:30:50.649 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:50.649 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:50.908 [2024-07-25 06:47:04.321299] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:50.908 [2024-07-25 06:47:04.387603] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:50.908 [2024-07-25 06:47:04.387644] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:50.908 [2024-07-25 06:47:04.387658] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:50.908 [2024-07-25 06:47:04.387666] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:50.908 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.166 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:51.166 "name": "raid_bdev1", 00:30:51.166 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:51.166 "strip_size_kb": 0, 00:30:51.166 "state": "online", 00:30:51.167 "raid_level": "raid1", 00:30:51.167 "superblock": true, 00:30:51.167 "num_base_bdevs": 2, 00:30:51.167 "num_base_bdevs_discovered": 1, 00:30:51.167 "num_base_bdevs_operational": 1, 00:30:51.167 "base_bdevs_list": [ 00:30:51.167 { 00:30:51.167 "name": null, 00:30:51.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:51.167 "is_configured": false, 00:30:51.167 "data_offset": 256, 00:30:51.167 "data_size": 7936 00:30:51.167 }, 00:30:51.167 { 00:30:51.167 "name": "BaseBdev2", 00:30:51.167 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:51.167 "is_configured": true, 00:30:51.167 "data_offset": 256, 00:30:51.167 "data_size": 7936 
00:30:51.167 } 00:30:51.167 ] 00:30:51.167 }' 00:30:51.167 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:51.167 06:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:51.733 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:51.733 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:51.733 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:51.733 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:51.733 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:51.734 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.734 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:51.992 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:51.992 "name": "raid_bdev1", 00:30:51.992 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:51.992 "strip_size_kb": 0, 00:30:51.992 "state": "online", 00:30:51.992 "raid_level": "raid1", 00:30:51.992 "superblock": true, 00:30:51.992 "num_base_bdevs": 2, 00:30:51.992 "num_base_bdevs_discovered": 1, 00:30:51.992 "num_base_bdevs_operational": 1, 00:30:51.992 "base_bdevs_list": [ 00:30:51.992 { 00:30:51.992 "name": null, 00:30:51.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:51.992 "is_configured": false, 00:30:51.992 "data_offset": 256, 00:30:51.992 "data_size": 7936 00:30:51.992 }, 00:30:51.992 { 00:30:51.992 "name": "BaseBdev2", 00:30:51.992 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:51.992 "is_configured": true, 00:30:51.992 "data_offset": 256, 00:30:51.992 "data_size": 7936 00:30:51.992 } 00:30:51.992 ] 00:30:51.992 }' 00:30:51.992 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:51.992 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:51.992 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:51.992 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:51.992 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:52.251 [2024-07-25 06:47:05.726821] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:52.251 [2024-07-25 06:47:05.730240] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128f300 00:30:52.251 [2024-07-25 06:47:05.731574] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:52.251 06:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@678 -- # sleep 1 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@679 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:53.629 "name": "raid_bdev1", 00:30:53.629 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:53.629 "strip_size_kb": 0, 00:30:53.629 "state": "online", 00:30:53.629 "raid_level": "raid1", 00:30:53.629 "superblock": true, 00:30:53.629 "num_base_bdevs": 2, 00:30:53.629 "num_base_bdevs_discovered": 2, 00:30:53.629 "num_base_bdevs_operational": 2, 00:30:53.629 "process": { 00:30:53.629 "type": "rebuild", 00:30:53.629 "target": "spare", 00:30:53.629 "progress": { 00:30:53.629 "blocks": 3072, 00:30:53.629 "percent": 38 00:30:53.629 } 00:30:53.629 }, 00:30:53.629 "base_bdevs_list": [ 00:30:53.629 { 00:30:53.629 "name": "spare", 00:30:53.629 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:53.629 "is_configured": true, 00:30:53.629 "data_offset": 256, 00:30:53.629 "data_size": 7936 00:30:53.629 }, 00:30:53.629 { 00:30:53.629 "name": "BaseBdev2", 00:30:53.629 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:53.629 "is_configured": true, 00:30:53.629 "data_offset": 256, 00:30:53.629 "data_size": 7936 00:30:53.629 } 00:30:53.629 ] 00:30:53.629 }' 00:30:53.629 06:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:30:53.629 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # local timeout=1076 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < 
timeout )) 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:53.629 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:53.888 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:53.888 "name": "raid_bdev1", 00:30:53.888 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:53.888 "strip_size_kb": 0, 00:30:53.888 "state": "online", 00:30:53.888 "raid_level": "raid1", 00:30:53.888 "superblock": true, 00:30:53.889 "num_base_bdevs": 2, 00:30:53.889 "num_base_bdevs_discovered": 2, 00:30:53.889 "num_base_bdevs_operational": 2, 00:30:53.889 "process": { 00:30:53.889 "type": "rebuild", 00:30:53.889 "target": "spare", 00:30:53.889 "progress": { 00:30:53.889 "blocks": 3840, 00:30:53.889 "percent": 48 00:30:53.889 } 00:30:53.889 }, 00:30:53.889 "base_bdevs_list": [ 00:30:53.889 { 00:30:53.889 "name": "spare", 00:30:53.889 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:53.889 "is_configured": true, 00:30:53.889 "data_offset": 256, 00:30:53.889 "data_size": 7936 00:30:53.889 }, 00:30:53.889 { 00:30:53.889 "name": "BaseBdev2", 00:30:53.889 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:53.889 "is_configured": true, 00:30:53.889 "data_offset": 256, 00:30:53.889 "data_size": 7936 00:30:53.889 } 00:30:53.889 ] 00:30:53.889 }' 00:30:53.889 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:53.889 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:53.889 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:53.889 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:53.889 06:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1 00:30:54.827 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:55.086 "name": "raid_bdev1", 00:30:55.086 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:55.086 "strip_size_kb": 0, 00:30:55.086 "state": "online", 00:30:55.086 "raid_level": "raid1", 00:30:55.086 "superblock": true, 00:30:55.086 "num_base_bdevs": 2, 00:30:55.086 "num_base_bdevs_discovered": 2, 00:30:55.086 "num_base_bdevs_operational": 2, 00:30:55.086 "process": { 00:30:55.086 "type": "rebuild", 00:30:55.086 "target": "spare", 00:30:55.086 "progress": { 00:30:55.086 "blocks": 7168, 00:30:55.086 "percent": 90 00:30:55.086 } 00:30:55.086 }, 00:30:55.086 "base_bdevs_list": [ 00:30:55.086 { 00:30:55.086 "name": "spare", 00:30:55.086 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:55.086 "is_configured": true, 00:30:55.086 "data_offset": 256, 00:30:55.086 "data_size": 7936 00:30:55.086 }, 00:30:55.086 { 00:30:55.086 "name": "BaseBdev2", 00:30:55.086 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:55.086 "is_configured": true, 00:30:55.086 "data_offset": 256, 00:30:55.086 "data_size": 7936 00:30:55.086 } 00:30:55.086 ] 00:30:55.086 }' 00:30:55.086 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:55.346 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:55.346 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:55.346 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:55.346 06:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1 00:30:55.346 [2024-07-25 06:47:08.854116] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:30:55.346 [2024-07-25 06:47:08.854172] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:55.346 [2024-07-25 06:47:08.854249] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:56.282 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.282 
06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:56.540 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:56.540 "name": "raid_bdev1", 00:30:56.540 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:56.540 "strip_size_kb": 0, 00:30:56.540 "state": "online", 00:30:56.540 "raid_level": "raid1", 00:30:56.540 "superblock": true, 00:30:56.540 "num_base_bdevs": 2, 00:30:56.540 "num_base_bdevs_discovered": 2, 00:30:56.540 "num_base_bdevs_operational": 2, 00:30:56.540 "base_bdevs_list": [ 00:30:56.540 { 00:30:56.540 "name": "spare", 00:30:56.540 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:56.540 "is_configured": true, 00:30:56.540 "data_offset": 256, 00:30:56.540 "data_size": 7936 00:30:56.540 }, 00:30:56.540 { 00:30:56.540 "name": "BaseBdev2", 00:30:56.540 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:56.540 "is_configured": true, 00:30:56.540 "data_offset": 256, 00:30:56.540 "data_size": 7936 00:30:56.540 } 00:30:56.540 ] 00:30:56.540 }' 00:30:56.540 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:56.540 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:30:56.540 06:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@724 -- # break 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.540 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:56.799 "name": "raid_bdev1", 00:30:56.799 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:56.799 "strip_size_kb": 0, 00:30:56.799 "state": "online", 00:30:56.799 "raid_level": "raid1", 00:30:56.799 "superblock": true, 00:30:56.799 "num_base_bdevs": 2, 00:30:56.799 "num_base_bdevs_discovered": 2, 00:30:56.799 "num_base_bdevs_operational": 2, 00:30:56.799 "base_bdevs_list": [ 00:30:56.799 { 00:30:56.799 "name": "spare", 00:30:56.799 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:56.799 "is_configured": true, 00:30:56.799 "data_offset": 256, 00:30:56.799 "data_size": 7936 00:30:56.799 }, 00:30:56.799 { 00:30:56.799 "name": "BaseBdev2", 00:30:56.799 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 
00:30:56.799 "is_configured": true, 00:30:56.799 "data_offset": 256, 00:30:56.799 "data_size": 7936 00:30:56.799 } 00:30:56.799 ] 00:30:56.799 }' 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:56.799 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:56.800 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:56.800 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:56.800 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.800 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:57.058 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:57.058 "name": "raid_bdev1", 00:30:57.058 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:57.058 "strip_size_kb": 0, 00:30:57.058 "state": "online", 00:30:57.058 "raid_level": "raid1", 00:30:57.058 "superblock": true, 00:30:57.058 "num_base_bdevs": 2, 00:30:57.058 "num_base_bdevs_discovered": 2, 00:30:57.058 "num_base_bdevs_operational": 2, 00:30:57.058 "base_bdevs_list": [ 00:30:57.058 { 00:30:57.058 "name": "spare", 00:30:57.058 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:57.058 "is_configured": true, 00:30:57.058 "data_offset": 256, 00:30:57.058 "data_size": 7936 00:30:57.058 }, 00:30:57.058 { 00:30:57.058 "name": "BaseBdev2", 00:30:57.058 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:57.058 "is_configured": true, 00:30:57.058 "data_offset": 256, 00:30:57.058 "data_size": 7936 00:30:57.058 } 00:30:57.058 ] 00:30:57.058 }' 00:30:57.058 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:57.058 06:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:57.626 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:57.886 [2024-07-25 06:47:11.300524] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:57.886 [2024-07-25 06:47:11.300547] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:57.886 [2024-07-25 06:47:11.300603] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:57.886 [2024-07-25 06:47:11.300653] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:57.886 [2024-07-25 06:47:11.300663] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x142a2b0 name raid_bdev1, state offline 00:30:57.886 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:57.886 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # jq length 00:30:58.145 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:30:58.145 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@737 -- # '[' false = true ']' 00:30:58.145 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:30:58.145 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:58.797 06:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:58.797 [2024-07-25 06:47:12.206926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:58.797 [2024-07-25 06:47:12.206967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:58.797 [2024-07-25 06:47:12.206987] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1432a10 00:30:58.798 [2024-07-25 06:47:12.206999] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:58.798 [2024-07-25 06:47:12.208407] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:58.798 [2024-07-25 06:47:12.208433] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:58.798 [2024-07-25 06:47:12.208483] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:58.798 [2024-07-25 06:47:12.208506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:58.798 [2024-07-25 06:47:12.208584] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:58.798 spare 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.798 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:58.798 [2024-07-25 06:47:12.308886] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1296f70 00:30:58.798 [2024-07-25 06:47:12.308901] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:58.798 [2024-07-25 06:47:12.308973] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12901e0 00:30:58.798 [2024-07-25 06:47:12.309058] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1296f70 00:30:58.798 [2024-07-25 06:47:12.309067] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1296f70 00:30:58.798 [2024-07-25 06:47:12.309129] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:59.057 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:59.057 "name": "raid_bdev1", 00:30:59.057 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:59.057 "strip_size_kb": 0, 00:30:59.057 "state": "online", 00:30:59.057 "raid_level": "raid1", 00:30:59.057 "superblock": true, 00:30:59.057 "num_base_bdevs": 2, 00:30:59.057 "num_base_bdevs_discovered": 2, 00:30:59.057 "num_base_bdevs_operational": 2, 00:30:59.057 "base_bdevs_list": [ 00:30:59.057 { 00:30:59.057 "name": "spare", 00:30:59.057 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:59.057 "is_configured": true, 00:30:59.057 "data_offset": 256, 00:30:59.057 "data_size": 7936 00:30:59.057 }, 00:30:59.057 { 00:30:59.057 "name": "BaseBdev2", 00:30:59.057 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:59.057 "is_configured": true, 00:30:59.057 "data_offset": 256, 00:30:59.057 "data_size": 7936 00:30:59.057 } 00:30:59.057 ] 00:30:59.057 }' 00:30:59.057 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:59.057 06:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=none 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:59.625 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:59.885 "name": "raid_bdev1", 00:30:59.885 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:30:59.885 "strip_size_kb": 0, 00:30:59.885 "state": "online", 00:30:59.885 "raid_level": "raid1", 00:30:59.885 "superblock": true, 00:30:59.885 "num_base_bdevs": 2, 00:30:59.885 "num_base_bdevs_discovered": 2, 00:30:59.885 "num_base_bdevs_operational": 2, 00:30:59.885 "base_bdevs_list": [ 00:30:59.885 { 00:30:59.885 "name": "spare", 00:30:59.885 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:30:59.885 "is_configured": true, 00:30:59.885 "data_offset": 256, 00:30:59.885 "data_size": 7936 00:30:59.885 }, 00:30:59.885 { 00:30:59.885 "name": "BaseBdev2", 00:30:59.885 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:30:59.885 "is_configured": true, 00:30:59.885 "data_offset": 256, 00:30:59.885 "data_size": 7936 00:30:59.885 } 00:30:59.885 ] 00:30:59.885 }' 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:59.885 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:31:00.143 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:31:00.143 06:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:31:00.709 [2024-07-25 06:47:13.983725] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:00.709 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:00.710 "name": "raid_bdev1", 00:31:00.710 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:00.710 "strip_size_kb": 0, 00:31:00.710 "state": "online", 00:31:00.710 "raid_level": "raid1", 00:31:00.710 "superblock": true, 00:31:00.710 "num_base_bdevs": 2, 00:31:00.710 "num_base_bdevs_discovered": 1, 00:31:00.710 "num_base_bdevs_operational": 1, 00:31:00.710 "base_bdevs_list": [ 00:31:00.710 { 00:31:00.710 "name": null, 00:31:00.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:00.710 "is_configured": false, 00:31:00.710 "data_offset": 256, 00:31:00.710 "data_size": 7936 00:31:00.710 }, 00:31:00.710 { 00:31:00.710 "name": "BaseBdev2", 00:31:00.710 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:00.710 "is_configured": true, 00:31:00.710 "data_offset": 256, 00:31:00.710 "data_size": 7936 00:31:00.710 } 00:31:00.710 ] 00:31:00.710 }' 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:00.710 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:01.277 06:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:31:01.536 [2024-07-25 06:47:15.010452] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:01.536 [2024-07-25 06:47:15.010585] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:31:01.536 [2024-07-25 06:47:15.010600] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:31:01.536 [2024-07-25 06:47:15.010627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:01.536 [2024-07-25 06:47:15.013903] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12901e0 00:31:01.536 [2024-07-25 06:47:15.016057] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:01.536 06:47:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # sleep 1 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:02.914 "name": "raid_bdev1", 00:31:02.914 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:02.914 "strip_size_kb": 0, 00:31:02.914 "state": "online", 00:31:02.914 "raid_level": "raid1", 00:31:02.914 "superblock": true, 00:31:02.914 "num_base_bdevs": 2, 00:31:02.914 "num_base_bdevs_discovered": 2, 00:31:02.914 "num_base_bdevs_operational": 2, 00:31:02.914 "process": { 00:31:02.914 "type": "rebuild", 00:31:02.914 "target": "spare", 00:31:02.914 "progress": { 00:31:02.914 "blocks": 2816, 00:31:02.914 "percent": 35 00:31:02.914 } 00:31:02.914 }, 00:31:02.914 "base_bdevs_list": [ 00:31:02.914 { 00:31:02.914 "name": "spare", 00:31:02.914 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:31:02.914 "is_configured": true, 00:31:02.914 "data_offset": 256, 00:31:02.914 "data_size": 7936 00:31:02.914 }, 00:31:02.914 { 00:31:02.914 "name": "BaseBdev2", 00:31:02.914 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:02.914 "is_configured": true, 00:31:02.914 "data_offset": 256, 00:31:02.914 "data_size": 7936 00:31:02.914 } 00:31:02.914 ] 00:31:02.914 }' 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:02.914 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:03.174 [2024-07-25 06:47:16.516998] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:03.174 [2024-07-25 06:47:16.527089] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:03.174 [2024-07-25 06:47:16.527127] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:03.174 [2024-07-25 06:47:16.527147] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:03.174 [2024-07-25 06:47:16.527154] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:03.174 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:03.434 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:03.434 "name": "raid_bdev1", 00:31:03.434 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:03.434 "strip_size_kb": 0, 00:31:03.434 "state": "online", 00:31:03.434 "raid_level": "raid1", 00:31:03.434 "superblock": true, 00:31:03.434 "num_base_bdevs": 2, 00:31:03.434 "num_base_bdevs_discovered": 1, 00:31:03.434 "num_base_bdevs_operational": 1, 00:31:03.434 "base_bdevs_list": [ 00:31:03.434 { 00:31:03.434 "name": null, 00:31:03.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:03.434 "is_configured": false, 00:31:03.434 "data_offset": 256, 00:31:03.434 "data_size": 7936 00:31:03.434 }, 00:31:03.434 { 00:31:03.434 "name": "BaseBdev2", 00:31:03.434 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:03.434 "is_configured": true, 00:31:03.434 "data_offset": 256, 00:31:03.434 "data_size": 7936 00:31:03.434 } 00:31:03.434 ] 00:31:03.434 }' 00:31:03.434 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:03.434 06:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:04.002 06:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:04.002 [2024-07-25 
06:47:17.544917] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:04.002 [2024-07-25 06:47:17.544962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:04.002 [2024-07-25 06:47:17.544985] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x128ffe0 00:31:04.002 [2024-07-25 06:47:17.544997] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:04.002 [2024-07-25 06:47:17.545167] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:04.002 [2024-07-25 06:47:17.545181] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:04.002 [2024-07-25 06:47:17.545232] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:04.002 [2024-07-25 06:47:17.545242] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:31:04.002 [2024-07-25 06:47:17.545251] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:31:04.002 [2024-07-25 06:47:17.545268] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:04.002 [2024-07-25 06:47:17.548540] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128f7e0 00:31:04.002 [2024-07-25 06:47:17.549870] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:04.002 spare 00:31:04.261 06:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # sleep 1 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.198 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:05.457 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:05.457 "name": "raid_bdev1", 00:31:05.457 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:05.457 "strip_size_kb": 0, 00:31:05.457 "state": "online", 00:31:05.457 "raid_level": "raid1", 00:31:05.457 "superblock": true, 00:31:05.457 "num_base_bdevs": 2, 00:31:05.457 "num_base_bdevs_discovered": 2, 00:31:05.457 "num_base_bdevs_operational": 2, 00:31:05.457 "process": { 00:31:05.457 "type": "rebuild", 00:31:05.457 "target": "spare", 00:31:05.457 "progress": { 00:31:05.457 "blocks": 3072, 00:31:05.457 "percent": 38 00:31:05.457 } 00:31:05.457 }, 00:31:05.457 "base_bdevs_list": [ 00:31:05.457 { 00:31:05.457 "name": "spare", 00:31:05.457 "uuid": "187e1911-7663-58aa-b6a0-3c5cc4691544", 00:31:05.457 "is_configured": true, 00:31:05.457 "data_offset": 256, 00:31:05.457 
"data_size": 7936 00:31:05.458 }, 00:31:05.458 { 00:31:05.458 "name": "BaseBdev2", 00:31:05.458 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:05.458 "is_configured": true, 00:31:05.458 "data_offset": 256, 00:31:05.458 "data_size": 7936 00:31:05.458 } 00:31:05.458 ] 00:31:05.458 }' 00:31:05.458 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:05.458 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:05.458 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:05.458 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:05.458 06:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:05.717 [2024-07-25 06:47:19.102905] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:05.717 [2024-07-25 06:47:19.161550] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:05.717 [2024-07-25 06:47:19.161596] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:05.717 [2024-07-25 06:47:19.161610] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:05.717 [2024-07-25 06:47:19.161617] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.717 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:05.978 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:05.978 "name": "raid_bdev1", 00:31:05.978 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:05.978 "strip_size_kb": 0, 00:31:05.978 "state": "online", 00:31:05.978 
"raid_level": "raid1", 00:31:05.978 "superblock": true, 00:31:05.978 "num_base_bdevs": 2, 00:31:05.978 "num_base_bdevs_discovered": 1, 00:31:05.978 "num_base_bdevs_operational": 1, 00:31:05.978 "base_bdevs_list": [ 00:31:05.978 { 00:31:05.978 "name": null, 00:31:05.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:05.978 "is_configured": false, 00:31:05.978 "data_offset": 256, 00:31:05.978 "data_size": 7936 00:31:05.978 }, 00:31:05.978 { 00:31:05.978 "name": "BaseBdev2", 00:31:05.978 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:05.978 "is_configured": true, 00:31:05.978 "data_offset": 256, 00:31:05.978 "data_size": 7936 00:31:05.978 } 00:31:05.978 ] 00:31:05.978 }' 00:31:05.978 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:05.978 06:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:06.545 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:06.803 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:06.803 "name": "raid_bdev1", 00:31:06.803 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:06.803 "strip_size_kb": 0, 00:31:06.803 "state": "online", 00:31:06.803 "raid_level": "raid1", 00:31:06.803 "superblock": true, 00:31:06.803 "num_base_bdevs": 2, 00:31:06.803 "num_base_bdevs_discovered": 1, 00:31:06.803 "num_base_bdevs_operational": 1, 00:31:06.803 "base_bdevs_list": [ 00:31:06.803 { 00:31:06.803 "name": null, 00:31:06.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:06.803 "is_configured": false, 00:31:06.803 "data_offset": 256, 00:31:06.803 "data_size": 7936 00:31:06.803 }, 00:31:06.803 { 00:31:06.803 "name": "BaseBdev2", 00:31:06.803 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:06.803 "is_configured": true, 00:31:06.803 "data_offset": 256, 00:31:06.803 "data_size": 7936 00:31:06.803 } 00:31:06.803 ] 00:31:06.803 }' 00:31:06.803 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:06.803 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:06.803 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:06.803 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:06.803 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:31:07.369 06:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:31:07.628 [2024-07-25 06:47:21.046063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:31:07.628 [2024-07-25 06:47:21.046106] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:07.628 [2024-07-25 06:47:21.046127] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1291560 00:31:07.628 [2024-07-25 06:47:21.046144] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:07.628 [2024-07-25 06:47:21.046298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:07.628 [2024-07-25 06:47:21.046312] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:07.628 [2024-07-25 06:47:21.046354] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:31:07.628 [2024-07-25 06:47:21.046364] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:31:07.628 [2024-07-25 06:47:21.046373] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:07.628 BaseBdev1 00:31:07.628 06:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@789 -- # sleep 1 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:08.565 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:08.824 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:08.824 "name": "raid_bdev1", 00:31:08.824 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:08.824 "strip_size_kb": 0, 00:31:08.824 "state": "online", 00:31:08.824 "raid_level": "raid1", 00:31:08.824 
"superblock": true, 00:31:08.824 "num_base_bdevs": 2, 00:31:08.824 "num_base_bdevs_discovered": 1, 00:31:08.824 "num_base_bdevs_operational": 1, 00:31:08.824 "base_bdevs_list": [ 00:31:08.824 { 00:31:08.824 "name": null, 00:31:08.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:08.824 "is_configured": false, 00:31:08.824 "data_offset": 256, 00:31:08.824 "data_size": 7936 00:31:08.824 }, 00:31:08.824 { 00:31:08.824 "name": "BaseBdev2", 00:31:08.824 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:08.824 "is_configured": true, 00:31:08.824 "data_offset": 256, 00:31:08.824 "data_size": 7936 00:31:08.824 } 00:31:08.824 ] 00:31:08.824 }' 00:31:08.824 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:08.824 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:09.391 06:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:09.650 "name": "raid_bdev1", 00:31:09.650 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:09.650 "strip_size_kb": 0, 00:31:09.650 "state": "online", 00:31:09.650 "raid_level": "raid1", 00:31:09.650 "superblock": true, 00:31:09.650 "num_base_bdevs": 2, 00:31:09.650 "num_base_bdevs_discovered": 1, 00:31:09.650 "num_base_bdevs_operational": 1, 00:31:09.650 "base_bdevs_list": [ 00:31:09.650 { 00:31:09.650 "name": null, 00:31:09.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:09.650 "is_configured": false, 00:31:09.650 "data_offset": 256, 00:31:09.650 "data_size": 7936 00:31:09.650 }, 00:31:09.650 { 00:31:09.650 "name": "BaseBdev2", 00:31:09.650 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:09.650 "is_configured": true, 00:31:09.650 "data_offset": 256, 00:31:09.650 "data_size": 7936 00:31:09.650 } 00:31:09.650 ] 00:31:09.650 }' 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:09.650 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:31:09.651 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:09.909 [2024-07-25 06:47:23.368251] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:09.909 [2024-07-25 06:47:23.368359] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:31:09.909 [2024-07-25 06:47:23.368374] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:09.909 request: 00:31:09.909 { 00:31:09.909 "base_bdev": "BaseBdev1", 00:31:09.909 "raid_bdev": "raid_bdev1", 00:31:09.909 "method": "bdev_raid_add_base_bdev", 00:31:09.909 "req_id": 1 00:31:09.909 } 00:31:09.909 Got JSON-RPC error response 00:31:09.909 response: 00:31:09.909 { 00:31:09.909 "code": -22, 00:31:09.909 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:31:09.909 } 00:31:09.909 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:31:09.909 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:31:09.909 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:31:09.909 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:31:09.909 06:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@793 -- # sleep 1 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:10.842 06:47:24 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:10.842 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:11.101 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:11.101 "name": "raid_bdev1", 00:31:11.101 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:11.101 "strip_size_kb": 0, 00:31:11.101 "state": "online", 00:31:11.101 "raid_level": "raid1", 00:31:11.101 "superblock": true, 00:31:11.101 "num_base_bdevs": 2, 00:31:11.101 "num_base_bdevs_discovered": 1, 00:31:11.101 "num_base_bdevs_operational": 1, 00:31:11.101 "base_bdevs_list": [ 00:31:11.101 { 00:31:11.101 "name": null, 00:31:11.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.101 "is_configured": false, 00:31:11.101 "data_offset": 256, 00:31:11.101 "data_size": 7936 00:31:11.101 }, 00:31:11.101 { 00:31:11.101 "name": "BaseBdev2", 00:31:11.101 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:11.101 "is_configured": true, 00:31:11.101 "data_offset": 256, 00:31:11.101 "data_size": 7936 00:31:11.101 } 00:31:11.101 ] 00:31:11.101 }' 00:31:11.101 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:11.101 06:47:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:11.701 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:11.987 "name": "raid_bdev1", 00:31:11.987 "uuid": "df12c395-eab8-4b74-a6ab-16d967858746", 00:31:11.987 "strip_size_kb": 0, 00:31:11.987 "state": "online", 00:31:11.987 "raid_level": "raid1", 00:31:11.987 "superblock": true, 00:31:11.987 "num_base_bdevs": 2, 00:31:11.987 "num_base_bdevs_discovered": 1, 00:31:11.987 "num_base_bdevs_operational": 1, 00:31:11.987 "base_bdevs_list": [ 00:31:11.987 { 00:31:11.987 "name": null, 00:31:11.987 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.987 "is_configured": false, 00:31:11.987 "data_offset": 256, 00:31:11.987 "data_size": 7936 00:31:11.987 }, 00:31:11.987 { 00:31:11.987 "name": "BaseBdev2", 00:31:11.987 "uuid": "6f227085-0382-5a41-8dc5-1d0ecf1cb96e", 00:31:11.987 "is_configured": true, 00:31:11.987 "data_offset": 256, 00:31:11.987 "data_size": 7936 00:31:11.987 } 00:31:11.987 ] 00:31:11.987 }' 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@798 -- # killprocess 1285044 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1285044 ']' 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1285044 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1285044 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1285044' 00:31:11.987 killing process with pid 1285044 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1285044 00:31:11.987 Received shutdown signal, test time was about 60.000000 seconds 00:31:11.987 00:31:11.987 Latency(us) 00:31:11.987 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:11.987 =================================================================================================================== 00:31:11.987 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:31:11.987 [2024-07-25 06:47:25.508398] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:11.987 [2024-07-25 06:47:25.508479] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:11.987 
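For reference, the raid-state assertions traced above (verify_raid_bdev_state / verify_raid_bdev_process) reduce to one RPC query filtered with jq. A minimal manual spot-check, assuming an SPDK application is still listening on /var/tmp/spdk-raid.sock and a raid bdev named raid_bdev1 exists, might look like the following sketch; it is illustrative only and not part of the captured output:

  # Query all raid bdevs over the test RPC socket and pull out the fields the test checks:
  # state, raid_level, and the discovered vs. operational base bdev counts.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1")
               | "\(.state) \(.raid_level) \(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational) base bdevs"'

In the run above this would report an online raid1 bdev with 1/1 base bdevs, matching the expectation that the stale BaseBdev1 (superblock seq_number 1 < 5, uuid not in the raid superblock) is rejected by bdev_raid_add_base_bdev with error -22 while the array stays online on BaseBdev2.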
[2024-07-25 06:47:25.508519] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:11.987 [2024-07-25 06:47:25.508529] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1296f70 name raid_bdev1, state offline 00:31:11.987 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1285044 00:31:11.987 [2024-07-25 06:47:25.532484] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:12.245 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@800 -- # return 0 00:31:12.245 00:31:12.245 real 0m29.215s 00:31:12.245 user 0m46.546s 00:31:12.245 sys 0m3.975s 00:31:12.245 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:12.245 06:47:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:12.245 ************************************ 00:31:12.245 END TEST raid_rebuild_test_sb_md_interleaved 00:31:12.245 ************************************ 00:31:12.245 06:47:25 bdev_raid -- bdev/bdev_raid.sh@996 -- # trap - EXIT 00:31:12.245 06:47:25 bdev_raid -- bdev/bdev_raid.sh@997 -- # cleanup 00:31:12.245 06:47:25 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1285044 ']' 00:31:12.245 06:47:25 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1285044 00:31:12.245 06:47:25 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:31:12.503 00:31:12.503 real 17m44.536s 00:31:12.503 user 29m57.310s 00:31:12.503 sys 3m16.779s 00:31:12.503 06:47:25 bdev_raid -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:12.503 06:47:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:12.503 ************************************ 00:31:12.503 END TEST bdev_raid 00:31:12.503 ************************************ 00:31:12.503 06:47:25 -- spdk/autotest.sh@195 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:31:12.503 06:47:25 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:31:12.503 06:47:25 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:12.503 06:47:25 -- common/autotest_common.sh@10 -- # set +x 00:31:12.503 ************************************ 00:31:12.503 START TEST bdevperf_config 00:31:12.503 ************************************ 00:31:12.503 06:47:25 bdevperf_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:31:12.503 * Looking for test storage... 
00:31:12.503 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:12.503 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:12.503 00:31:12.503 06:47:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:12.503 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:31:12.503 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:12.503 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:12.503 06:47:26 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:15.788 06:47:28 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-25 06:47:26.083126] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:15.788 [2024-07-25 06:47:26.083193] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290289 ] 00:31:15.788 Using job config with 4 jobs 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested 
device 0000:3d:02.5 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:15.788 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.788 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:15.788 [2024-07-25 06:47:26.230410] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.788 [2024-07-25 06:47:26.292867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:15.788 cpumask for '\''job0'\'' is too big 00:31:15.789 cpumask for '\''job1'\'' is too big 00:31:15.789 cpumask for '\''job2'\'' is too big 00:31:15.789 cpumask for '\''job3'\'' is too big 00:31:15.789 Running I/O for 2 seconds... 
00:31:15.789 00:31:15.789 Latency(us) 00:31:15.789 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.01 25842.08 25.24 0.00 0.00 9897.33 1690.83 15099.49 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.02 25851.53 25.25 0.00 0.00 9872.87 1677.72 13316.92 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.02 25829.66 25.22 0.00 0.00 9859.97 1677.72 11639.19 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.02 25807.68 25.20 0.00 0.00 9848.98 1677.72 10276.04 00:31:15.789 =================================================================================================================== 00:31:15.789 Total : 103330.94 100.91 0.00 0.00 9869.75 1677.72 15099.49' 00:31:15.789 06:47:28 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-25 06:47:26.083126] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:15.789 [2024-07-25 06:47:26.083193] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290289 ] 00:31:15.789 Using job config with 4 jobs 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of 
QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:15.789 [2024-07-25 06:47:26.230410] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.789 [2024-07-25 06:47:26.292867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:15.789 cpumask for '\''job0'\'' is too big 00:31:15.789 cpumask for '\''job1'\'' is too big 00:31:15.789 cpumask for '\''job2'\'' is too big 00:31:15.789 cpumask for '\''job3'\'' is too big 00:31:15.789 Running I/O for 2 seconds... 
00:31:15.789 00:31:15.789 Latency(us) 00:31:15.789 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.01 25842.08 25.24 0.00 0.00 9897.33 1690.83 15099.49 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.02 25851.53 25.25 0.00 0.00 9872.87 1677.72 13316.92 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.02 25829.66 25.22 0.00 0.00 9859.97 1677.72 11639.19 00:31:15.789 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.789 Malloc0 : 2.02 25807.68 25.20 0.00 0.00 9848.98 1677.72 10276.04 00:31:15.789 =================================================================================================================== 00:31:15.789 Total : 103330.94 100.91 0.00 0.00 9869.75 1677.72 15099.49' 00:31:15.789 06:47:28 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 06:47:26.083126] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:15.789 [2024-07-25 06:47:26.083193] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290289 ] 00:31:15.789 Using job config with 4 jobs 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:31:15.789 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:15.789 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.789 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:15.790 [2024-07-25 06:47:26.230410] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.790 [2024-07-25 06:47:26.292867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:15.790 cpumask for '\''job0'\'' is too big 00:31:15.790 cpumask for '\''job1'\'' is too big 00:31:15.790 cpumask for '\''job2'\'' is too big 00:31:15.790 cpumask for '\''job3'\'' is too big 00:31:15.790 Running I/O for 2 seconds... 
00:31:15.790 00:31:15.790 Latency(us) 00:31:15.790 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:15.790 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.790 Malloc0 : 2.01 25842.08 25.24 0.00 0.00 9897.33 1690.83 15099.49 00:31:15.790 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.790 Malloc0 : 2.02 25851.53 25.25 0.00 0.00 9872.87 1677.72 13316.92 00:31:15.790 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.790 Malloc0 : 2.02 25829.66 25.22 0.00 0.00 9859.97 1677.72 11639.19 00:31:15.790 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:15.790 Malloc0 : 2.02 25807.68 25.20 0.00 0.00 9848.98 1677.72 10276.04 00:31:15.790 =================================================================================================================== 00:31:15.790 Total : 103330.94 100.91 0.00 0.00 9869.75 1677.72 15099.49' 00:31:15.790 06:47:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:15.790 06:47:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:15.790 06:47:28 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:31:15.790 06:47:28 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:15.790 [2024-07-25 06:47:28.730563] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:15.790 [2024-07-25 06:47:28.730626] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290802 ] 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.2 cannot be used 
00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:15.790 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:15.790 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:15.790 [2024-07-25 06:47:28.876477] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.790 [2024-07-25 06:47:28.938704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:15.790 cpumask for 'job0' is too big 00:31:15.790 cpumask for 'job1' is too big 00:31:15.790 cpumask for 'job2' is too big 00:31:15.790 cpumask for 'job3' is too big 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:31:18.327 Running I/O for 2 seconds... 
00:31:18.327 00:31:18.327 Latency(us) 00:31:18.327 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:18.327 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:18.327 Malloc0 : 2.01 25807.35 25.20 0.00 0.00 9914.23 1717.04 15099.49 00:31:18.327 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:18.327 Malloc0 : 2.02 25785.36 25.18 0.00 0.00 9902.26 1690.83 13369.34 00:31:18.327 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:18.327 Malloc0 : 2.02 25826.55 25.22 0.00 0.00 9865.91 1677.72 11691.62 00:31:18.327 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:18.327 Malloc0 : 2.02 25804.59 25.20 0.00 0.00 9853.44 1690.83 10223.62 00:31:18.327 =================================================================================================================== 00:31:18.327 Total : 103223.85 100.80 0.00 0.00 9883.90 1677.72 15099.49' 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:18.327 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:18.327 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:18.327 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:18.327 06:47:31 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:20.864 06:47:33 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-25 06:47:31.386458] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:20.864 [2024-07-25 06:47:31.386522] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291279 ] 00:31:20.864 Using job config with 3 jobs 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 
0000:3f:01.4 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:20.864 [2024-07-25 06:47:31.534405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:20.864 [2024-07-25 06:47:31.597399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:20.864 cpumask for '\''job0'\'' is too big 00:31:20.864 cpumask for '\''job1'\'' is too big 00:31:20.864 cpumask for '\''job2'\'' is too big 00:31:20.864 Running I/O for 2 seconds... 00:31:20.864 00:31:20.864 Latency(us) 00:31:20.864 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:20.864 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.864 Malloc0 : 2.01 34824.85 34.01 0.00 0.00 7346.27 1690.83 10747.90 00:31:20.864 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.864 Malloc0 : 2.02 34795.17 33.98 0.00 0.00 7337.45 1658.06 9070.18 00:31:20.864 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.864 Malloc0 : 2.02 34765.54 33.95 0.00 0.00 7328.41 1664.61 7654.60 00:31:20.864 =================================================================================================================== 00:31:20.864 Total : 104385.56 101.94 0.00 0.00 7337.38 1658.06 10747.90' 00:31:20.864 06:47:33 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-25 06:47:31.386458] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:31:20.864 [2024-07-25 06:47:31.386522] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291279 ] 00:31:20.864 Using job config with 3 jobs 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:20.864 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.864 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:20.865 [2024-07-25 06:47:31.534405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:20.865 [2024-07-25 06:47:31.597399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:20.865 cpumask for '\''job0'\'' is too big 00:31:20.865 cpumask for '\''job1'\'' is too big 00:31:20.865 cpumask for '\''job2'\'' is too big 00:31:20.865 Running I/O for 2 seconds... 00:31:20.865 00:31:20.865 Latency(us) 00:31:20.865 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:20.865 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.865 Malloc0 : 2.01 34824.85 34.01 0.00 0.00 7346.27 1690.83 10747.90 00:31:20.865 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.865 Malloc0 : 2.02 34795.17 33.98 0.00 0.00 7337.45 1658.06 9070.18 00:31:20.865 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.865 Malloc0 : 2.02 34765.54 33.95 0.00 0.00 7328.41 1664.61 7654.60 00:31:20.865 =================================================================================================================== 00:31:20.865 Total : 104385.56 101.94 0.00 0.00 7337.38 1658.06 10747.90' 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 06:47:31.386458] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:31:20.865 [2024-07-25 06:47:31.386522] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291279 ] 00:31:20.865 Using job config with 3 jobs 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:20.865 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:20.865 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:20.865 [2024-07-25 06:47:31.534405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:20.865 [2024-07-25 06:47:31.597399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:20.865 cpumask for '\''job0'\'' is too big 00:31:20.865 cpumask for '\''job1'\'' is too big 00:31:20.865 cpumask for '\''job2'\'' is too big 00:31:20.865 Running I/O for 2 seconds... 00:31:20.865 00:31:20.865 Latency(us) 00:31:20.865 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:20.865 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.865 Malloc0 : 2.01 34824.85 34.01 0.00 0.00 7346.27 1690.83 10747.90 00:31:20.865 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.865 Malloc0 : 2.02 34795.17 33.98 0.00 0.00 7337.45 1658.06 9070.18 00:31:20.865 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:20.865 Malloc0 : 2.02 34765.54 33.95 0.00 0.00 7328.41 1664.61 7654.60 00:31:20.865 =================================================================================================================== 00:31:20.865 Total : 104385.56 101.94 0.00 0.00 7337.38 1658.06 10747.90' 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:31:20.865 06:47:33 bdevperf_config 
-- bdevperf/common.sh@13 -- # cat 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:20.865 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:20.865 06:47:33 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:20.866 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:20.866 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:20.866 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:20.866 00:31:20.866 06:47:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:20.866 06:47:34 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:23.402 06:47:36 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-25 06:47:34.058970] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:31:23.402 [2024-07-25 06:47:34.059033] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291611 ] 00:31:23.402 Using job config with 4 jobs 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:23.402 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.402 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:23.403 [2024-07-25 06:47:34.212787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:23.403 [2024-07-25 06:47:34.277258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:23.403 cpumask for '\''job0'\'' is too big 00:31:23.403 cpumask for '\''job1'\'' is too big 00:31:23.403 cpumask for '\''job2'\'' is too big 00:31:23.403 cpumask for '\''job3'\'' is too big 00:31:23.403 Running I/O for 2 seconds... 00:31:23.403 00:31:23.403 Latency(us) 00:31:23.403 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.03 12835.00 12.53 0.00 0.00 19929.19 3512.73 30618.42 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.04 12823.93 12.52 0.00 0.00 19928.47 4272.95 30618.42 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.04 12813.13 12.51 0.00 0.00 19877.80 3486.52 27053.26 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.04 12802.08 12.50 0.00 0.00 19879.49 4325.38 27053.26 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.04 12791.35 12.49 0.00 0.00 19831.20 3460.30 23592.96 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.04 12780.44 12.48 0.00 0.00 19830.32 4272.95 23592.96 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.04 12769.67 12.47 0.00 0.00 19781.24 3460.30 20552.09 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.05 12758.55 12.46 0.00 0.00 19781.43 4272.95 20552.09 00:31:23.403 =================================================================================================================== 00:31:23.403 Total : 102374.16 99.97 0.00 0.00 19854.89 3460.30 30618.42' 00:31:23.403 06:47:36 bdevperf_config -- 
bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-25 06:47:34.058970] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:23.403 [2024-07-25 06:47:34.059033] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291611 ] 00:31:23.403 Using job config with 4 jobs 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: 
Requested device 0000:3f:01.5 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:23.403 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.403 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:23.403 [2024-07-25 06:47:34.212787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:23.403 [2024-07-25 06:47:34.277258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:23.403 cpumask for '\''job0'\'' is too big 00:31:23.403 cpumask for '\''job1'\'' is too big 00:31:23.403 cpumask for '\''job2'\'' is too big 00:31:23.403 cpumask for '\''job3'\'' is too big 00:31:23.403 Running I/O for 2 seconds... 
00:31:23.403 00:31:23.403 Latency(us) 00:31:23.403 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.03 12835.00 12.53 0.00 0.00 19929.19 3512.73 30618.42 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.04 12823.93 12.52 0.00 0.00 19928.47 4272.95 30618.42 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.04 12813.13 12.51 0.00 0.00 19877.80 3486.52 27053.26 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.04 12802.08 12.50 0.00 0.00 19879.49 4325.38 27053.26 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.04 12791.35 12.49 0.00 0.00 19831.20 3460.30 23592.96 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.04 12780.44 12.48 0.00 0.00 19830.32 4272.95 23592.96 00:31:23.403 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc0 : 2.04 12769.67 12.47 0.00 0.00 19781.24 3460.30 20552.09 00:31:23.403 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.403 Malloc1 : 2.05 12758.55 12.46 0.00 0.00 19781.43 4272.95 20552.09 00:31:23.403 =================================================================================================================== 00:31:23.403 Total : 102374.16 99.97 0.00 0.00 19854.89 3460.30 30618.42' 00:31:23.403 06:47:36 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:23.403 06:47:36 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 06:47:34.058970] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:31:23.404 [2024-07-25 06:47:34.059033] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291611 ] 00:31:23.404 Using job config with 4 jobs 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested 
device 0000:3f:01.6 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:23.404 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.404 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:23.404 [2024-07-25 06:47:34.212787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:23.404 [2024-07-25 06:47:34.277258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:23.404 cpumask for '\''job0'\'' is too big 00:31:23.404 cpumask for '\''job1'\'' is too big 00:31:23.404 cpumask for '\''job2'\'' is too big 00:31:23.404 cpumask for '\''job3'\'' is too big 00:31:23.404 Running I/O for 2 seconds... 00:31:23.404 00:31:23.404 Latency(us) 00:31:23.404 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:23.404 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc0 : 2.03 12835.00 12.53 0.00 0.00 19929.19 3512.73 30618.42 00:31:23.404 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc1 : 2.04 12823.93 12.52 0.00 0.00 19928.47 4272.95 30618.42 00:31:23.404 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc0 : 2.04 12813.13 12.51 0.00 0.00 19877.80 3486.52 27053.26 00:31:23.404 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc1 : 2.04 12802.08 12.50 0.00 0.00 19879.49 4325.38 27053.26 00:31:23.404 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc0 : 2.04 12791.35 12.49 0.00 0.00 19831.20 3460.30 23592.96 00:31:23.404 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc1 : 2.04 12780.44 12.48 0.00 0.00 19830.32 4272.95 23592.96 00:31:23.404 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc0 : 2.04 12769.67 12.47 0.00 0.00 19781.24 3460.30 20552.09 00:31:23.404 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:23.404 Malloc1 : 2.05 12758.55 12.46 0.00 0.00 19781.43 4272.95 20552.09 00:31:23.404 =================================================================================================================== 00:31:23.404 Total : 102374.16 99.97 0.00 0.00 19854.89 3460.30 30618.42' 00:31:23.404 06:47:36 bdevperf_config -- 
bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:23.404 06:47:36 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:31:23.404 06:47:36 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:31:23.404 06:47:36 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:23.404 06:47:36 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:31:23.404 00:31:23.404 real 0m10.807s 00:31:23.404 user 0m9.527s 00:31:23.404 sys 0m1.147s 00:31:23.404 06:47:36 bdevperf_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:23.404 06:47:36 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:31:23.404 ************************************ 00:31:23.404 END TEST bdevperf_config 00:31:23.404 ************************************ 00:31:23.404 06:47:36 -- spdk/autotest.sh@196 -- # uname -s 00:31:23.404 06:47:36 -- spdk/autotest.sh@196 -- # [[ Linux == Linux ]] 00:31:23.404 06:47:36 -- spdk/autotest.sh@197 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:23.404 06:47:36 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:31:23.404 06:47:36 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:23.404 06:47:36 -- common/autotest_common.sh@10 -- # set +x 00:31:23.404 ************************************ 00:31:23.404 START TEST reactor_set_interrupt 00:31:23.404 ************************************ 00:31:23.404 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:23.404 * Looking for test storage... 00:31:23.404 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
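For reference, the directory resolution traced at interrupt_common.sh@5-@6 just above reduces to the following sketch. The variable names and resulting paths are taken directly from the trace; the exact quoting used by the real script is an assumption, since only the expanded commands appear in the log:

    # minimal sketch of the testdir/rootdir setup seen in the trace above (quoting assumed)
    testdir=$(readlink -f "$(dirname "$0")")    # -> /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
    rootdir=$(readlink -f "$testdir/../..")     # -> /var/jenkins/workspace/crypto-phy-autotest/spdk

As the trace continues below, interrupt_common.sh@7 then sources test/common/autotest_common.sh relative to this rootdir.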
00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:23.404 06:47:36 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:23.404 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:23.404 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:31:23.404 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:23.404 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:23.404 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:23.405 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:23.405 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:23.405 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@23 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:23.405 06:47:36 
reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:23.405 06:47:36 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:23.405 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 
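The path layout pinned down at applications.sh@8-@11 in this run is summarized below. The values are copied from the traced assignments; the assignment form itself is a summary, since the log only shows the expanded results:

    # directories applications.sh establishes for this run (values from the trace)
    _root=/var/jenkins/workspace/crypto-phy-autotest/spdk        # repository root
    _app_dir=$_root/build/bin                                    # spdk_tgt, nvmf_tgt, iscsi_tgt, vhost, spdk_dd
    _test_app_dir=$_root/test/app                                # fuzz helper apps

The *_APP arrays defined in the trace that follows (VHOST_FUZZ_APP, ISCSI_APP, NVMF_APP, VHOST_APP, DD_APP, SPDK_APP) all point into these directories, and the config.h check at applications.sh@22-@24 then gates the debug-only app handling on SPDK_CONFIG_DEBUG being defined; what it appends in that case is not visible in this trace.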
00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:23.405 06:47:36 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:31:23.405 #define SPDK_CONFIG_H 00:31:23.405 #define SPDK_CONFIG_APPS 1 00:31:23.405 #define SPDK_CONFIG_ARCH native 00:31:23.405 #undef SPDK_CONFIG_ASAN 00:31:23.405 #undef SPDK_CONFIG_AVAHI 00:31:23.405 #undef SPDK_CONFIG_CET 00:31:23.405 #define SPDK_CONFIG_COVERAGE 1 00:31:23.405 #define SPDK_CONFIG_CROSS_PREFIX 00:31:23.405 #define SPDK_CONFIG_CRYPTO 1 00:31:23.405 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:23.405 #undef SPDK_CONFIG_CUSTOMOCF 00:31:23.405 #undef SPDK_CONFIG_DAOS 00:31:23.406 #define SPDK_CONFIG_DAOS_DIR 00:31:23.406 #define SPDK_CONFIG_DEBUG 1 00:31:23.406 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:23.406 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:31:23.406 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:31:23.406 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:23.406 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:23.406 #undef SPDK_CONFIG_DPDK_UADK 00:31:23.406 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:23.406 #define SPDK_CONFIG_EXAMPLES 1 00:31:23.406 #undef SPDK_CONFIG_FC 00:31:23.406 #define SPDK_CONFIG_FC_PATH 00:31:23.406 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:23.406 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:23.406 #undef SPDK_CONFIG_FUSE 00:31:23.406 #undef SPDK_CONFIG_FUZZER 00:31:23.406 #define SPDK_CONFIG_FUZZER_LIB 00:31:23.406 #undef SPDK_CONFIG_GOLANG 00:31:23.406 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:23.406 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:23.406 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:23.406 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:23.406 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:23.406 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:23.406 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:23.406 #define SPDK_CONFIG_IDXD 1 00:31:23.406 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:23.406 #define SPDK_CONFIG_IPSEC_MB 1 00:31:23.406 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:31:23.406 #define SPDK_CONFIG_ISAL 1 00:31:23.406 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:23.406 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:23.406 #define SPDK_CONFIG_LIBDIR 00:31:23.406 #undef SPDK_CONFIG_LTO 00:31:23.406 #define SPDK_CONFIG_MAX_LCORES 128 00:31:23.406 #define SPDK_CONFIG_NVME_CUSE 1 00:31:23.406 #undef SPDK_CONFIG_OCF 
00:31:23.406 #define SPDK_CONFIG_OCF_PATH 00:31:23.406 #define SPDK_CONFIG_OPENSSL_PATH 00:31:23.406 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:23.406 #define SPDK_CONFIG_PGO_DIR 00:31:23.406 #undef SPDK_CONFIG_PGO_USE 00:31:23.406 #define SPDK_CONFIG_PREFIX /usr/local 00:31:23.406 #undef SPDK_CONFIG_RAID5F 00:31:23.406 #undef SPDK_CONFIG_RBD 00:31:23.406 #define SPDK_CONFIG_RDMA 1 00:31:23.406 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:23.406 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:23.406 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:23.406 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:23.406 #define SPDK_CONFIG_SHARED 1 00:31:23.406 #undef SPDK_CONFIG_SMA 00:31:23.406 #define SPDK_CONFIG_TESTS 1 00:31:23.406 #undef SPDK_CONFIG_TSAN 00:31:23.406 #define SPDK_CONFIG_UBLK 1 00:31:23.406 #define SPDK_CONFIG_UBSAN 1 00:31:23.406 #undef SPDK_CONFIG_UNIT_TESTS 00:31:23.406 #undef SPDK_CONFIG_URING 00:31:23.406 #define SPDK_CONFIG_URING_PATH 00:31:23.406 #undef SPDK_CONFIG_URING_ZNS 00:31:23.406 #undef SPDK_CONFIG_USDT 00:31:23.406 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:23.406 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:23.406 #undef SPDK_CONFIG_VFIO_USER 00:31:23.406 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:23.406 #define SPDK_CONFIG_VHOST 1 00:31:23.406 #define SPDK_CONFIG_VIRTIO 1 00:31:23.406 #undef SPDK_CONFIG_VTUNE 00:31:23.406 #define SPDK_CONFIG_VTUNE_DIR 00:31:23.406 #define SPDK_CONFIG_WERROR 1 00:31:23.406 #define SPDK_CONFIG_WPDK_DIR 00:31:23.406 #undef SPDK_CONFIG_XNVME 00:31:23.406 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:23.406 06:47:36 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:23.406 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:23.406 06:47:36 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:23.406 06:47:36 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:23.406 06:47:36 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:23.406 06:47:36 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:23.406 06:47:36 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:23.406 06:47:36 reactor_set_interrupt -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:23.406 06:47:36 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:31:23.406 06:47:36 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:23.406 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:31:23.406 06:47:36 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... 
!= QEMU ]] 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:23.668 06:47:36 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:31:23.668 06:47:36 
reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:31:23.668 06:47:36 reactor_set_interrupt 
-- common/autotest_common.sh@128 -- # : 1 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : v23.11 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:31:23.668 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 1 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- 
common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@173 -- # : 0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@180 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@202 -- # cat 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:23.669 06:47:36 reactor_set_interrupt -- 
common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@265 -- # export valgrind= 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@265 -- # valgrind= 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@271 -- # uname -s 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@281 -- # MAKE=make 
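The exports traced just above configure the sanitizer runtime for the test run: a LeakSanitizer suppression file is regenerated and the ASAN/UBSAN option strings are pinned before anything is launched. A minimal standalone sketch of that step, with the option strings and paths copied from the log; the helper name setup_sanitizer_env and the exact suppression-file contents are illustrative, the real logic lives in common/autotest_common.sh.

#!/usr/bin/env bash
# Sketch only: mirrors the suppression-file and sanitizer-option setup traced above.
setup_sanitizer_env() {
    local suppression_file=/var/tmp/asan_suppression_file
    rm -rf "$suppression_file"
    # Suppress the known libfuse3 leak so LeakSanitizer does not fail the run.
    echo "leak:libfuse3.so" > "$suppression_file"
    export LSAN_OPTIONS="suppressions=$suppression_file"
    export ASAN_OPTIONS="new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0"
    export UBSAN_OPTIONS="halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134"
}
setup_sanitizer_env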
00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j112 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@301 -- # TEST_MODE= 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@320 -- # [[ -z 1292173 ]] 00:31:23.669 06:47:36 reactor_set_interrupt -- common/autotest_common.sh@320 -- # kill -0 1292173 00:31:23.669 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:31:23.669 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:31:23.669 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:31:23.669 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local mount target_dir 00:31:23.669 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:31:23.669 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.RdzxQN 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.RdzxQN/tests/interrupt /tmp/spdk.RdzxQN 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@329 -- # df -T 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 
00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=954302464 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4330127360 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=53046001664 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=61742305280 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=8696303616 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=30866341888 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871150592 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4808704 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=12338565120 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=12348461056 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=9895936 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=30870024192 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871154688 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=1130496 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=6174224384 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=6174228480 00:31:23.670 06:47:37 reactor_set_interrupt -- 
common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:31:23.670 * Looking for test storage... 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@370 -- # local target_space new_size 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@374 -- # mount=/ 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@376 -- # target_space=53046001664 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@383 -- # new_size=10910896128 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.670 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@391 -- # return 0 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:31:23.670 06:47:37 reactor_set_interrupt -- 
common/autotest_common.sh@29 -- # exec 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:31:23.670 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:31:23.670 06:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1292215 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:23.671 06:47:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1292215 /var/tmp/spdk.sock 00:31:23.671 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1292215 ']' 00:31:23.671 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:23.671 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:23.671 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting 
for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:23.671 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:23.671 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:23.671 06:47:37 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:23.671 [2024-07-25 06:47:37.098692] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:23.671 [2024-07-25 06:47:37.098753] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1292215 ] 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:31:23.671 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:23.671 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:23.671 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:23.931 [2024-07-25 06:47:37.231983] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:23.931 [2024-07-25 06:47:37.276824] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:23.931 [2024-07-25 06:47:37.276918] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:23.931 [2024-07-25 06:47:37.276921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:23.931 [2024-07-25 06:47:37.339608] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
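Up to this point the log shows start_intr_tgt bringing the interrupt_tgt example application up on cores 0-2 and waiting for its RPC socket; the repeated QAT "maximum number of QAT devices" and EAL "cannot be used" messages are probe noise from offering more crypto devices than the target will claim. A condensed sketch of that startup, using the binary path, CPU mask, flags and RPC socket from the log; the simple socket-polling loop is a stand-in for the real waitforlisten helper, and the trap is simplified.

#!/usr/bin/env bash
# Sketch of the traced start_intr_tgt step (paths and flags copied from the log).
spdk_root=/var/jenkins/workspace/crypto-phy-autotest/spdk
rpc_sock=/var/tmp/spdk.sock
"$spdk_root/build/examples/interrupt_tgt" -m 0x07 -r "$rpc_sock" -E -g &
intr_tgt_pid=$!
trap 'kill "$intr_tgt_pid"' SIGINT SIGTERM EXIT
# Wait until the target has created its UNIX-domain RPC socket before
# issuing any rpc.py calls against it.
until [ -S "$rpc_sock" ]; do sleep 0.1; done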
00:31:24.501 06:47:38 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:24.501 06:47:38 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:31:24.501 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:31:24.501 06:47:38 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:24.760 Malloc0 00:31:24.760 Malloc1 00:31:24.760 Malloc2 00:31:24.760 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:31:24.760 06:47:38 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:31:24.760 06:47:38 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:24.760 06:47:38 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:25.020 5000+0 records in 00:31:25.020 5000+0 records out 00:31:25.020 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0269606 s, 380 MB/s 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:25.020 AIO0 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1292215 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1292215 without_thd 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1292215 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:25.020 06:47:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@59 -- # 
jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:25.279 06:47:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:31:25.538 spdk_thread ids are 1 on reactor0. 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292215 0 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292215 0 idle 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:25.538 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:25.796 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292215 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.33 reactor_0' 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292215 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.33 reactor_0 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292215 1 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292215 1 idle 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 
00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292218 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_1' 00:31:25.797 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292218 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_1 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292215 2 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292215 2 idle 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:26.055 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292219 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_2' 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292219 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_2 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:26.056 06:47:39 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:31:26.056 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:31:26.314 [2024-07-25 06:47:39.757842] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:26.314 06:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:26.572 [2024-07-25 06:47:39.997396] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:31:26.572 [2024-07-25 06:47:39.997729] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:26.572 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:26.831 [2024-07-25 06:47:40.229393] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
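The reactor_is_busy_or_idle probes traced above take one batch sample from top, pick out the reactor thread's %CPU column, and compare it against fixed thresholds; in this run reactor_0 showed 0.0% while in interrupt mode and 99.9% once interrupt mode was disabled. A compact sketch of that check under the thresholds visible in the log (busy means at least 70%, idle means at most 30%); the wrapper function and its integer truncation of the rate are illustrative.

#!/usr/bin/env bash
# Sketch of the busy/idle probe traced above: sample top once in batch mode,
# grab the reactor thread's %CPU column and compare it to the log's thresholds.
reactor_check() {
    local pid=$1 idx=$2 state=$3   # state is "busy" or "idle"
    local cpu_rate
    cpu_rate=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx" \
               | sed -e 's/^\s*//g' | awk '{print $9}')
    cpu_rate=${cpu_rate%.*}        # 99.9 -> 99, 0.0 -> 0
    if [ "$state" = busy ]; then
        [ "${cpu_rate:-0}" -ge 70 ]
    else
        [ "${cpu_rate:-0}" -le 30 ]
    fi
}
# e.g. a busy check on reactor 0 succeeded above once reactor 0 left interrupt
# mode and its poller thread ran at ~99.9% CPU.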
00:31:26.831 [2024-07-25 06:47:40.229525] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1292215 0 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1292215 0 busy 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:26.831 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292215 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.74 reactor_0' 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292215 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.74 reactor_0 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1292215 2 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1292215 2 busy 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # 
top_reactor='1292219 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.36 reactor_2' 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292219 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.36 reactor_2 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:27.090 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:27.349 [2024-07-25 06:47:40.817382] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:31:27.349 [2024-07-25 06:47:40.817486] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1292215 2 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292215 2 idle 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:27.349 06:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292219 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.58 reactor_2' 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292219 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.58 reactor_2 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 
idle = \i\d\l\e ]] 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:27.608 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:27.867 [2024-07-25 06:47:41.229376] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:27.867 [2024-07-25 06:47:41.229497] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:27.867 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:31:27.867 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:31:27.867 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:31:28.126 [2024-07-25 06:47:41.457727] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1292215 0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292215 0 idle 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292215 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292215 -w 256 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292215 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:01.56 reactor_0' 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292215 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:01.56 reactor_0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:28.126 
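For reference, the reactor_is_busy_or_idle probe that the trace keeps repeating reduces to sampling one batch iteration of top for the reactor thread and comparing the CPU column against fixed thresholds; busy expects at least 70%, idle expects at most 30%, per the [[ 99 -lt 70 ]] and [[ 0 -gt 30 ]] tests above. A minimal stand-alone sketch reconstructed from those commands follows (function names are illustrative, not the harness's own, and the real helper retries up to 10 top samples, which is omitted here):

reactor_cpu_rate() {                          # CPU% of the reactor_<idx> thread inside process <pid>
    local pid=$1 idx=$2
    top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}" \
        | sed -e 's/^\s*//g' | awk '{print $9}'
}

check_reactor() {                             # usage: check_reactor <pid> <idx> busy|idle
    local pid=$1 idx=$2 state=$3 rate
    rate=$(reactor_cpu_rate "$pid" "$idx")
    rate=${rate%%.*}                          # 99.9 -> 99, 0.0 -> 0, as the trace does
    [[ $state == busy ]] && (( rate < 70 )) && return 1
    [[ $state == idle ]] && (( rate > 30 )) && return 1
    return 0
}

In the run above, check_reactor 1292215 0 busy would pass while the reactor is in poll mode (99.9% CPU) and check_reactor 1292215 0 idle would pass after it is switched back to interrupt mode (0.0% CPU).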
06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:31:28.126 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1292215 00:31:28.126 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1292215 ']' 00:31:28.126 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1292215 00:31:28.126 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:31:28.126 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:28.126 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1292215 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1292215' 00:31:28.393 killing process with pid 1292215 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1292215 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1292215 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1293081 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:28.393 06:47:41 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1293081 /var/tmp/spdk.sock 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1293081 ']' 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:28.393 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:28.393 06:47:41 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:28.393 [2024-07-25 06:47:41.938934] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
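The killprocess step just above is the harness's generic teardown: check that the pid is set, confirm it is still alive with kill -0, verify the process name so a stale pid cannot take down something unrelated (and refuse to signal a bare sudo), then kill it and wait for it to exit. A simplified sketch of that sequence, taken from the trace (the uname test in the trace selects this Linux ps invocation; other OS branches are omitted):

killprocess() {                               # usage: killprocess <pid>
    local pid=$1 process_name
    [ -z "$pid" ] && return 1                 # no pid recorded, nothing to kill
    kill -0 "$pid" 2>/dev/null || return 0    # already gone
    process_name=$(ps --no-headers -o comm= "$pid")
    [ "$process_name" = sudo ] && return 1    # never signal a bare sudo wrapper
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"                               # works because the target was launched by this shell
}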
00:31:28.393 [2024-07-25 06:47:41.938996] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293081 ] 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.6 cannot be 
used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:28.703 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.703 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:28.704 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.704 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:28.704 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.704 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:28.704 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.704 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:28.704 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:28.704 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:28.704 [2024-07-25 06:47:42.070692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:28.704 [2024-07-25 06:47:42.116445] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:28.704 [2024-07-25 06:47:42.116541] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:28.704 [2024-07-25 06:47:42.116545] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:28.704 [2024-07-25 06:47:42.179507] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
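start_intr_tgt, whose startup output ends here, launches the example interrupt target on a three-core mask and then blocks until the RPC socket answers. In the sketch below only the launch command, the trap, and the retry budget of 100 are taken from the trace; the polling loop is an assumption standing in for the harness's waitforlisten helper:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC_SOCK=/var/tmp/spdk.sock

"$SPDK/build/examples/interrupt_tgt" -m 0x07 -r "$RPC_SOCK" -E -g &
intr_tgt_pid=$!
trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT

for ((i = 0; i < 100; i++)); do               # max_retries=100, as in the trace
    "$SPDK/scripts/rpc.py" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1 && break
    sleep 0.1                                 # assumed pacing; the real helper is more elaborate
done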
00:31:28.963 06:47:42 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:28.963 06:47:42 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:31:28.963 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:31:28.963 06:47:42 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:29.222 Malloc0 00:31:29.222 Malloc1 00:31:29.222 Malloc2 00:31:29.222 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:31:29.222 06:47:42 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:31:29.222 06:47:42 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:29.222 06:47:42 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:29.222 5000+0 records in 00:31:29.222 5000+0 records out 00:31:29.222 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0156923 s, 653 MB/s 00:31:29.222 06:47:42 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:29.481 AIO0 00:31:29.481 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1293081 00:31:29.481 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1293081 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1293081 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:29.482 06:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:30.050 06:47:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:31:30.309 spdk_thread ids are 1 on reactor0. 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1293081 0 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1293081 0 idle 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293081 root 20 0 128.2g 35840 22400 S 6.7 0.1 0:00.32 reactor_0' 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293081 root 20 0 128.2g 35840 22400 S 6.7 0.1 0:00.32 reactor_0 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1293081 1 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1293081 1 idle 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:31:30.309 06:47:43 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:30.309 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293085 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_1' 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293085 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_1 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1293081 2 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1293081 2 idle 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:30.568 06:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293086 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_2' 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293086 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.00 reactor_2 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
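The thread_get_stats / jq pair executed a few records earlier is how the test maps a reactor's cpumask to the spdk_thread ids pinned on it; in this run it found thread id 1 on reactor 0 (the app_thread) and nothing yet on reactor 2. The same query as a stand-alone helper, assembled from the commands in the trace:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

reactor_get_thread_ids() {                    # usage: reactor_get_thread_ids 0x1
    local reactor_cpumask=$1
    reactor_cpumask=${reactor_cpumask#0x}     # 0x1 -> 1, 0x4 -> 4, matching the form thread_get_stats reports
    "$SPDK/scripts/rpc.py" thread_get_stats \
        | jq --arg reactor_cpumask "$reactor_cpumask" \
             '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
}

thd0_ids=($(reactor_get_thread_ids 0x1))      # -> "1": app_thread sits on reactor 0
thd2_ids=($(reactor_get_thread_ids 0x4))      # -> empty at this point in the run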
00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:31:30.827 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:31.086 [2024-07-25 06:47:44.385283] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:31:31.086 [2024-07-25 06:47:44.385471] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:31:31.086 [2024-07-25 06:47:44.385606] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:31.086 [2024-07-25 06:47:44.613685] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:31:31.086 [2024-07-25 06:47:44.613855] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1293081 0 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1293081 0 busy 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:31.086 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293081 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.73 reactor_0' 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293081 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.73 reactor_0 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:31.345 06:47:44 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1293081 2 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1293081 2 busy 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:31.345 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293086 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.36 reactor_2' 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293086 root 20 0 128.2g 35840 22400 R 99.9 0.1 0:00.36 reactor_2 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:31.603 06:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:31.861 [2024-07-25 06:47:45.199359] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
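Every mode flip in this test goes through a single RPC carried by the test's interrupt_plugin: the -d form switches a reactor from interrupt to poll mode, and omitting it switches back, with the effect visible in the probes above as roughly 99.9% CPU (poll) versus 0.0% (interrupt). Collected from the trace into one place (rpc.py must be able to import interrupt_plugin, which the harness arranges before this point):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin"

$RPC reactor_set_interrupt_mode 0 -d          # reactor 0: interrupt -> poll mode ("disable interrupt")
$RPC reactor_set_interrupt_mode 2 -d          # reactor 2: interrupt -> poll mode
$RPC reactor_set_interrupt_mode 2             # reactor 2: back to interrupt mode
$RPC reactor_set_interrupt_mode 0             # reactor 0: back to interrupt mode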
00:31:31.861 [2024-07-25 06:47:45.199457] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1293081 2 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1293081 2 idle 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293086 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.58 reactor_2' 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293086 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:00.58 reactor_2 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:31.861 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:32.119 [2024-07-25 06:47:45.612407] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:32.119 [2024-07-25 06:47:45.612594] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:31:32.119 [2024-07-25 06:47:45.612616] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1293081 0 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1293081 0 idle 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1293081 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:32.119 06:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1293081 -w 256 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1293081 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:01.54 reactor_0' 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1293081 root 20 0 128.2g 35840 22400 S 0.0 0.1 0:01.54 reactor_0 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:31:32.377 06:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1293081 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1293081 ']' 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1293081 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1293081 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1293081' 00:31:32.377 killing process with pid 1293081 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1293081 00:31:32.377 06:47:45 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1293081 00:31:32.635 06:47:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:31:32.635 06:47:46 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:32.635 00:31:32.636 real 0m9.313s 00:31:32.636 user 0m9.002s 00:31:32.636 sys 0m2.130s 00:31:32.636 06:47:46 reactor_set_interrupt -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:32.636 06:47:46 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:32.636 ************************************ 00:31:32.636 END TEST reactor_set_interrupt 00:31:32.636 ************************************ 00:31:32.636 06:47:46 -- spdk/autotest.sh@198 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:32.636 06:47:46 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:31:32.636 06:47:46 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:32.636 06:47:46 -- common/autotest_common.sh@10 -- # set +x 00:31:32.636 ************************************ 00:31:32.636 START TEST reap_unregistered_poller 00:31:32.636 ************************************ 00:31:32.636 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:32.895 * Looking for test storage... 00:31:32.896 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
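The rm -f in the cleanup recorded above is the counterpart of setup_bdev_aio from earlier in this run: the harness builds a small AIO bdev on top of a scratch file (skipped on FreeBSD, per the uname check in the trace) and removes the file on teardown. Both halves, as they appear in the trace:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
AIOFILE=$SPDK/test/interrupt/aiofile

dd if=/dev/zero of="$AIOFILE" bs=2048 count=5000              # 10 MB of zeroes for the backing file
"$SPDK/scripts/rpc.py" bdev_aio_create "$AIOFILE" AIO0 2048   # expose it as bdev AIO0 with 2048-byte blocks
# ... test body ...
rm -f "$AIOFILE"                                              # cleanup, as in the teardown above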
00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:32.896 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:32.896 06:47:46 
reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:32.896 06:47:46 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:32.896 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:32.896 06:47:46 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:32.896 06:47:46 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:32.896 06:47:46 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:32.896 06:47:46 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 
00:31:32.896 06:47:46 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:31:32.897 #define SPDK_CONFIG_H 00:31:32.897 #define SPDK_CONFIG_APPS 1 00:31:32.897 #define SPDK_CONFIG_ARCH native 00:31:32.897 #undef SPDK_CONFIG_ASAN 00:31:32.897 #undef SPDK_CONFIG_AVAHI 00:31:32.897 #undef SPDK_CONFIG_CET 00:31:32.897 #define SPDK_CONFIG_COVERAGE 1 00:31:32.897 #define SPDK_CONFIG_CROSS_PREFIX 00:31:32.897 #define SPDK_CONFIG_CRYPTO 1 00:31:32.897 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:32.897 #undef SPDK_CONFIG_CUSTOMOCF 00:31:32.897 #undef SPDK_CONFIG_DAOS 00:31:32.897 #define SPDK_CONFIG_DAOS_DIR 00:31:32.897 #define SPDK_CONFIG_DEBUG 1 00:31:32.897 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:32.897 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:31:32.897 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:31:32.897 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:32.897 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:32.897 #undef SPDK_CONFIG_DPDK_UADK 00:31:32.897 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:32.897 #define SPDK_CONFIG_EXAMPLES 1 00:31:32.897 #undef SPDK_CONFIG_FC 00:31:32.897 #define SPDK_CONFIG_FC_PATH 00:31:32.897 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:32.897 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:32.897 #undef SPDK_CONFIG_FUSE 00:31:32.897 #undef SPDK_CONFIG_FUZZER 00:31:32.897 #define SPDK_CONFIG_FUZZER_LIB 00:31:32.897 #undef SPDK_CONFIG_GOLANG 00:31:32.897 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:32.897 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:32.897 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:32.897 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:32.897 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:32.897 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:32.897 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:32.897 #define SPDK_CONFIG_IDXD 1 00:31:32.897 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:32.897 #define SPDK_CONFIG_IPSEC_MB 1 00:31:32.897 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:31:32.897 #define 
SPDK_CONFIG_ISAL 1 00:31:32.897 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:32.897 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:32.897 #define SPDK_CONFIG_LIBDIR 00:31:32.897 #undef SPDK_CONFIG_LTO 00:31:32.897 #define SPDK_CONFIG_MAX_LCORES 128 00:31:32.897 #define SPDK_CONFIG_NVME_CUSE 1 00:31:32.897 #undef SPDK_CONFIG_OCF 00:31:32.897 #define SPDK_CONFIG_OCF_PATH 00:31:32.897 #define SPDK_CONFIG_OPENSSL_PATH 00:31:32.897 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:32.897 #define SPDK_CONFIG_PGO_DIR 00:31:32.897 #undef SPDK_CONFIG_PGO_USE 00:31:32.897 #define SPDK_CONFIG_PREFIX /usr/local 00:31:32.897 #undef SPDK_CONFIG_RAID5F 00:31:32.897 #undef SPDK_CONFIG_RBD 00:31:32.897 #define SPDK_CONFIG_RDMA 1 00:31:32.897 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:32.897 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:32.897 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:32.897 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:32.897 #define SPDK_CONFIG_SHARED 1 00:31:32.897 #undef SPDK_CONFIG_SMA 00:31:32.897 #define SPDK_CONFIG_TESTS 1 00:31:32.897 #undef SPDK_CONFIG_TSAN 00:31:32.897 #define SPDK_CONFIG_UBLK 1 00:31:32.897 #define SPDK_CONFIG_UBSAN 1 00:31:32.897 #undef SPDK_CONFIG_UNIT_TESTS 00:31:32.897 #undef SPDK_CONFIG_URING 00:31:32.897 #define SPDK_CONFIG_URING_PATH 00:31:32.897 #undef SPDK_CONFIG_URING_ZNS 00:31:32.897 #undef SPDK_CONFIG_USDT 00:31:32.897 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:32.897 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:32.897 #undef SPDK_CONFIG_VFIO_USER 00:31:32.897 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:32.897 #define SPDK_CONFIG_VHOST 1 00:31:32.897 #define SPDK_CONFIG_VIRTIO 1 00:31:32.897 #undef SPDK_CONFIG_VTUNE 00:31:32.897 #define SPDK_CONFIG_VTUNE_DIR 00:31:32.897 #define SPDK_CONFIG_WERROR 1 00:31:32.897 #define SPDK_CONFIG_WPDK_DIR 00:31:32.897 #undef SPDK_CONFIG_XNVME 00:31:32.897 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:32.897 06:47:46 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:32.897 06:47:46 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:32.897 06:47:46 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:32.897 06:47:46 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:32.897 06:47:46 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:31:32.897 06:47:46 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:32.897 06:47:46 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 1 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:32.897 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 
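The stretch of trace above and below is autotest_common.sh stamping out the per-feature test flags: each flag is given a default value (the bare ": 0" / ": 1" / ": v23.11" entries) and is then exported for the child test scripts. A minimal bash sketch of that default-then-export pattern follows; the flag values are taken from this run, but the exact wording of the upstream script is assumed, not copied.

# Sketch only: same pattern as the trace, not the full upstream flag list.
: "${SPDK_TEST_CRYPTO:=1}"            # crypto tests are enabled in this job
export SPDK_TEST_CRYPTO
: "${SPDK_RUN_UBSAN:=1}"              # UBSan instrumentation is on
export SPDK_RUN_UBSAN
: "${SPDK_TEST_NATIVE_DPDK:=v23.11}"  # string-valued flags use the same idiom
export SPDK_TEST_NATIVE_DPDK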
00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : v23.11 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export 
SPDK_TEST_XNVME 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 1 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@173 -- # : 0 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:32.898 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@180 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@202 -- # cat 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:32.899 06:47:46 
reap_unregistered_poller -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@265 -- # export valgrind= 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@265 -- # valgrind= 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@271 -- # uname -s 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@278 -- # 
HUGE_EVEN_ALLOC=yes 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@281 -- # MAKE=make 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j112 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@301 -- # TEST_MODE= 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@320 -- # [[ -z 1293973 ]] 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@320 -- # kill -0 1293973 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local mount target_dir 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.Caz8f0 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.Caz8f0/tests/interrupt /tmp/spdk.Caz8f0 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@329 -- # df -T 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.899 
06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=954302464 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4330127360 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=53045837824 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=61742305280 00:31:32.899 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=8696467456 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=30866341888 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871150592 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4808704 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=12338565120 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=12348461056 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=9895936 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=30870024192 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=30871154688 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=1130496 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- 
common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=6174224384 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=6174228480 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:31:32.900 * Looking for test storage... 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@370 -- # local target_space new_size 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:32.900 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@374 -- # mount=/ 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@376 -- # target_space=53045837824 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@383 -- # new_size=10911059968 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:33.159 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@391 -- # return 0 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:31:33.159 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:31:33.160 06:47:46 
reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1294014 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:33.160 06:47:46 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:33.160 06:47:46 reap_unregistered_poller -- 
interrupt/interrupt_common.sh@26 -- # waitforlisten 1294014 /var/tmp/spdk.sock 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@831 -- # '[' -z 1294014 ']' 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:33.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:33.160 06:47:46 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:33.160 [2024-07-25 06:47:46.500598] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:33.160 [2024-07-25 06:47:46.500658] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1294014 ] 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3d:02.7 
cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:33.160 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:33.160 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:33.160 [2024-07-25 06:47:46.640055] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:33.160 [2024-07-25 06:47:46.686354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:33.160 [2024-07-25 06:47:46.686377] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:33.160 [2024-07-25 06:47:46.686385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.419 [2024-07-25 06:47:46.749216] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
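With the interrupt target up (three reactors started and app_thread switched to interrupt mode), the test queries the target's pollers over the RPC socket and filters the JSON with jq, as the thread_get_pollers and jq calls in the trace that follows show. A rough standalone equivalent of that query, assuming the /var/tmp/spdk.sock socket and the in-tree rpc.py paths seen earlier in the trace:

#!/usr/bin/env bash
# Sketch mirroring the jq pipeline from the test, not the verbatim script.
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk.sock

app_thread=$("$rpc_py" -s "$sock" thread_get_pollers | jq -r '.threads[0]')

# In interrupt mode the active poller list should be empty, while the timed
# list still carries rpc_subsystem_poll_servers, matching the output below.
echo "active: $(echo "$app_thread" | jq -r '.active_pollers[].name')"
echo "timed:  $(echo "$app_thread" | jq -r '.timed_pollers[].name')"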
00:31:33.987 06:47:47 reap_unregistered_poller -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:33.987 06:47:47 reap_unregistered_poller -- common/autotest_common.sh@864 -- # return 0 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:31:33.987 06:47:47 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:33.987 06:47:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:33.987 06:47:47 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:31:33.987 "name": "app_thread", 00:31:33.987 "id": 1, 00:31:33.987 "active_pollers": [], 00:31:33.987 "timed_pollers": [ 00:31:33.987 { 00:31:33.987 "name": "rpc_subsystem_poll_servers", 00:31:33.987 "id": 1, 00:31:33.987 "state": "waiting", 00:31:33.987 "run_count": 0, 00:31:33.987 "busy_count": 0, 00:31:33.987 "period_ticks": 10000000 00:31:33.987 } 00:31:33.987 ], 00:31:33.987 "paused_pollers": [] 00:31:33.987 }' 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:31:33.987 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:31:34.246 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:31:34.246 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:31:34.246 06:47:47 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:31:34.246 06:47:47 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:34.246 06:47:47 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:34.246 5000+0 records in 00:31:34.246 5000+0 records out 00:31:34.246 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0263503 s, 389 MB/s 00:31:34.246 06:47:47 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:34.505 AIO0 00:31:34.505 06:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:34.765 06:47:48 reap_unregistered_poller -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:31:34.765 "name": "app_thread", 00:31:34.765 "id": 1, 00:31:34.765 "active_pollers": [], 00:31:34.765 "timed_pollers": [ 00:31:34.765 { 00:31:34.765 "name": "rpc_subsystem_poll_servers", 00:31:34.765 "id": 1, 00:31:34.765 "state": "waiting", 00:31:34.765 "run_count": 0, 00:31:34.765 "busy_count": 0, 00:31:34.765 "period_ticks": 10000000 00:31:34.765 } 00:31:34.765 ], 00:31:34.765 "paused_pollers": [] 00:31:34.765 }' 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:31:34.765 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1294014 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@950 -- # '[' -z 1294014 ']' 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@954 -- # kill -0 1294014 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@955 -- # uname 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:34.765 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1294014 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1294014' 00:31:35.024 killing process with pid 1294014 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@969 -- # kill 1294014 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@974 -- # wait 1294014 00:31:35.024 06:47:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:31:35.024 06:47:48 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:35.024 00:31:35.024 real 0m2.377s 00:31:35.024 user 0m1.408s 00:31:35.024 sys 0m0.662s 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:35.024 06:47:48 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:35.024 ************************************ 00:31:35.024 END TEST reap_unregistered_poller 00:31:35.024 ************************************ 00:31:35.282 06:47:48 -- spdk/autotest.sh@202 -- # uname -s 00:31:35.282 06:47:48 -- spdk/autotest.sh@202 -- # [[ Linux == 
Linux ]] 00:31:35.282 06:47:48 -- spdk/autotest.sh@203 -- # [[ 1 -eq 1 ]] 00:31:35.282 06:47:48 -- spdk/autotest.sh@209 -- # [[ 1 -eq 0 ]] 00:31:35.282 06:47:48 -- spdk/autotest.sh@215 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@264 -- # timing_exit lib 00:31:35.282 06:47:48 -- common/autotest_common.sh@730 -- # xtrace_disable 00:31:35.282 06:47:48 -- common/autotest_common.sh@10 -- # set +x 00:31:35.282 06:47:48 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@283 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@347 -- # '[' 0 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@351 -- # '[' 1 -eq 1 ']' 00:31:35.282 06:47:48 -- spdk/autotest.sh@352 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:35.282 06:47:48 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:35.282 06:47:48 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:35.283 06:47:48 -- common/autotest_common.sh@10 -- # set +x 00:31:35.283 ************************************ 00:31:35.283 START TEST compress_compdev 00:31:35.283 ************************************ 00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:35.283 * Looking for test storage... 
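The run_test compress_compdev call above hands the compress suite to the autotest harness wrapper, which is what produces the rows of asterisks, the START TEST / END TEST banners, and the real/user/sys timing lines seen throughout this log. A hypothetical, simplified version of such a wrapper is sketched below; the in-tree helper also records per-test timing for the CI summary and is not reproduced verbatim here.

# Hypothetical simplification of the run_test helper seen in the trace.
run_test() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"
        local rc=$?
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return $rc
}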
00:31:35.283 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:35.283 06:47:48 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:35.283 06:47:48 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:35.283 06:47:48 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:35.283 06:47:48 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:35.283 06:47:48 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:35.283 06:47:48 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:35.283 06:47:48 compress_compdev -- paths/export.sh@5 -- # export PATH 00:31:35.283 06:47:48 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:35.283 06:47:48 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1294436 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1294436 00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1294436 ']' 00:31:35.283 06:47:48 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:35.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
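At this point compress.sh has launched bdevperf in the background (with the -z flag, so it waits to be driven over RPC) and is blocking until the application's UNIX-domain RPC socket comes up before configuring it. A rough sketch of that launch-and-wait step; the command-line flags are the ones visible in the trace, but the retry loop and the rpc_get_methods probe stand in for the waitforlisten helper, whose exact logic is not shown in this log.

# Sketch only; bdevperf flags copied from the invocation above.
bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk.sock

"$bdevperf" -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
        -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json &
bdevperf_pid=$!

for _ in $(seq 1 100); do
        # Treat the app as ready once the socket answers a trivial RPC.
        "$rpc_py" -s "$sock" -t 1 rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
done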
00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:35.283 06:47:48 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:35.542 [2024-07-25 06:47:48.879238] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:35.542 [2024-07-25 06:47:48.879300] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1294436 ] 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:35.542 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.542 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:01.4 cannot be used 
00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:35.543 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:35.543 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:35.543 [2024-07-25 06:47:49.003356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:35.543 [2024-07-25 06:47:49.049532] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:35.543 [2024-07-25 06:47:49.049545] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:36.111 [2024-07-25 06:47:49.644977] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:36.370 06:47:49 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:36.370 06:47:49 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:31:36.370 06:47:49 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:31:36.370 06:47:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:36.370 06:47:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:39.660 [2024-07-25 06:47:52.871708] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13edc80 PMD being used: compress_qat 00:31:39.660 06:47:52 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:39.660 06:47:52 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:39.660 06:47:52 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:39.660 06:47:52 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:39.660 06:47:52 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:39.660 06:47:52 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:39.660 06:47:52 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:39.660 06:47:53 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:39.919 [ 00:31:39.919 { 00:31:39.919 "name": "Nvme0n1", 00:31:39.919 "aliases": [ 00:31:39.919 "85277eb0-93dc-4b70-80ab-0fea438749bc" 00:31:39.919 ], 00:31:39.919 "product_name": "NVMe disk", 00:31:39.919 "block_size": 512, 00:31:39.919 "num_blocks": 3907029168, 00:31:39.919 "uuid": "85277eb0-93dc-4b70-80ab-0fea438749bc", 00:31:39.919 "assigned_rate_limits": { 00:31:39.919 "rw_ios_per_sec": 0, 00:31:39.919 "rw_mbytes_per_sec": 0, 00:31:39.919 "r_mbytes_per_sec": 0, 00:31:39.919 "w_mbytes_per_sec": 0 00:31:39.919 }, 00:31:39.919 "claimed": false, 00:31:39.919 "zoned": false, 00:31:39.919 "supported_io_types": { 00:31:39.919 "read": true, 00:31:39.919 "write": true, 00:31:39.919 "unmap": true, 00:31:39.919 "flush": true, 00:31:39.919 "reset": true, 00:31:39.919 "nvme_admin": true, 00:31:39.919 "nvme_io": true, 00:31:39.919 "nvme_io_md": false, 00:31:39.919 "write_zeroes": true, 00:31:39.919 "zcopy": false, 00:31:39.919 "get_zone_info": false, 00:31:39.919 "zone_management": false, 00:31:39.919 "zone_append": false, 00:31:39.919 "compare": false, 00:31:39.919 "compare_and_write": false, 00:31:39.919 "abort": true, 00:31:39.919 "seek_hole": false, 00:31:39.919 "seek_data": false, 00:31:39.919 "copy": false, 00:31:39.919 "nvme_iov_md": false 00:31:39.919 }, 00:31:39.919 "driver_specific": { 00:31:39.919 "nvme": [ 00:31:39.919 { 00:31:39.919 "pci_address": "0000:d8:00.0", 00:31:39.919 "trid": { 00:31:39.919 "trtype": "PCIe", 00:31:39.919 "traddr": "0000:d8:00.0" 00:31:39.919 }, 00:31:39.919 "ctrlr_data": { 00:31:39.919 "cntlid": 0, 00:31:39.919 "vendor_id": "0x8086", 00:31:39.919 "model_number": "INTEL SSDPE2KX020T8", 00:31:39.919 "serial_number": "BTLJ125505KA2P0BGN", 00:31:39.919 "firmware_revision": "VDV10170", 00:31:39.919 "oacs": { 00:31:39.919 "security": 0, 00:31:39.919 "format": 1, 00:31:39.919 "firmware": 1, 00:31:39.919 "ns_manage": 1 00:31:39.919 }, 00:31:39.919 "multi_ctrlr": false, 00:31:39.919 "ana_reporting": false 00:31:39.919 }, 00:31:39.919 "vs": { 00:31:39.920 "nvme_version": "1.2" 00:31:39.920 }, 00:31:39.920 "ns_data": { 00:31:39.920 "id": 1, 00:31:39.920 "can_share": false 00:31:39.920 } 00:31:39.920 } 00:31:39.920 ], 00:31:39.920 "mp_policy": "active_passive" 00:31:39.920 } 00:31:39.920 } 00:31:39.920 ] 00:31:39.920 06:47:53 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:39.920 06:47:53 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:40.178 [2024-07-25 06:47:53.580891] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1252ee0 PMD being used: compress_qat 00:31:41.115 fa06d8f4-b50a-40c7-a287-73301158e8fd 00:31:41.115 06:47:54 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:41.377 f9c73c69-11c4-4026-b625-f50582108da2 00:31:41.377 06:47:54 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:41.377 06:47:54 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:41.377 06:47:54 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:41.377 06:47:54 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:41.377 06:47:54 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:41.377 06:47:54 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
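The create_vols step traced here stacks a thin-provisioned 100 MiB logical volume on top of Nvme0n1 and, just below, wraps it in a compress bdev backed by /tmp/pmem. A condensed sketch of that RPC sequence, using the same rpc.py calls that appear in the trace (the -l argument is omitted in this first pass, so the compress bdev uses its default logical block size):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # Logical volume store on the NVMe bdev, then a 100 MiB thin-provisioned volume.
  "$rpc" bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  "$rpc" bdev_lvol_create -t -l lvs0 lv0 100
  # Compress vbdev on top of the lvol, with its persistent memory files kept under /tmp/pmem.
  "$rpc" bdev_compress_create -b lvs0/lv0 -p /tmp/pmem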
00:31:41.377 06:47:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:41.636 06:47:55 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:41.895 [ 00:31:41.895 { 00:31:41.895 "name": "f9c73c69-11c4-4026-b625-f50582108da2", 00:31:41.895 "aliases": [ 00:31:41.895 "lvs0/lv0" 00:31:41.895 ], 00:31:41.895 "product_name": "Logical Volume", 00:31:41.895 "block_size": 512, 00:31:41.895 "num_blocks": 204800, 00:31:41.895 "uuid": "f9c73c69-11c4-4026-b625-f50582108da2", 00:31:41.895 "assigned_rate_limits": { 00:31:41.895 "rw_ios_per_sec": 0, 00:31:41.895 "rw_mbytes_per_sec": 0, 00:31:41.895 "r_mbytes_per_sec": 0, 00:31:41.895 "w_mbytes_per_sec": 0 00:31:41.895 }, 00:31:41.895 "claimed": false, 00:31:41.895 "zoned": false, 00:31:41.895 "supported_io_types": { 00:31:41.895 "read": true, 00:31:41.895 "write": true, 00:31:41.895 "unmap": true, 00:31:41.895 "flush": false, 00:31:41.895 "reset": true, 00:31:41.895 "nvme_admin": false, 00:31:41.895 "nvme_io": false, 00:31:41.895 "nvme_io_md": false, 00:31:41.895 "write_zeroes": true, 00:31:41.895 "zcopy": false, 00:31:41.895 "get_zone_info": false, 00:31:41.895 "zone_management": false, 00:31:41.895 "zone_append": false, 00:31:41.895 "compare": false, 00:31:41.895 "compare_and_write": false, 00:31:41.895 "abort": false, 00:31:41.895 "seek_hole": true, 00:31:41.895 "seek_data": true, 00:31:41.895 "copy": false, 00:31:41.895 "nvme_iov_md": false 00:31:41.895 }, 00:31:41.895 "driver_specific": { 00:31:41.895 "lvol": { 00:31:41.895 "lvol_store_uuid": "fa06d8f4-b50a-40c7-a287-73301158e8fd", 00:31:41.895 "base_bdev": "Nvme0n1", 00:31:41.895 "thin_provision": true, 00:31:41.895 "num_allocated_clusters": 0, 00:31:41.895 "snapshot": false, 00:31:41.895 "clone": false, 00:31:41.895 "esnap_clone": false 00:31:41.895 } 00:31:41.895 } 00:31:41.895 } 00:31:41.895 ] 00:31:41.895 06:47:55 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:41.896 06:47:55 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:41.896 06:47:55 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:42.155 [2024-07-25 06:47:55.552081] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:42.155 COMP_lvs0/lv0 00:31:42.155 06:47:55 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:42.155 06:47:55 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:42.155 06:47:55 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:42.155 06:47:55 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:42.155 06:47:55 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:42.155 06:47:55 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:42.155 06:47:55 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:42.414 06:47:55 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:42.674 [ 00:31:42.674 { 00:31:42.674 "name": "COMP_lvs0/lv0", 00:31:42.674 "aliases": 
[ 00:31:42.674 "ca5acfc9-a6d7-534d-afbf-f517712f17e4" 00:31:42.674 ], 00:31:42.674 "product_name": "compress", 00:31:42.674 "block_size": 512, 00:31:42.674 "num_blocks": 200704, 00:31:42.674 "uuid": "ca5acfc9-a6d7-534d-afbf-f517712f17e4", 00:31:42.674 "assigned_rate_limits": { 00:31:42.674 "rw_ios_per_sec": 0, 00:31:42.674 "rw_mbytes_per_sec": 0, 00:31:42.674 "r_mbytes_per_sec": 0, 00:31:42.674 "w_mbytes_per_sec": 0 00:31:42.674 }, 00:31:42.674 "claimed": false, 00:31:42.674 "zoned": false, 00:31:42.674 "supported_io_types": { 00:31:42.674 "read": true, 00:31:42.674 "write": true, 00:31:42.674 "unmap": false, 00:31:42.674 "flush": false, 00:31:42.674 "reset": false, 00:31:42.674 "nvme_admin": false, 00:31:42.674 "nvme_io": false, 00:31:42.674 "nvme_io_md": false, 00:31:42.674 "write_zeroes": true, 00:31:42.674 "zcopy": false, 00:31:42.674 "get_zone_info": false, 00:31:42.674 "zone_management": false, 00:31:42.674 "zone_append": false, 00:31:42.674 "compare": false, 00:31:42.674 "compare_and_write": false, 00:31:42.674 "abort": false, 00:31:42.674 "seek_hole": false, 00:31:42.674 "seek_data": false, 00:31:42.674 "copy": false, 00:31:42.674 "nvme_iov_md": false 00:31:42.674 }, 00:31:42.674 "driver_specific": { 00:31:42.674 "compress": { 00:31:42.674 "name": "COMP_lvs0/lv0", 00:31:42.674 "base_bdev_name": "f9c73c69-11c4-4026-b625-f50582108da2", 00:31:42.674 "pm_path": "/tmp/pmem/1d0dc53f-5bc2-46dc-a784-bc19aea06e56" 00:31:42.674 } 00:31:42.674 } 00:31:42.674 } 00:31:42.674 ] 00:31:42.674 06:47:56 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:42.674 06:47:56 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:42.674 [2024-07-25 06:47:56.114160] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f42481b15c0 PMD being used: compress_qat 00:31:42.674 [2024-07-25 06:47:56.116174] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12c7700 PMD being used: compress_qat 00:31:42.674 Running I/O for 3 seconds... 
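bdevperf was started with -m 0x6, and the two reactors reported earlier run on cores 1 and 2, so the results that follow show one verify job per core (Core Mask 0x2 and 0x4). A quick way to decode the mask:

  # Core mask 0x6 = 0b110 -> cores 1 and 2, one bdevperf job per reactor in the table below.
  mask=0x6
  for i in $(seq 0 7); do (( (mask >> i) & 1 )) && echo "core $i selected"; done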
00:31:45.996 00:31:45.996 Latency(us) 00:31:45.996 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:45.996 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:45.996 Verification LBA range: start 0x0 length 0x3100 00:31:45.996 COMP_lvs0/lv0 : 3.01 4027.49 15.73 0.00 0.00 7895.31 128.61 15414.07 00:31:45.996 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:45.996 Verification LBA range: start 0x3100 length 0x3100 00:31:45.996 COMP_lvs0/lv0 : 3.00 4122.32 16.10 0.00 0.00 7727.79 120.42 15414.07 00:31:45.996 =================================================================================================================== 00:31:45.996 Total : 8149.82 31.84 0.00 0.00 7810.58 120.42 15414.07 00:31:45.996 0 00:31:45.996 06:47:59 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:45.996 06:47:59 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:45.996 06:47:59 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:45.996 06:47:59 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:45.996 06:47:59 compress_compdev -- compress/compress.sh@78 -- # killprocess 1294436 00:31:45.996 06:47:59 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1294436 ']' 00:31:45.996 06:47:59 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1294436 00:31:45.996 06:47:59 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:31:46.255 06:47:59 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:46.255 06:47:59 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1294436 00:31:46.255 06:47:59 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:46.255 06:47:59 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:46.255 06:47:59 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1294436' 00:31:46.255 killing process with pid 1294436 00:31:46.255 06:47:59 compress_compdev -- common/autotest_common.sh@969 -- # kill 1294436 00:31:46.255 Received shutdown signal, test time was about 3.000000 seconds 00:31:46.255 00:31:46.255 Latency(us) 00:31:46.255 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:46.255 =================================================================================================================== 00:31:46.256 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:46.256 06:47:59 compress_compdev -- common/autotest_common.sh@974 -- # wait 1294436 00:31:48.791 06:48:02 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:48.791 06:48:02 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:48.791 06:48:02 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1296787 00:31:48.791 06:48:02 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:48.791 06:48:02 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:48.791 06:48:02 
compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1296787 00:31:48.791 06:48:02 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1296787 ']' 00:31:48.791 06:48:02 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:48.791 06:48:02 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:48.791 06:48:02 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:48.791 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:48.791 06:48:02 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:48.791 06:48:02 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:48.791 [2024-07-25 06:48:02.071527] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:31:48.791 [2024-07-25 06:48:02.071591] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1296787 ] 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.0 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.1 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.2 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.3 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.4 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.5 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.6 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:01.7 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.791 EAL: Requested device 0000:3d:02.0 cannot be used 00:31:48.791 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.1 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.2 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.3 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.4 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.5 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.6 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3d:02.7 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.0 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.1 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.2 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.3 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.4 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.5 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.6 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:01.7 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.0 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.1 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.2 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.3 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.4 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.5 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.6 cannot be used 00:31:48.792 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:31:48.792 EAL: Requested device 0000:3f:02.7 cannot be used 00:31:48.792 [2024-07-25 06:48:02.196533] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:48.792 [2024-07-25 06:48:02.241467] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:48.792 [2024-07-25 06:48:02.241472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:49.360 [2024-07-25 06:48:02.832367] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:49.360 06:48:02 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:49.360 06:48:02 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:31:49.360 06:48:02 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:31:49.619 06:48:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:49.619 06:48:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:52.907 [2024-07-25 06:48:05.953893] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2b83c80 PMD being used: compress_qat 00:31:52.907 06:48:05 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:52.907 06:48:05 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:52.907 
06:48:05 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:52.907 06:48:05 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:52.907 06:48:05 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:52.907 06:48:05 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:52.907 06:48:05 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:52.907 06:48:06 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:52.907 [ 00:31:52.907 { 00:31:52.907 "name": "Nvme0n1", 00:31:52.907 "aliases": [ 00:31:52.907 "f51da95b-2865-459f-b740-2961695300fe" 00:31:52.907 ], 00:31:52.907 "product_name": "NVMe disk", 00:31:52.907 "block_size": 512, 00:31:52.907 "num_blocks": 3907029168, 00:31:52.907 "uuid": "f51da95b-2865-459f-b740-2961695300fe", 00:31:52.907 "assigned_rate_limits": { 00:31:52.907 "rw_ios_per_sec": 0, 00:31:52.907 "rw_mbytes_per_sec": 0, 00:31:52.907 "r_mbytes_per_sec": 0, 00:31:52.907 "w_mbytes_per_sec": 0 00:31:52.907 }, 00:31:52.907 "claimed": false, 00:31:52.907 "zoned": false, 00:31:52.907 "supported_io_types": { 00:31:52.907 "read": true, 00:31:52.907 "write": true, 00:31:52.907 "unmap": true, 00:31:52.907 "flush": true, 00:31:52.907 "reset": true, 00:31:52.907 "nvme_admin": true, 00:31:52.907 "nvme_io": true, 00:31:52.907 "nvme_io_md": false, 00:31:52.907 "write_zeroes": true, 00:31:52.907 "zcopy": false, 00:31:52.907 "get_zone_info": false, 00:31:52.907 "zone_management": false, 00:31:52.907 "zone_append": false, 00:31:52.907 "compare": false, 00:31:52.907 "compare_and_write": false, 00:31:52.907 "abort": true, 00:31:52.907 "seek_hole": false, 00:31:52.907 "seek_data": false, 00:31:52.907 "copy": false, 00:31:52.907 "nvme_iov_md": false 00:31:52.907 }, 00:31:52.907 "driver_specific": { 00:31:52.907 "nvme": [ 00:31:52.907 { 00:31:52.907 "pci_address": "0000:d8:00.0", 00:31:52.907 "trid": { 00:31:52.907 "trtype": "PCIe", 00:31:52.907 "traddr": "0000:d8:00.0" 00:31:52.907 }, 00:31:52.907 "ctrlr_data": { 00:31:52.907 "cntlid": 0, 00:31:52.907 "vendor_id": "0x8086", 00:31:52.907 "model_number": "INTEL SSDPE2KX020T8", 00:31:52.907 "serial_number": "BTLJ125505KA2P0BGN", 00:31:52.907 "firmware_revision": "VDV10170", 00:31:52.907 "oacs": { 00:31:52.907 "security": 0, 00:31:52.907 "format": 1, 00:31:52.907 "firmware": 1, 00:31:52.907 "ns_manage": 1 00:31:52.907 }, 00:31:52.907 "multi_ctrlr": false, 00:31:52.907 "ana_reporting": false 00:31:52.907 }, 00:31:52.907 "vs": { 00:31:52.907 "nvme_version": "1.2" 00:31:52.907 }, 00:31:52.907 "ns_data": { 00:31:52.907 "id": 1, 00:31:52.907 "can_share": false 00:31:52.907 } 00:31:52.907 } 00:31:52.907 ], 00:31:52.907 "mp_policy": "active_passive" 00:31:52.907 } 00:31:52.907 } 00:31:52.907 ] 00:31:52.907 06:48:06 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:52.907 06:48:06 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:53.166 [2024-07-25 06:48:06.594839] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x29e8950 PMD being used: compress_qat 00:31:54.099 c94f7935-3858-4cb1-97c0-43456213d450 00:31:54.099 06:48:07 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:54.358 6d6b1917-4d80-4632-b40a-286ef8869c46 00:31:54.358 06:48:07 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:54.358 06:48:07 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:54.358 06:48:07 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:54.358 06:48:07 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:54.358 06:48:07 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:54.358 06:48:07 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:54.358 06:48:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:54.616 06:48:07 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:54.616 [ 00:31:54.616 { 00:31:54.616 "name": "6d6b1917-4d80-4632-b40a-286ef8869c46", 00:31:54.616 "aliases": [ 00:31:54.616 "lvs0/lv0" 00:31:54.616 ], 00:31:54.616 "product_name": "Logical Volume", 00:31:54.616 "block_size": 512, 00:31:54.616 "num_blocks": 204800, 00:31:54.616 "uuid": "6d6b1917-4d80-4632-b40a-286ef8869c46", 00:31:54.616 "assigned_rate_limits": { 00:31:54.616 "rw_ios_per_sec": 0, 00:31:54.616 "rw_mbytes_per_sec": 0, 00:31:54.616 "r_mbytes_per_sec": 0, 00:31:54.616 "w_mbytes_per_sec": 0 00:31:54.616 }, 00:31:54.616 "claimed": false, 00:31:54.616 "zoned": false, 00:31:54.616 "supported_io_types": { 00:31:54.616 "read": true, 00:31:54.616 "write": true, 00:31:54.616 "unmap": true, 00:31:54.616 "flush": false, 00:31:54.616 "reset": true, 00:31:54.616 "nvme_admin": false, 00:31:54.616 "nvme_io": false, 00:31:54.616 "nvme_io_md": false, 00:31:54.616 "write_zeroes": true, 00:31:54.616 "zcopy": false, 00:31:54.616 "get_zone_info": false, 00:31:54.616 "zone_management": false, 00:31:54.616 "zone_append": false, 00:31:54.616 "compare": false, 00:31:54.616 "compare_and_write": false, 00:31:54.616 "abort": false, 00:31:54.616 "seek_hole": true, 00:31:54.616 "seek_data": true, 00:31:54.616 "copy": false, 00:31:54.616 "nvme_iov_md": false 00:31:54.616 }, 00:31:54.616 "driver_specific": { 00:31:54.616 "lvol": { 00:31:54.616 "lvol_store_uuid": "c94f7935-3858-4cb1-97c0-43456213d450", 00:31:54.616 "base_bdev": "Nvme0n1", 00:31:54.616 "thin_provision": true, 00:31:54.616 "num_allocated_clusters": 0, 00:31:54.616 "snapshot": false, 00:31:54.616 "clone": false, 00:31:54.616 "esnap_clone": false 00:31:54.616 } 00:31:54.616 } 00:31:54.616 } 00:31:54.616 ] 00:31:54.616 06:48:08 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:54.616 06:48:08 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:54.616 06:48:08 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:54.874 [2024-07-25 06:48:08.317716] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:54.874 COMP_lvs0/lv0 00:31:54.874 06:48:08 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:54.874 06:48:08 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:54.874 06:48:08 compress_compdev -- common/autotest_common.sh@900 -- # local 
bdev_timeout= 00:31:54.874 06:48:08 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:54.874 06:48:08 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:54.874 06:48:08 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:54.874 06:48:08 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:55.132 06:48:08 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:55.390 [ 00:31:55.390 { 00:31:55.390 "name": "COMP_lvs0/lv0", 00:31:55.390 "aliases": [ 00:31:55.390 "63981eb9-4f6c-5348-ba35-98feaeaa095f" 00:31:55.390 ], 00:31:55.390 "product_name": "compress", 00:31:55.390 "block_size": 512, 00:31:55.390 "num_blocks": 200704, 00:31:55.390 "uuid": "63981eb9-4f6c-5348-ba35-98feaeaa095f", 00:31:55.390 "assigned_rate_limits": { 00:31:55.390 "rw_ios_per_sec": 0, 00:31:55.390 "rw_mbytes_per_sec": 0, 00:31:55.390 "r_mbytes_per_sec": 0, 00:31:55.390 "w_mbytes_per_sec": 0 00:31:55.390 }, 00:31:55.390 "claimed": false, 00:31:55.390 "zoned": false, 00:31:55.390 "supported_io_types": { 00:31:55.390 "read": true, 00:31:55.390 "write": true, 00:31:55.390 "unmap": false, 00:31:55.390 "flush": false, 00:31:55.390 "reset": false, 00:31:55.390 "nvme_admin": false, 00:31:55.390 "nvme_io": false, 00:31:55.390 "nvme_io_md": false, 00:31:55.390 "write_zeroes": true, 00:31:55.390 "zcopy": false, 00:31:55.390 "get_zone_info": false, 00:31:55.390 "zone_management": false, 00:31:55.390 "zone_append": false, 00:31:55.390 "compare": false, 00:31:55.390 "compare_and_write": false, 00:31:55.390 "abort": false, 00:31:55.391 "seek_hole": false, 00:31:55.391 "seek_data": false, 00:31:55.391 "copy": false, 00:31:55.391 "nvme_iov_md": false 00:31:55.391 }, 00:31:55.391 "driver_specific": { 00:31:55.391 "compress": { 00:31:55.391 "name": "COMP_lvs0/lv0", 00:31:55.391 "base_bdev_name": "6d6b1917-4d80-4632-b40a-286ef8869c46", 00:31:55.391 "pm_path": "/tmp/pmem/f4ad04c7-71a0-4673-8bff-7c602573fdaa" 00:31:55.391 } 00:31:55.391 } 00:31:55.391 } 00:31:55.391 ] 00:31:55.391 06:48:08 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:55.391 06:48:08 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:55.391 [2024-07-25 06:48:08.879784] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd2781b15c0 PMD being used: compress_qat 00:31:55.391 [2024-07-25 06:48:08.881803] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a5d840 PMD being used: compress_qat 00:31:55.391 Running I/O for 3 seconds... 
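As in the first pass, once the three-second verification below completes the script destroys the volumes and stops bdevperf. A sketch of that teardown using the RPCs visible in the trace; killprocess is a test-harness helper and is approximated here with a plain kill and wait.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # Reverse order of creation: compress bdev first, then the lvol store (which removes its lvol).
  "$rpc" bdev_compress_delete COMP_lvs0/lv0
  "$rpc" bdev_lvol_delete_lvstore -l lvs0
  # Stop the bdevperf app started for this pass (approximates killprocess).
  kill "$bdevperf_pid" && wait "$bdevperf_pid"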
00:31:58.674 00:31:58.674 Latency(us) 00:31:58.674 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:58.674 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:58.674 Verification LBA range: start 0x0 length 0x3100 00:31:58.674 COMP_lvs0/lv0 : 3.01 4059.14 15.86 0.00 0.00 7825.55 127.80 14994.64 00:31:58.675 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:58.675 Verification LBA range: start 0x3100 length 0x3100 00:31:58.675 COMP_lvs0/lv0 : 3.01 4178.53 16.32 0.00 0.00 7618.69 120.42 15518.92 00:31:58.675 =================================================================================================================== 00:31:58.675 Total : 8237.67 32.18 0.00 0.00 7720.66 120.42 15518.92 00:31:58.675 0 00:31:58.675 06:48:11 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:58.675 06:48:11 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:58.675 06:48:12 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:58.933 06:48:12 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:58.933 06:48:12 compress_compdev -- compress/compress.sh@78 -- # killprocess 1296787 00:31:58.933 06:48:12 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1296787 ']' 00:31:58.933 06:48:12 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1296787 00:31:58.933 06:48:12 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1296787 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1296787' 00:31:58.934 killing process with pid 1296787 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@969 -- # kill 1296787 00:31:58.934 Received shutdown signal, test time was about 3.000000 seconds 00:31:58.934 00:31:58.934 Latency(us) 00:31:58.934 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:58.934 =================================================================================================================== 00:31:58.934 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:58.934 06:48:12 compress_compdev -- common/autotest_common.sh@974 -- # wait 1296787 00:32:01.467 06:48:14 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:01.467 06:48:14 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:01.467 06:48:14 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1299318 00:32:01.467 06:48:14 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:01.467 06:48:14 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:32:01.467 06:48:14 
compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1299318 00:32:01.467 06:48:14 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1299318 ']' 00:32:01.467 06:48:14 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:01.467 06:48:14 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:01.467 06:48:14 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:01.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:01.467 06:48:14 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:01.467 06:48:14 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:01.467 [2024-07-25 06:48:14.897833] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:32:01.467 [2024-07-25 06:48:14.897895] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1299318 ] 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:01.467 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:01.467 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:01.726 [2024-07-25 06:48:15.022889] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:01.726 [2024-07-25 06:48:15.068487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:01.726 [2024-07-25 06:48:15.068493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:02.338 [2024-07-25 06:48:15.658359] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:02.338 06:48:15 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:02.338 06:48:15 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:32:02.338 06:48:15 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:32:02.338 06:48:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:02.338 06:48:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:05.636 [2024-07-25 06:48:18.791717] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2259c80 PMD being used: compress_qat 00:32:05.636 06:48:18 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:05.636 06:48:18 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:32:05.636 
06:48:18 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:05.636 06:48:18 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:05.636 06:48:18 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:05.636 06:48:18 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:05.636 06:48:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:05.636 06:48:18 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:05.636 [ 00:32:05.636 { 00:32:05.636 "name": "Nvme0n1", 00:32:05.636 "aliases": [ 00:32:05.636 "abf6289c-a2d6-486f-9d05-1202dc123966" 00:32:05.636 ], 00:32:05.636 "product_name": "NVMe disk", 00:32:05.636 "block_size": 512, 00:32:05.636 "num_blocks": 3907029168, 00:32:05.636 "uuid": "abf6289c-a2d6-486f-9d05-1202dc123966", 00:32:05.636 "assigned_rate_limits": { 00:32:05.636 "rw_ios_per_sec": 0, 00:32:05.636 "rw_mbytes_per_sec": 0, 00:32:05.636 "r_mbytes_per_sec": 0, 00:32:05.636 "w_mbytes_per_sec": 0 00:32:05.636 }, 00:32:05.636 "claimed": false, 00:32:05.636 "zoned": false, 00:32:05.636 "supported_io_types": { 00:32:05.636 "read": true, 00:32:05.636 "write": true, 00:32:05.636 "unmap": true, 00:32:05.636 "flush": true, 00:32:05.636 "reset": true, 00:32:05.636 "nvme_admin": true, 00:32:05.636 "nvme_io": true, 00:32:05.636 "nvme_io_md": false, 00:32:05.636 "write_zeroes": true, 00:32:05.636 "zcopy": false, 00:32:05.636 "get_zone_info": false, 00:32:05.636 "zone_management": false, 00:32:05.636 "zone_append": false, 00:32:05.636 "compare": false, 00:32:05.636 "compare_and_write": false, 00:32:05.636 "abort": true, 00:32:05.636 "seek_hole": false, 00:32:05.636 "seek_data": false, 00:32:05.636 "copy": false, 00:32:05.636 "nvme_iov_md": false 00:32:05.636 }, 00:32:05.636 "driver_specific": { 00:32:05.636 "nvme": [ 00:32:05.636 { 00:32:05.636 "pci_address": "0000:d8:00.0", 00:32:05.636 "trid": { 00:32:05.636 "trtype": "PCIe", 00:32:05.636 "traddr": "0000:d8:00.0" 00:32:05.636 }, 00:32:05.636 "ctrlr_data": { 00:32:05.636 "cntlid": 0, 00:32:05.636 "vendor_id": "0x8086", 00:32:05.636 "model_number": "INTEL SSDPE2KX020T8", 00:32:05.636 "serial_number": "BTLJ125505KA2P0BGN", 00:32:05.636 "firmware_revision": "VDV10170", 00:32:05.636 "oacs": { 00:32:05.636 "security": 0, 00:32:05.636 "format": 1, 00:32:05.636 "firmware": 1, 00:32:05.636 "ns_manage": 1 00:32:05.636 }, 00:32:05.636 "multi_ctrlr": false, 00:32:05.636 "ana_reporting": false 00:32:05.636 }, 00:32:05.636 "vs": { 00:32:05.636 "nvme_version": "1.2" 00:32:05.636 }, 00:32:05.636 "ns_data": { 00:32:05.636 "id": 1, 00:32:05.636 "can_share": false 00:32:05.636 } 00:32:05.636 } 00:32:05.636 ], 00:32:05.636 "mp_policy": "active_passive" 00:32:05.636 } 00:32:05.636 } 00:32:05.636 ] 00:32:05.636 06:48:19 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:05.636 06:48:19 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:05.895 [2024-07-25 06:48:19.307992] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20be950 PMD being used: compress_qat 00:32:06.830 9ac77406-fd0d-4a6c-a223-a78b9e18d9a6 00:32:06.830 06:48:20 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:07.088 79d96502-3b88-4367-a318-135619c14c77 00:32:07.088 06:48:20 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:07.088 06:48:20 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:07.088 06:48:20 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:07.088 06:48:20 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:07.088 06:48:20 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:07.088 06:48:20 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:07.088 06:48:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:07.347 06:48:20 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:07.347 [ 00:32:07.347 { 00:32:07.347 "name": "79d96502-3b88-4367-a318-135619c14c77", 00:32:07.347 "aliases": [ 00:32:07.347 "lvs0/lv0" 00:32:07.347 ], 00:32:07.347 "product_name": "Logical Volume", 00:32:07.347 "block_size": 512, 00:32:07.347 "num_blocks": 204800, 00:32:07.347 "uuid": "79d96502-3b88-4367-a318-135619c14c77", 00:32:07.347 "assigned_rate_limits": { 00:32:07.347 "rw_ios_per_sec": 0, 00:32:07.347 "rw_mbytes_per_sec": 0, 00:32:07.347 "r_mbytes_per_sec": 0, 00:32:07.347 "w_mbytes_per_sec": 0 00:32:07.347 }, 00:32:07.347 "claimed": false, 00:32:07.347 "zoned": false, 00:32:07.347 "supported_io_types": { 00:32:07.347 "read": true, 00:32:07.347 "write": true, 00:32:07.347 "unmap": true, 00:32:07.347 "flush": false, 00:32:07.347 "reset": true, 00:32:07.347 "nvme_admin": false, 00:32:07.347 "nvme_io": false, 00:32:07.347 "nvme_io_md": false, 00:32:07.347 "write_zeroes": true, 00:32:07.347 "zcopy": false, 00:32:07.347 "get_zone_info": false, 00:32:07.347 "zone_management": false, 00:32:07.347 "zone_append": false, 00:32:07.347 "compare": false, 00:32:07.347 "compare_and_write": false, 00:32:07.347 "abort": false, 00:32:07.347 "seek_hole": true, 00:32:07.347 "seek_data": true, 00:32:07.347 "copy": false, 00:32:07.347 "nvme_iov_md": false 00:32:07.347 }, 00:32:07.347 "driver_specific": { 00:32:07.347 "lvol": { 00:32:07.347 "lvol_store_uuid": "9ac77406-fd0d-4a6c-a223-a78b9e18d9a6", 00:32:07.347 "base_bdev": "Nvme0n1", 00:32:07.347 "thin_provision": true, 00:32:07.347 "num_allocated_clusters": 0, 00:32:07.347 "snapshot": false, 00:32:07.347 "clone": false, 00:32:07.347 "esnap_clone": false 00:32:07.347 } 00:32:07.347 } 00:32:07.347 } 00:32:07.347 ] 00:32:07.347 06:48:20 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:07.347 06:48:20 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:07.347 06:48:20 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:07.606 [2024-07-25 06:48:20.982646] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:07.606 COMP_lvs0/lv0 00:32:07.606 06:48:20 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:07.606 06:48:20 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:07.606 06:48:20 compress_compdev -- common/autotest_common.sh@900 -- # 
local bdev_timeout= 00:32:07.606 06:48:20 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:07.606 06:48:20 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:07.606 06:48:20 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:07.606 06:48:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:07.865 06:48:21 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:07.865 [ 00:32:07.865 { 00:32:07.865 "name": "COMP_lvs0/lv0", 00:32:07.865 "aliases": [ 00:32:07.865 "e9d54e6a-cda4-5763-95ec-9aef4dcd95a2" 00:32:07.865 ], 00:32:07.865 "product_name": "compress", 00:32:07.865 "block_size": 4096, 00:32:07.865 "num_blocks": 25088, 00:32:07.865 "uuid": "e9d54e6a-cda4-5763-95ec-9aef4dcd95a2", 00:32:07.865 "assigned_rate_limits": { 00:32:07.865 "rw_ios_per_sec": 0, 00:32:07.865 "rw_mbytes_per_sec": 0, 00:32:07.865 "r_mbytes_per_sec": 0, 00:32:07.865 "w_mbytes_per_sec": 0 00:32:07.865 }, 00:32:07.865 "claimed": false, 00:32:07.865 "zoned": false, 00:32:07.865 "supported_io_types": { 00:32:07.865 "read": true, 00:32:07.865 "write": true, 00:32:07.865 "unmap": false, 00:32:07.865 "flush": false, 00:32:07.865 "reset": false, 00:32:07.865 "nvme_admin": false, 00:32:07.865 "nvme_io": false, 00:32:07.865 "nvme_io_md": false, 00:32:07.865 "write_zeroes": true, 00:32:07.865 "zcopy": false, 00:32:07.865 "get_zone_info": false, 00:32:07.865 "zone_management": false, 00:32:07.865 "zone_append": false, 00:32:07.865 "compare": false, 00:32:07.865 "compare_and_write": false, 00:32:07.865 "abort": false, 00:32:07.865 "seek_hole": false, 00:32:07.865 "seek_data": false, 00:32:07.865 "copy": false, 00:32:07.865 "nvme_iov_md": false 00:32:07.865 }, 00:32:07.865 "driver_specific": { 00:32:07.865 "compress": { 00:32:07.865 "name": "COMP_lvs0/lv0", 00:32:07.865 "base_bdev_name": "79d96502-3b88-4367-a318-135619c14c77", 00:32:07.865 "pm_path": "/tmp/pmem/8576f12d-80c4-4a6d-8f50-2209bde36457" 00:32:07.865 } 00:32:07.865 } 00:32:07.865 } 00:32:07.865 ] 00:32:07.865 06:48:21 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:07.865 06:48:21 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:08.124 [2024-07-25 06:48:21.508606] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff36c1b15c0 PMD being used: compress_qat 00:32:08.124 [2024-07-25 06:48:21.510646] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2133840 PMD being used: compress_qat 00:32:08.124 Running I/O for 3 seconds... 
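This third pass recreates the compress bdev with -l 4096; bdev_get_bdevs above reports 25088 blocks of 4096 bytes, which is exactly the usable capacity the 512-byte variants exposed (200704 x 512) and about 2 MiB less than the 100 MiB backing lvol, presumably space kept back by the compress vbdev for its own bookkeeping, though the trace does not say so. The arithmetic:

  echo $(( 25088 * 4096 ))    # 102760448 bytes exposed with -l 4096
  echo $(( 200704 * 512 ))    # 102760448 bytes exposed with -l 512 and with the default block size
  echo $(( 204800 * 512 ))    # 104857600 bytes (100 MiB) in the backing lvol lvs0/lv0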
00:32:11.418 00:32:11.418 Latency(us) 00:32:11.418 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:11.418 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:11.418 Verification LBA range: start 0x0 length 0x3100 00:32:11.418 COMP_lvs0/lv0 : 3.01 4036.94 15.77 0.00 0.00 7873.52 176.95 13631.49 00:32:11.418 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:11.418 Verification LBA range: start 0x3100 length 0x3100 00:32:11.418 COMP_lvs0/lv0 : 3.01 4140.28 16.17 0.00 0.00 7685.44 164.66 13212.06 00:32:11.418 =================================================================================================================== 00:32:11.418 Total : 8177.22 31.94 0.00 0.00 7778.25 164.66 13631.49 00:32:11.418 0 00:32:11.418 06:48:24 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:32:11.418 06:48:24 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:11.418 06:48:24 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:11.418 06:48:24 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:11.418 06:48:24 compress_compdev -- compress/compress.sh@78 -- # killprocess 1299318 00:32:11.418 06:48:24 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1299318 ']' 00:32:11.418 06:48:24 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1299318 00:32:11.418 06:48:24 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:32:11.418 06:48:24 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:11.418 06:48:24 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1299318 00:32:11.676 06:48:25 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:32:11.676 06:48:25 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:32:11.676 06:48:25 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1299318' 00:32:11.676 killing process with pid 1299318 00:32:11.676 06:48:25 compress_compdev -- common/autotest_common.sh@969 -- # kill 1299318 00:32:11.676 Received shutdown signal, test time was about 3.000000 seconds 00:32:11.676 00:32:11.676 Latency(us) 00:32:11.676 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:11.676 =================================================================================================================== 00:32:11.676 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:11.676 06:48:25 compress_compdev -- common/autotest_common.sh@974 -- # wait 1299318 00:32:14.210 06:48:27 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:32:14.210 06:48:27 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:14.210 06:48:27 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1301326 00:32:14.210 06:48:27 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:14.210 06:48:27 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:32:14.210 06:48:27 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 
1301326 00:32:14.210 06:48:27 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1301326 ']' 00:32:14.210 06:48:27 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:14.210 06:48:27 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:14.210 06:48:27 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:14.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:14.210 06:48:27 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:14.210 06:48:27 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:14.210 [2024-07-25 06:48:27.527224] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:32:14.210 [2024-07-25 06:48:27.527287] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1301326 ] 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3d:02.7 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 
0000:3f:01.0 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:14.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:14.210 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:14.210 [2024-07-25 06:48:27.663898] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:14.210 [2024-07-25 06:48:27.710490] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:14.210 [2024-07-25 06:48:27.710584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:14.210 [2024-07-25 06:48:27.710588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:14.779 [2024-07-25 06:48:28.315679] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:15.038 06:48:28 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:15.038 06:48:28 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:32:15.038 06:48:28 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:32:15.038 06:48:28 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:15.038 06:48:28 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:18.329 [2024-07-25 06:48:31.394887] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2572840 PMD being used: compress_qat 00:32:18.329 06:48:31 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@899 -- 
# local bdev_name=Nvme0n1 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:18.329 [ 00:32:18.329 { 00:32:18.329 "name": "Nvme0n1", 00:32:18.329 "aliases": [ 00:32:18.329 "c2e14c25-2c1e-401f-8a88-ba560478148c" 00:32:18.329 ], 00:32:18.329 "product_name": "NVMe disk", 00:32:18.329 "block_size": 512, 00:32:18.329 "num_blocks": 3907029168, 00:32:18.329 "uuid": "c2e14c25-2c1e-401f-8a88-ba560478148c", 00:32:18.329 "assigned_rate_limits": { 00:32:18.329 "rw_ios_per_sec": 0, 00:32:18.329 "rw_mbytes_per_sec": 0, 00:32:18.329 "r_mbytes_per_sec": 0, 00:32:18.329 "w_mbytes_per_sec": 0 00:32:18.329 }, 00:32:18.329 "claimed": false, 00:32:18.329 "zoned": false, 00:32:18.329 "supported_io_types": { 00:32:18.329 "read": true, 00:32:18.329 "write": true, 00:32:18.329 "unmap": true, 00:32:18.329 "flush": true, 00:32:18.329 "reset": true, 00:32:18.329 "nvme_admin": true, 00:32:18.329 "nvme_io": true, 00:32:18.329 "nvme_io_md": false, 00:32:18.329 "write_zeroes": true, 00:32:18.329 "zcopy": false, 00:32:18.329 "get_zone_info": false, 00:32:18.329 "zone_management": false, 00:32:18.329 "zone_append": false, 00:32:18.329 "compare": false, 00:32:18.329 "compare_and_write": false, 00:32:18.329 "abort": true, 00:32:18.329 "seek_hole": false, 00:32:18.329 "seek_data": false, 00:32:18.329 "copy": false, 00:32:18.329 "nvme_iov_md": false 00:32:18.329 }, 00:32:18.329 "driver_specific": { 00:32:18.329 "nvme": [ 00:32:18.329 { 00:32:18.329 "pci_address": "0000:d8:00.0", 00:32:18.329 "trid": { 00:32:18.329 "trtype": "PCIe", 00:32:18.329 "traddr": "0000:d8:00.0" 00:32:18.329 }, 00:32:18.329 "ctrlr_data": { 00:32:18.329 "cntlid": 0, 00:32:18.329 "vendor_id": "0x8086", 00:32:18.329 "model_number": "INTEL SSDPE2KX020T8", 00:32:18.329 "serial_number": "BTLJ125505KA2P0BGN", 00:32:18.329 "firmware_revision": "VDV10170", 00:32:18.329 "oacs": { 00:32:18.329 "security": 0, 00:32:18.329 "format": 1, 00:32:18.329 "firmware": 1, 00:32:18.329 "ns_manage": 1 00:32:18.329 }, 00:32:18.329 "multi_ctrlr": false, 00:32:18.329 "ana_reporting": false 00:32:18.329 }, 00:32:18.329 "vs": { 00:32:18.329 "nvme_version": "1.2" 00:32:18.329 }, 00:32:18.329 "ns_data": { 00:32:18.329 "id": 1, 00:32:18.329 "can_share": false 00:32:18.329 } 00:32:18.329 } 00:32:18.329 ], 00:32:18.329 "mp_policy": "active_passive" 00:32:18.329 } 00:32:18.329 } 00:32:18.329 ] 00:32:18.329 06:48:31 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:18.329 06:48:31 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:18.588 [2024-07-25 06:48:31.903199] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23d7510 PMD being used: compress_qat 00:32:19.524 8e976017-8e48-4553-b449-0b4787f638cc 00:32:19.524 06:48:32 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:19.783 c83520b5-e188-46b7-8f4d-cef86a460296 00:32:19.783 06:48:33 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:19.783 06:48:33 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:20.040 [ 00:32:20.040 { 00:32:20.040 "name": "c83520b5-e188-46b7-8f4d-cef86a460296", 00:32:20.040 "aliases": [ 00:32:20.040 "lvs0/lv0" 00:32:20.040 ], 00:32:20.040 "product_name": "Logical Volume", 00:32:20.040 "block_size": 512, 00:32:20.040 "num_blocks": 204800, 00:32:20.040 "uuid": "c83520b5-e188-46b7-8f4d-cef86a460296", 00:32:20.040 "assigned_rate_limits": { 00:32:20.040 "rw_ios_per_sec": 0, 00:32:20.040 "rw_mbytes_per_sec": 0, 00:32:20.040 "r_mbytes_per_sec": 0, 00:32:20.040 "w_mbytes_per_sec": 0 00:32:20.040 }, 00:32:20.040 "claimed": false, 00:32:20.040 "zoned": false, 00:32:20.040 "supported_io_types": { 00:32:20.040 "read": true, 00:32:20.040 "write": true, 00:32:20.040 "unmap": true, 00:32:20.040 "flush": false, 00:32:20.040 "reset": true, 00:32:20.040 "nvme_admin": false, 00:32:20.040 "nvme_io": false, 00:32:20.040 "nvme_io_md": false, 00:32:20.040 "write_zeroes": true, 00:32:20.040 "zcopy": false, 00:32:20.040 "get_zone_info": false, 00:32:20.040 "zone_management": false, 00:32:20.040 "zone_append": false, 00:32:20.040 "compare": false, 00:32:20.040 "compare_and_write": false, 00:32:20.040 "abort": false, 00:32:20.040 "seek_hole": true, 00:32:20.040 "seek_data": true, 00:32:20.040 "copy": false, 00:32:20.040 "nvme_iov_md": false 00:32:20.040 }, 00:32:20.040 "driver_specific": { 00:32:20.040 "lvol": { 00:32:20.040 "lvol_store_uuid": "8e976017-8e48-4553-b449-0b4787f638cc", 00:32:20.040 "base_bdev": "Nvme0n1", 00:32:20.040 "thin_provision": true, 00:32:20.040 "num_allocated_clusters": 0, 00:32:20.040 "snapshot": false, 00:32:20.040 "clone": false, 00:32:20.040 "esnap_clone": false 00:32:20.040 } 00:32:20.040 } 00:32:20.040 } 00:32:20.040 ] 00:32:20.040 06:48:33 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:20.040 06:48:33 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:20.040 06:48:33 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:20.040 [2024-07-25 06:48:33.579398] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:20.040 COMP_lvs0/lv0 00:32:20.298 06:48:33 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@900 -- # local 
bdev_timeout= 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:20.298 06:48:33 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:20.557 [ 00:32:20.557 { 00:32:20.557 "name": "COMP_lvs0/lv0", 00:32:20.557 "aliases": [ 00:32:20.557 "cd23afbb-c7d2-5961-85a6-052e654c9fb5" 00:32:20.557 ], 00:32:20.557 "product_name": "compress", 00:32:20.557 "block_size": 512, 00:32:20.557 "num_blocks": 200704, 00:32:20.557 "uuid": "cd23afbb-c7d2-5961-85a6-052e654c9fb5", 00:32:20.557 "assigned_rate_limits": { 00:32:20.557 "rw_ios_per_sec": 0, 00:32:20.557 "rw_mbytes_per_sec": 0, 00:32:20.557 "r_mbytes_per_sec": 0, 00:32:20.557 "w_mbytes_per_sec": 0 00:32:20.557 }, 00:32:20.557 "claimed": false, 00:32:20.557 "zoned": false, 00:32:20.557 "supported_io_types": { 00:32:20.557 "read": true, 00:32:20.557 "write": true, 00:32:20.557 "unmap": false, 00:32:20.557 "flush": false, 00:32:20.557 "reset": false, 00:32:20.557 "nvme_admin": false, 00:32:20.557 "nvme_io": false, 00:32:20.557 "nvme_io_md": false, 00:32:20.557 "write_zeroes": true, 00:32:20.557 "zcopy": false, 00:32:20.557 "get_zone_info": false, 00:32:20.557 "zone_management": false, 00:32:20.557 "zone_append": false, 00:32:20.557 "compare": false, 00:32:20.557 "compare_and_write": false, 00:32:20.557 "abort": false, 00:32:20.557 "seek_hole": false, 00:32:20.557 "seek_data": false, 00:32:20.557 "copy": false, 00:32:20.557 "nvme_iov_md": false 00:32:20.557 }, 00:32:20.557 "driver_specific": { 00:32:20.557 "compress": { 00:32:20.557 "name": "COMP_lvs0/lv0", 00:32:20.557 "base_bdev_name": "c83520b5-e188-46b7-8f4d-cef86a460296", 00:32:20.557 "pm_path": "/tmp/pmem/c9e007ba-b2fc-40f4-b31d-8007a1b0977c" 00:32:20.557 } 00:32:20.557 } 00:32:20.557 } 00:32:20.557 ] 00:32:20.557 06:48:33 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:20.557 06:48:33 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:20.557 [2024-07-25 06:48:33.987904] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f63681b1350 PMD being used: compress_qat 00:32:20.557 I/O targets: 00:32:20.557 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:32:20.557 00:32:20.557 00:32:20.558 CUnit - A unit testing framework for C - Version 2.1-3 00:32:20.558 http://cunit.sourceforge.net/ 00:32:20.558 00:32:20.558 00:32:20.558 Suite: bdevio tests on: COMP_lvs0/lv0 00:32:20.558 Test: blockdev write read block ...passed 00:32:20.558 Test: blockdev write zeroes read block ...passed 00:32:20.558 Test: blockdev write zeroes read no split ...passed 00:32:20.558 Test: blockdev write zeroes read split ...passed 00:32:20.558 Test: blockdev write zeroes read split partial ...passed 00:32:20.558 Test: blockdev reset ...[2024-07-25 06:48:34.046988] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:32:20.558 passed 00:32:20.558 Test: blockdev write read 8 blocks ...passed 00:32:20.558 Test: blockdev write read size > 128k ...passed 00:32:20.558 Test: blockdev write read invalid 
size ...passed 00:32:20.558 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:20.558 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:20.558 Test: blockdev write read max offset ...passed 00:32:20.558 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:20.558 Test: blockdev writev readv 8 blocks ...passed 00:32:20.558 Test: blockdev writev readv 30 x 1block ...passed 00:32:20.558 Test: blockdev writev readv block ...passed 00:32:20.558 Test: blockdev writev readv size > 128k ...passed 00:32:20.558 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:20.558 Test: blockdev comparev and writev ...passed 00:32:20.558 Test: blockdev nvme passthru rw ...passed 00:32:20.558 Test: blockdev nvme passthru vendor specific ...passed 00:32:20.558 Test: blockdev nvme admin passthru ...passed 00:32:20.558 Test: blockdev copy ...passed 00:32:20.558 00:32:20.558 Run Summary: Type Total Ran Passed Failed Inactive 00:32:20.558 suites 1 1 n/a 0 0 00:32:20.558 tests 23 23 23 0 0 00:32:20.558 asserts 130 130 130 0 n/a 00:32:20.558 00:32:20.558 Elapsed time = 0.196 seconds 00:32:20.558 0 00:32:20.558 06:48:34 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:32:20.558 06:48:34 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:20.819 06:48:34 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:21.115 06:48:34 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:32:21.115 06:48:34 compress_compdev -- compress/compress.sh@62 -- # killprocess 1301326 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1301326 ']' 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1301326 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1301326 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1301326' 00:32:21.115 killing process with pid 1301326 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@969 -- # kill 1301326 00:32:21.115 06:48:34 compress_compdev -- common/autotest_common.sh@974 -- # wait 1301326 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@91 -- # '[' 1 -eq 1 ']' 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@92 -- # run_bdevperf 64 16384 30 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1302942 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 64 -o 16384 -w verify -t 30 -C -m 0x6 -c 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:32:23.651 06:48:37 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1302942 00:32:23.651 06:48:37 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1302942 ']' 00:32:23.651 06:48:37 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:23.651 06:48:37 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:23.651 06:48:37 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:23.651 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:23.651 06:48:37 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:23.651 06:48:37 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:23.651 [2024-07-25 06:48:37.169259] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:32:23.651 [2024-07-25 06:48:37.169309] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1302942 ] 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.0 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.1 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.2 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.3 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.4 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.5 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.6 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:01.7 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.0 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.1 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.2 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.3 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.4 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.5 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3d:02.6 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: 
Requested device 0000:3d:02.7 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.0 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.1 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.2 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.3 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.4 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.5 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.6 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:01.7 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.0 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.1 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.2 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.3 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.4 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.5 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.6 cannot be used 00:32:23.910 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:32:23.910 EAL: Requested device 0000:3f:02.7 cannot be used 00:32:23.910 [2024-07-25 06:48:37.277201] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:23.910 [2024-07-25 06:48:37.322701] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:23.910 [2024-07-25 06:48:37.322706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:24.477 [2024-07-25 06:48:37.921221] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:24.477 06:48:38 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:24.477 06:48:38 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:32:24.477 06:48:38 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:32:24.477 06:48:38 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:24.477 06:48:38 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:27.767 [2024-07-25 06:48:41.103703] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2870c80 PMD being used: compress_qat 00:32:27.767 06:48:41 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:27.767 06:48:41 
compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:32:27.767 06:48:41 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:27.767 06:48:41 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:27.767 06:48:41 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:27.767 06:48:41 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:27.767 06:48:41 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:28.026 06:48:41 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:28.026 [ 00:32:28.026 { 00:32:28.026 "name": "Nvme0n1", 00:32:28.026 "aliases": [ 00:32:28.026 "c2a5bfd2-3998-4ed8-9527-e1a1e87125a5" 00:32:28.026 ], 00:32:28.026 "product_name": "NVMe disk", 00:32:28.026 "block_size": 512, 00:32:28.026 "num_blocks": 3907029168, 00:32:28.026 "uuid": "c2a5bfd2-3998-4ed8-9527-e1a1e87125a5", 00:32:28.026 "assigned_rate_limits": { 00:32:28.026 "rw_ios_per_sec": 0, 00:32:28.026 "rw_mbytes_per_sec": 0, 00:32:28.026 "r_mbytes_per_sec": 0, 00:32:28.026 "w_mbytes_per_sec": 0 00:32:28.026 }, 00:32:28.026 "claimed": false, 00:32:28.026 "zoned": false, 00:32:28.026 "supported_io_types": { 00:32:28.026 "read": true, 00:32:28.026 "write": true, 00:32:28.026 "unmap": true, 00:32:28.026 "flush": true, 00:32:28.026 "reset": true, 00:32:28.026 "nvme_admin": true, 00:32:28.026 "nvme_io": true, 00:32:28.026 "nvme_io_md": false, 00:32:28.026 "write_zeroes": true, 00:32:28.026 "zcopy": false, 00:32:28.026 "get_zone_info": false, 00:32:28.026 "zone_management": false, 00:32:28.026 "zone_append": false, 00:32:28.026 "compare": false, 00:32:28.026 "compare_and_write": false, 00:32:28.026 "abort": true, 00:32:28.026 "seek_hole": false, 00:32:28.026 "seek_data": false, 00:32:28.026 "copy": false, 00:32:28.026 "nvme_iov_md": false 00:32:28.026 }, 00:32:28.026 "driver_specific": { 00:32:28.026 "nvme": [ 00:32:28.026 { 00:32:28.026 "pci_address": "0000:d8:00.0", 00:32:28.026 "trid": { 00:32:28.026 "trtype": "PCIe", 00:32:28.026 "traddr": "0000:d8:00.0" 00:32:28.026 }, 00:32:28.026 "ctrlr_data": { 00:32:28.026 "cntlid": 0, 00:32:28.026 "vendor_id": "0x8086", 00:32:28.026 "model_number": "INTEL SSDPE2KX020T8", 00:32:28.026 "serial_number": "BTLJ125505KA2P0BGN", 00:32:28.026 "firmware_revision": "VDV10170", 00:32:28.026 "oacs": { 00:32:28.026 "security": 0, 00:32:28.026 "format": 1, 00:32:28.026 "firmware": 1, 00:32:28.026 "ns_manage": 1 00:32:28.026 }, 00:32:28.026 "multi_ctrlr": false, 00:32:28.026 "ana_reporting": false 00:32:28.026 }, 00:32:28.026 "vs": { 00:32:28.026 "nvme_version": "1.2" 00:32:28.026 }, 00:32:28.026 "ns_data": { 00:32:28.026 "id": 1, 00:32:28.026 "can_share": false 00:32:28.026 } 00:32:28.026 } 00:32:28.026 ], 00:32:28.026 "mp_policy": "active_passive" 00:32:28.026 } 00:32:28.026 } 00:32:28.026 ] 00:32:28.285 06:48:41 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:28.285 06:48:41 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:28.285 [2024-07-25 06:48:41.736436] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26d5950 PMD being used: compress_qat 00:32:29.223 d287d9a9-9f23-4f16-9ebe-295dde1ef55e 00:32:29.223 06:48:42 compress_compdev 
-- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:29.482 95df737d-6928-4bd0-8d29-021b6463f5c2 00:32:29.482 06:48:42 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:29.482 06:48:42 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:29.482 06:48:42 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:29.482 06:48:42 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:29.482 06:48:42 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:29.482 06:48:42 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:29.482 06:48:42 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:29.742 06:48:43 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:30.000 [ 00:32:30.000 { 00:32:30.000 "name": "95df737d-6928-4bd0-8d29-021b6463f5c2", 00:32:30.000 "aliases": [ 00:32:30.000 "lvs0/lv0" 00:32:30.000 ], 00:32:30.000 "product_name": "Logical Volume", 00:32:30.000 "block_size": 512, 00:32:30.000 "num_blocks": 204800, 00:32:30.000 "uuid": "95df737d-6928-4bd0-8d29-021b6463f5c2", 00:32:30.000 "assigned_rate_limits": { 00:32:30.000 "rw_ios_per_sec": 0, 00:32:30.000 "rw_mbytes_per_sec": 0, 00:32:30.000 "r_mbytes_per_sec": 0, 00:32:30.000 "w_mbytes_per_sec": 0 00:32:30.000 }, 00:32:30.000 "claimed": false, 00:32:30.000 "zoned": false, 00:32:30.000 "supported_io_types": { 00:32:30.000 "read": true, 00:32:30.000 "write": true, 00:32:30.000 "unmap": true, 00:32:30.000 "flush": false, 00:32:30.000 "reset": true, 00:32:30.000 "nvme_admin": false, 00:32:30.000 "nvme_io": false, 00:32:30.000 "nvme_io_md": false, 00:32:30.000 "write_zeroes": true, 00:32:30.000 "zcopy": false, 00:32:30.000 "get_zone_info": false, 00:32:30.000 "zone_management": false, 00:32:30.000 "zone_append": false, 00:32:30.000 "compare": false, 00:32:30.000 "compare_and_write": false, 00:32:30.000 "abort": false, 00:32:30.000 "seek_hole": true, 00:32:30.000 "seek_data": true, 00:32:30.000 "copy": false, 00:32:30.000 "nvme_iov_md": false 00:32:30.000 }, 00:32:30.000 "driver_specific": { 00:32:30.000 "lvol": { 00:32:30.000 "lvol_store_uuid": "d287d9a9-9f23-4f16-9ebe-295dde1ef55e", 00:32:30.000 "base_bdev": "Nvme0n1", 00:32:30.000 "thin_provision": true, 00:32:30.000 "num_allocated_clusters": 0, 00:32:30.000 "snapshot": false, 00:32:30.000 "clone": false, 00:32:30.000 "esnap_clone": false 00:32:30.000 } 00:32:30.000 } 00:32:30.000 } 00:32:30.000 ] 00:32:30.000 06:48:43 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:30.000 06:48:43 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:30.000 06:48:43 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:30.000 [2024-07-25 06:48:43.478428] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:30.000 COMP_lvs0/lv0 00:32:30.000 06:48:43 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:30.000 06:48:43 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:30.000 06:48:43 compress_compdev -- 
common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:30.000 06:48:43 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:32:30.000 06:48:43 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:30.000 06:48:43 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:30.000 06:48:43 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:30.259 06:48:43 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:30.259 [ 00:32:30.259 { 00:32:30.259 "name": "COMP_lvs0/lv0", 00:32:30.259 "aliases": [ 00:32:30.259 "5d0aa17d-e63f-5272-9498-a09a87ba50d5" 00:32:30.259 ], 00:32:30.259 "product_name": "compress", 00:32:30.259 "block_size": 512, 00:32:30.259 "num_blocks": 200704, 00:32:30.259 "uuid": "5d0aa17d-e63f-5272-9498-a09a87ba50d5", 00:32:30.259 "assigned_rate_limits": { 00:32:30.259 "rw_ios_per_sec": 0, 00:32:30.259 "rw_mbytes_per_sec": 0, 00:32:30.259 "r_mbytes_per_sec": 0, 00:32:30.259 "w_mbytes_per_sec": 0 00:32:30.259 }, 00:32:30.259 "claimed": false, 00:32:30.259 "zoned": false, 00:32:30.259 "supported_io_types": { 00:32:30.259 "read": true, 00:32:30.259 "write": true, 00:32:30.259 "unmap": false, 00:32:30.259 "flush": false, 00:32:30.259 "reset": false, 00:32:30.259 "nvme_admin": false, 00:32:30.259 "nvme_io": false, 00:32:30.259 "nvme_io_md": false, 00:32:30.259 "write_zeroes": true, 00:32:30.259 "zcopy": false, 00:32:30.259 "get_zone_info": false, 00:32:30.259 "zone_management": false, 00:32:30.259 "zone_append": false, 00:32:30.259 "compare": false, 00:32:30.259 "compare_and_write": false, 00:32:30.259 "abort": false, 00:32:30.259 "seek_hole": false, 00:32:30.259 "seek_data": false, 00:32:30.259 "copy": false, 00:32:30.259 "nvme_iov_md": false 00:32:30.259 }, 00:32:30.259 "driver_specific": { 00:32:30.259 "compress": { 00:32:30.259 "name": "COMP_lvs0/lv0", 00:32:30.259 "base_bdev_name": "95df737d-6928-4bd0-8d29-021b6463f5c2", 00:32:30.259 "pm_path": "/tmp/pmem/5e733198-1259-4c89-9635-9a198a414fc7" 00:32:30.259 } 00:32:30.259 } 00:32:30.259 } 00:32:30.259 ] 00:32:30.518 06:48:43 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:32:30.518 06:48:43 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:30.518 [2024-07-25 06:48:43.900041] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f83241b15c0 PMD being used: compress_qat 00:32:30.518 [2024-07-25 06:48:43.902160] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x274cce0 PMD being used: compress_qat 00:32:30.518 Running I/O for 30 seconds... 
00:33:02.635 00:33:02.635 Latency(us) 00:33:02.635 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:02.635 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 64, IO size: 16384) 00:33:02.635 Verification LBA range: start 0x0 length 0xc40 00:33:02.635 COMP_lvs0/lv0 : 30.01 1745.17 27.27 0.00 0.00 36414.95 452.20 33973.86 00:33:02.635 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 64, IO size: 16384) 00:33:02.635 Verification LBA range: start 0xc40 length 0xc40 00:33:02.635 COMP_lvs0/lv0 : 30.00 5482.26 85.66 0.00 0.00 11560.37 114.69 19713.23 00:33:02.635 =================================================================================================================== 00:33:02.635 Total : 7227.43 112.93 0.00 0.00 17562.12 114.69 33973.86 00:33:02.635 0 00:33:02.635 06:49:13 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:33:02.635 06:49:13 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:33:02.635 06:49:14 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:33:02.635 06:49:14 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:33:02.635 06:49:14 compress_compdev -- compress/compress.sh@78 -- # killprocess 1302942 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1302942 ']' 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1302942 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1302942 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1302942' 00:33:02.635 killing process with pid 1302942 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@969 -- # kill 1302942 00:33:02.635 Received shutdown signal, test time was about 30.000000 seconds 00:33:02.635 00:33:02.635 Latency(us) 00:33:02.635 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:02.635 =================================================================================================================== 00:33:02.635 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:02.635 06:49:14 compress_compdev -- common/autotest_common.sh@974 -- # wait 1302942 00:33:03.570 06:49:16 compress_compdev -- compress/compress.sh@95 -- # export TEST_TRANSPORT=tcp 00:33:03.570 06:49:16 compress_compdev -- compress/compress.sh@95 -- # TEST_TRANSPORT=tcp 00:33:03.570 06:49:16 compress_compdev -- compress/compress.sh@96 -- # NET_TYPE=virt 00:33:03.570 06:49:16 compress_compdev -- compress/compress.sh@96 -- # nvmftestinit 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@448 -- # prepare_net_devs 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@410 -- # local -g is_hw=no 00:33:03.570 
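(For readability: the create_vols/destroy_vols helpers traced throughout the runs above reduce to a short RPC sequence. This is a sketch reconstructed from the commands visible in this log, not the exact compress.sh wrapper code; create_vols first attaches the NVMe controller via gen_nvme.sh / load_subsystem_config, as traced above.)
  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # create_vols: lvstore plus a thin 100 MiB lvol on Nvme0n1, then a compress bdev backed by /tmp/pmem
  $RPC bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  $RPC bdev_lvol_create -t -l lvs0 lv0 100
  $RPC bdev_compress_create -b lvs0/lv0 -p /tmp/pmem   # the first run above also passed "-l 4096" for a 4 KiB logical block size
  # destroy_vols: tear down in reverse order
  $RPC bdev_compress_delete COMP_lvs0/lv0
  $RPC bdev_lvol_delete_lvstore -l lvs0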
06:49:16 compress_compdev -- nvmf/common.sh@412 -- # remove_spdk_ns 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:03.570 06:49:16 compress_compdev -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:03.570 06:49:16 compress_compdev -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@414 -- # [[ virt != virt ]] 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@423 -- # [[ virt == phy ]] 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@426 -- # [[ virt == phy-fallback ]] 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@432 -- # nvmf_veth_init 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:33:03.570 06:49:16 compress_compdev -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:33:03.571 Cannot find device "nvmf_tgt_br" 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@155 -- # true 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:33:03.571 Cannot find device "nvmf_tgt_br2" 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@156 -- # true 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:33:03.571 Cannot find device "nvmf_tgt_br" 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@158 -- # true 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:33:03.571 Cannot find device "nvmf_tgt_br2" 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@159 -- # true 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:33:03.571 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or 
directory 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@162 -- # true 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:33:03.571 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@163 -- # true 00:33:03.571 06:49:16 compress_compdev -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:33:03.571 06:49:17 compress_compdev -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:33:03.828 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:33:03.828 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.102 ms 00:33:03.828 00:33:03.828 --- 10.0.0.2 ping statistics --- 00:33:03.828 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:03.828 rtt min/avg/max/mdev = 0.102/0.102/0.102/0.000 ms 00:33:03.828 06:49:17 compress_compdev -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:33:03.828 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:33:03.828 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.064 ms 00:33:03.828 00:33:03.828 --- 10.0.0.3 ping statistics --- 00:33:03.829 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:03.829 rtt min/avg/max/mdev = 0.064/0.064/0.064/0.000 ms 00:33:03.829 06:49:17 compress_compdev -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:33:04.117 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:33:04.117 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.060 ms 00:33:04.117 00:33:04.117 --- 10.0.0.1 ping statistics --- 00:33:04.117 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:04.117 rtt min/avg/max/mdev = 0.060/0.060/0.060/0.000 ms 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@433 -- # return 0 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:33:04.117 06:49:17 compress_compdev -- compress/compress.sh@97 -- # nvmfappstart -m 0x7 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@724 -- # xtrace_disable 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@481 -- # nvmfpid=1309874 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@482 -- # waitforlisten 1309874 00:33:04.117 06:49:17 compress_compdev -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1309874 ']' 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:04.117 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:04.117 06:49:17 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:33:04.117 [2024-07-25 06:49:17.506204] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:33:04.117 [2024-07-25 06:49:17.506264] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:04.117 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:04.117 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:04.117 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:04.117 [2024-07-25 06:49:17.653061] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:04.376 [2024-07-25 06:49:17.700461] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:33:04.376 [2024-07-25 06:49:17.700508] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:33:04.376 [2024-07-25 06:49:17.700528] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:33:04.376 [2024-07-25 06:49:17.700546] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:33:04.376 [2024-07-25 06:49:17.700561] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
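For reference, the target start-up captured above reduces to launching nvmf_tgt inside the test network namespace and then waiting for its RPC socket; a minimal sketch (only the nvmf_tgt invocation is taken from this run, the polling loop is illustrative):

  # launch the NVMe-oF target in the namespace prepared by nvmf/common.sh
  sudo ip netns exec nvmf_tgt_ns_spdk \
      /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 &
  nvmfpid=$!

  # block until the app answers on /var/tmp/spdk.sock before issuing further RPCs
  until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -t 1 -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null; do
      sleep 0.5
  done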
00:33:04.376 [2024-07-25 06:49:17.700618] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:04.376 [2024-07-25 06:49:17.700644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:04.376 [2024-07-25 06:49:17.700651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:04.941 06:49:18 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:04.942 06:49:18 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:33:04.942 06:49:18 compress_compdev -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:33:04.942 06:49:18 compress_compdev -- common/autotest_common.sh@730 -- # xtrace_disable 00:33:04.942 06:49:18 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:33:04.942 06:49:18 compress_compdev -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:33:04.942 06:49:18 compress_compdev -- compress/compress.sh@98 -- # trap 'nvmftestfini; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:04.942 06:49:18 compress_compdev -- compress/compress.sh@101 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t tcp -u 8192 00:33:05.200 [2024-07-25 06:49:18.559764] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:33:05.200 06:49:18 compress_compdev -- compress/compress.sh@102 -- # create_vols 00:33:05.200 06:49:18 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:05.200 06:49:18 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:33:08.500 06:49:21 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:08.500 06:49:21 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:33:08.759 [ 00:33:08.759 { 00:33:08.759 "name": "Nvme0n1", 00:33:08.759 "aliases": [ 00:33:08.759 "bbc9df69-29cf-4e0e-8887-90252392049b" 00:33:08.759 ], 00:33:08.759 "product_name": "NVMe disk", 00:33:08.759 "block_size": 512, 00:33:08.759 "num_blocks": 3907029168, 00:33:08.759 "uuid": "bbc9df69-29cf-4e0e-8887-90252392049b", 00:33:08.759 "assigned_rate_limits": { 00:33:08.759 "rw_ios_per_sec": 0, 00:33:08.759 "rw_mbytes_per_sec": 0, 00:33:08.759 "r_mbytes_per_sec": 0, 00:33:08.759 "w_mbytes_per_sec": 0 00:33:08.759 }, 00:33:08.759 "claimed": false, 00:33:08.759 "zoned": false, 00:33:08.759 "supported_io_types": { 00:33:08.759 "read": true, 00:33:08.759 "write": true, 00:33:08.759 "unmap": true, 00:33:08.759 "flush": true, 00:33:08.759 "reset": true, 00:33:08.759 "nvme_admin": true, 00:33:08.759 "nvme_io": true, 00:33:08.759 "nvme_io_md": false, 00:33:08.759 "write_zeroes": true, 00:33:08.759 "zcopy": false, 00:33:08.759 "get_zone_info": false, 
00:33:08.759 "zone_management": false, 00:33:08.759 "zone_append": false, 00:33:08.759 "compare": false, 00:33:08.759 "compare_and_write": false, 00:33:08.759 "abort": true, 00:33:08.759 "seek_hole": false, 00:33:08.759 "seek_data": false, 00:33:08.759 "copy": false, 00:33:08.759 "nvme_iov_md": false 00:33:08.759 }, 00:33:08.759 "driver_specific": { 00:33:08.759 "nvme": [ 00:33:08.759 { 00:33:08.759 "pci_address": "0000:d8:00.0", 00:33:08.759 "trid": { 00:33:08.759 "trtype": "PCIe", 00:33:08.759 "traddr": "0000:d8:00.0" 00:33:08.759 }, 00:33:08.759 "ctrlr_data": { 00:33:08.759 "cntlid": 0, 00:33:08.759 "vendor_id": "0x8086", 00:33:08.759 "model_number": "INTEL SSDPE2KX020T8", 00:33:08.759 "serial_number": "BTLJ125505KA2P0BGN", 00:33:08.759 "firmware_revision": "VDV10170", 00:33:08.759 "oacs": { 00:33:08.759 "security": 0, 00:33:08.759 "format": 1, 00:33:08.759 "firmware": 1, 00:33:08.759 "ns_manage": 1 00:33:08.759 }, 00:33:08.759 "multi_ctrlr": false, 00:33:08.759 "ana_reporting": false 00:33:08.759 }, 00:33:08.759 "vs": { 00:33:08.759 "nvme_version": "1.2" 00:33:08.759 }, 00:33:08.759 "ns_data": { 00:33:08.759 "id": 1, 00:33:08.759 "can_share": false 00:33:08.759 } 00:33:08.759 } 00:33:08.759 ], 00:33:08.759 "mp_policy": "active_passive" 00:33:08.759 } 00:33:08.759 } 00:33:08.759 ] 00:33:08.759 06:49:22 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:33:08.759 06:49:22 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:33:10.135 60599276-a0ed-4db7-ac93-0f9985edcfa9 00:33:10.135 06:49:23 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:33:10.135 1853e078-d214-4064-b99a-60ddc7a13b6b 00:33:10.135 06:49:23 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:33:10.135 06:49:23 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:33:10.135 06:49:23 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:10.135 06:49:23 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:33:10.135 06:49:23 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:10.135 06:49:23 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:10.135 06:49:23 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:10.393 06:49:23 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:33:10.393 [ 00:33:10.393 { 00:33:10.393 "name": "1853e078-d214-4064-b99a-60ddc7a13b6b", 00:33:10.393 "aliases": [ 00:33:10.393 "lvs0/lv0" 00:33:10.393 ], 00:33:10.393 "product_name": "Logical Volume", 00:33:10.393 "block_size": 512, 00:33:10.393 "num_blocks": 204800, 00:33:10.393 "uuid": "1853e078-d214-4064-b99a-60ddc7a13b6b", 00:33:10.393 "assigned_rate_limits": { 00:33:10.393 "rw_ios_per_sec": 0, 00:33:10.393 "rw_mbytes_per_sec": 0, 00:33:10.393 "r_mbytes_per_sec": 0, 00:33:10.393 "w_mbytes_per_sec": 0 00:33:10.393 }, 00:33:10.393 "claimed": false, 00:33:10.393 "zoned": false, 00:33:10.393 "supported_io_types": { 00:33:10.393 "read": true, 00:33:10.393 "write": true, 00:33:10.393 "unmap": true, 00:33:10.393 "flush": false, 00:33:10.393 "reset": true, 00:33:10.393 "nvme_admin": false, 
00:33:10.393 "nvme_io": false, 00:33:10.393 "nvme_io_md": false, 00:33:10.393 "write_zeroes": true, 00:33:10.393 "zcopy": false, 00:33:10.393 "get_zone_info": false, 00:33:10.393 "zone_management": false, 00:33:10.393 "zone_append": false, 00:33:10.393 "compare": false, 00:33:10.393 "compare_and_write": false, 00:33:10.393 "abort": false, 00:33:10.394 "seek_hole": true, 00:33:10.394 "seek_data": true, 00:33:10.394 "copy": false, 00:33:10.394 "nvme_iov_md": false 00:33:10.394 }, 00:33:10.394 "driver_specific": { 00:33:10.394 "lvol": { 00:33:10.394 "lvol_store_uuid": "60599276-a0ed-4db7-ac93-0f9985edcfa9", 00:33:10.394 "base_bdev": "Nvme0n1", 00:33:10.394 "thin_provision": true, 00:33:10.394 "num_allocated_clusters": 0, 00:33:10.394 "snapshot": false, 00:33:10.394 "clone": false, 00:33:10.394 "esnap_clone": false 00:33:10.394 } 00:33:10.394 } 00:33:10.394 } 00:33:10.394 ] 00:33:10.394 06:49:23 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:33:10.394 06:49:23 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:33:10.394 06:49:23 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:33:10.653 [2024-07-25 06:49:24.090947] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:33:10.653 COMP_lvs0/lv0 00:33:10.653 06:49:24 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:33:10.653 06:49:24 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:33:10.653 06:49:24 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:10.653 06:49:24 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:33:10.653 06:49:24 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:10.653 06:49:24 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:10.653 06:49:24 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:10.912 06:49:24 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:33:10.912 [ 00:33:10.912 { 00:33:10.912 "name": "COMP_lvs0/lv0", 00:33:10.912 "aliases": [ 00:33:10.912 "3d178462-5125-5879-9d0b-33cddd0ac455" 00:33:10.912 ], 00:33:10.912 "product_name": "compress", 00:33:10.912 "block_size": 512, 00:33:10.912 "num_blocks": 200704, 00:33:10.912 "uuid": "3d178462-5125-5879-9d0b-33cddd0ac455", 00:33:10.912 "assigned_rate_limits": { 00:33:10.912 "rw_ios_per_sec": 0, 00:33:10.912 "rw_mbytes_per_sec": 0, 00:33:10.912 "r_mbytes_per_sec": 0, 00:33:10.912 "w_mbytes_per_sec": 0 00:33:10.912 }, 00:33:10.912 "claimed": false, 00:33:10.912 "zoned": false, 00:33:10.912 "supported_io_types": { 00:33:10.912 "read": true, 00:33:10.912 "write": true, 00:33:10.912 "unmap": false, 00:33:10.912 "flush": false, 00:33:10.912 "reset": false, 00:33:10.912 "nvme_admin": false, 00:33:10.912 "nvme_io": false, 00:33:10.912 "nvme_io_md": false, 00:33:10.912 "write_zeroes": true, 00:33:10.912 "zcopy": false, 00:33:10.912 "get_zone_info": false, 00:33:10.912 "zone_management": false, 00:33:10.912 "zone_append": false, 00:33:10.912 "compare": false, 00:33:10.912 "compare_and_write": false, 00:33:10.912 "abort": false, 00:33:10.912 "seek_hole": false, 00:33:10.912 "seek_data": false, 00:33:10.912 "copy": 
false, 00:33:10.912 "nvme_iov_md": false 00:33:10.912 }, 00:33:10.912 "driver_specific": { 00:33:10.912 "compress": { 00:33:10.912 "name": "COMP_lvs0/lv0", 00:33:10.912 "base_bdev_name": "1853e078-d214-4064-b99a-60ddc7a13b6b", 00:33:10.912 "pm_path": "/tmp/pmem/d6c2da4d-5fd1-4e12-9050-01b89dfb7b2c" 00:33:10.912 } 00:33:10.912 } 00:33:10.912 } 00:33:10.912 ] 00:33:10.912 06:49:24 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:33:10.912 06:49:24 compress_compdev -- compress/compress.sh@103 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:33:11.171 06:49:24 compress_compdev -- compress/compress.sh@104 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0 00:33:11.428 06:49:24 compress_compdev -- compress/compress.sh@105 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420 00:33:11.687 [2024-07-25 06:49:24.995183] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:33:11.687 06:49:25 compress_compdev -- compress/compress.sh@109 -- # perf_pid=1311190 00:33:11.687 06:49:25 compress_compdev -- compress/compress.sh@108 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 64 -s 512 -w randrw -t 30 -c 0x18 -M 50 00:33:11.687 06:49:25 compress_compdev -- compress/compress.sh@112 -- # trap 'killprocess $perf_pid; compress_err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:11.687 06:49:25 compress_compdev -- compress/compress.sh@113 -- # wait 1311190 00:33:11.945 [2024-07-25 06:49:25.254941] subsystem.c:1572:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on TCP/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:33:44.013 Initializing NVMe Controllers 00:33:44.013 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:33:44.013 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:33:44.013 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:33:44.013 Initialization complete. Launching workers. 
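Condensed, the export-and-measure step above is three RPC calls followed by the perf run; the commands are copied from this log, while the inline flag notes are added here and are not part of the test script:

  rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  $rpc_py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0        # -a: allow any host, serial SPDK0
  $rpc_py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0      # compress bdev becomes NSID 1
  $rpc_py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420

  # 30 s of 4 KiB random I/O, 50% reads (-M 50), queue depth 64, initiator pinned to cores 3 and 4 (0x18)
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf \
      -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' \
      -o 4096 -q 64 -s 512 -w randrw -M 50 -t 30 -c 0x18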
00:33:44.013 ========================================================
00:33:44.013 Latency(us)
00:33:44.013 Device Information : IOPS MiB/s Average min max
00:33:44.013 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 5020.03 19.61 12750.40 1027.85 30764.38
00:33:44.013 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 3151.33 12.31 20311.37 2400.05 37035.16
00:33:44.013 ========================================================
00:33:44.013 Total : 8171.37 31.92 15666.33 1027.85 37035.16
00:33:44.013
00:33:44.013 06:49:55 compress_compdev -- compress/compress.sh@114 -- # destroy_vols
00:33:44.013 06:49:55 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:33:44.013 06:49:55 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:33:44.013 06:49:55 compress_compdev -- compress/compress.sh@116 -- # trap - SIGINT SIGTERM EXIT
00:33:44.013 06:49:55 compress_compdev -- compress/compress.sh@117 -- # nvmftestfini
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@488 -- # nvmfcleanup
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@117 -- # sync
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@119 -- # '[' tcp == tcp ']'
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@120 -- # set +e
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@121 -- # for i in {1..20}
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp
00:33:44.013 rmmod nvme_tcp
00:33:44.013 rmmod nvme_fabrics
00:33:44.013 rmmod nvme_keyring
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@124 -- # set -e
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@125 -- # return 0
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@489 -- # '[' -n 1309874 ']'
00:33:44.013 06:49:55 compress_compdev -- nvmf/common.sh@490 -- # killprocess 1309874
00:33:44.013 06:49:55 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1309874 ']'
00:33:44.013 06:49:55 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1309874
00:33:44.013 06:49:55 compress_compdev -- common/autotest_common.sh@955 -- # uname
00:33:44.013 06:49:55 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:33:44.013 06:49:55 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1309874
00:33:44.014 06:49:55 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:33:44.014 06:49:55 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:33:44.014 06:49:55 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1309874'
00:33:44.014 killing process with pid 1309874
00:33:44.014 06:49:55 compress_compdev -- common/autotest_common.sh@969 -- # kill 1309874
00:33:44.014 06:49:55 compress_compdev -- common/autotest_common.sh@974 -- # wait 1309874
00:33:44.950 06:49:58 compress_compdev -- nvmf/common.sh@492 -- # '[' '' == iso ']'
00:33:44.950 06:49:58 compress_compdev -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]]
00:33:44.950 06:49:58 compress_compdev -- nvmf/common.sh@496 -- # nvmf_tcp_fini
00:33:44.950 06:49:58 compress_compdev -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]]
00:33:44.950
06:49:58 compress_compdev -- nvmf/common.sh@278 -- # remove_spdk_ns 00:33:44.950 06:49:58 compress_compdev -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:44.950 06:49:58 compress_compdev -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:44.950 06:49:58 compress_compdev -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:44.950 06:49:58 compress_compdev -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:33:44.950 06:49:58 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:33:44.950 00:33:44.950 real 2m9.532s 00:33:44.950 user 5m54.353s 00:33:44.950 sys 0m21.350s 00:33:44.950 06:49:58 compress_compdev -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:44.950 06:49:58 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:33:44.950 ************************************ 00:33:44.950 END TEST compress_compdev 00:33:44.950 ************************************ 00:33:44.950 06:49:58 -- spdk/autotest.sh@353 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:33:44.950 06:49:58 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:33:44.950 06:49:58 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:44.950 06:49:58 -- common/autotest_common.sh@10 -- # set +x 00:33:44.950 ************************************ 00:33:44.950 START TEST compress_isal 00:33:44.950 ************************************ 00:33:44.950 06:49:58 compress_isal -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:33:44.950 * Looking for test storage... 00:33:44.950 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:33:44.950 06:49:58 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:33:44.950 06:49:58 compress_isal -- nvmf/common.sh@45 -- 
# source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:44.950 06:49:58 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:33:44.950 06:49:58 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:44.950 06:49:58 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:44.950 06:49:58 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:44.950 06:49:58 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:44.951 06:49:58 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:44.951 06:49:58 compress_isal -- paths/export.sh@5 -- # export PATH 00:33:44.951 06:49:58 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@47 -- # : 0 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:33:44.951 06:49:58 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:33:44.951 06:49:58 
compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1316553 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1316553 00:33:44.951 06:49:58 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1316553 ']' 00:33:44.951 06:49:58 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:33:44.951 06:49:58 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:44.951 06:49:58 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:44.951 06:49:58 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:44.951 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:44.951 06:49:58 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:44.951 06:49:58 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:45.210 [2024-07-25 06:49:58.530850] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:33:45.210 [2024-07-25 06:49:58.530915] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316553 ] 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: 
Requested device 0000:3d:02.3 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:45.210 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.210 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:45.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.211 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:45.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.211 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:45.211 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:45.211 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:45.211 [2024-07-25 06:49:58.656238] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:45.211 [2024-07-25 06:49:58.702054] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:45.211 [2024-07-25 06:49:58.702059] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:46.146 06:49:59 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:46.146 06:49:59 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:33:46.146 06:49:59 compress_isal -- compress/compress.sh@74 -- # create_vols 00:33:46.146 06:49:59 compress_isal -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:46.146 06:49:59 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:33:49.486 06:50:02 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@901 -- # local i 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:49.486 06:50:02 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:33:49.486 [ 00:33:49.486 { 00:33:49.486 "name": "Nvme0n1", 00:33:49.486 "aliases": [ 00:33:49.486 "32c52830-04fd-41e1-8277-670786f1862a" 00:33:49.486 ], 00:33:49.486 "product_name": "NVMe disk", 00:33:49.486 "block_size": 512, 00:33:49.486 "num_blocks": 3907029168, 00:33:49.486 "uuid": "32c52830-04fd-41e1-8277-670786f1862a", 00:33:49.486 "assigned_rate_limits": { 00:33:49.486 "rw_ios_per_sec": 0, 00:33:49.486 "rw_mbytes_per_sec": 0, 00:33:49.486 "r_mbytes_per_sec": 0, 00:33:49.486 "w_mbytes_per_sec": 0 00:33:49.486 }, 00:33:49.486 "claimed": false, 00:33:49.486 "zoned": false, 00:33:49.486 "supported_io_types": { 00:33:49.486 "read": true, 00:33:49.486 "write": true, 00:33:49.486 "unmap": true, 00:33:49.486 "flush": true, 00:33:49.486 "reset": true, 00:33:49.486 "nvme_admin": true, 00:33:49.486 "nvme_io": true, 00:33:49.486 "nvme_io_md": false, 00:33:49.486 "write_zeroes": true, 00:33:49.486 "zcopy": false, 00:33:49.486 "get_zone_info": false, 00:33:49.486 "zone_management": false, 00:33:49.486 "zone_append": false, 00:33:49.486 "compare": false, 00:33:49.486 "compare_and_write": false, 00:33:49.486 "abort": true, 00:33:49.486 "seek_hole": false, 00:33:49.486 "seek_data": false, 00:33:49.486 "copy": false, 00:33:49.486 "nvme_iov_md": false 00:33:49.486 }, 00:33:49.487 "driver_specific": { 00:33:49.487 "nvme": [ 00:33:49.487 { 00:33:49.487 "pci_address": "0000:d8:00.0", 00:33:49.487 "trid": { 00:33:49.487 "trtype": "PCIe", 00:33:49.487 "traddr": "0000:d8:00.0" 00:33:49.487 }, 00:33:49.487 "ctrlr_data": { 00:33:49.487 "cntlid": 0, 00:33:49.487 "vendor_id": "0x8086", 00:33:49.487 "model_number": "INTEL SSDPE2KX020T8", 00:33:49.487 "serial_number": "BTLJ125505KA2P0BGN", 00:33:49.487 "firmware_revision": "VDV10170", 00:33:49.487 "oacs": { 00:33:49.487 "security": 0, 00:33:49.487 "format": 1, 00:33:49.487 "firmware": 1, 00:33:49.487 "ns_manage": 1 00:33:49.487 }, 00:33:49.487 "multi_ctrlr": false, 00:33:49.487 "ana_reporting": false 00:33:49.487 }, 00:33:49.487 "vs": { 00:33:49.487 "nvme_version": "1.2" 00:33:49.487 }, 00:33:49.487 "ns_data": { 00:33:49.487 "id": 1, 00:33:49.487 "can_share": false 00:33:49.487 } 00:33:49.487 } 00:33:49.487 ], 00:33:49.487 "mp_policy": "active_passive" 00:33:49.487 } 00:33:49.487 } 00:33:49.487 ] 00:33:49.487 06:50:02 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:33:49.487 06:50:02 compress_isal -- compress/compress.sh@37 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:33:50.860 cc0bf0c2-5f81-4872-a7f4-7d46000a25b0 00:33:50.860 06:50:04 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:33:51.119 e17d636f-475f-4db3-9700-dea93db961a4 00:33:51.119 06:50:04 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:33:51.119 06:50:04 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:33:51.119 06:50:04 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:51.119 06:50:04 compress_isal -- common/autotest_common.sh@901 -- # local i 00:33:51.119 06:50:04 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:51.119 06:50:04 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:51.119 06:50:04 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:51.377 06:50:04 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:33:51.377 [ 00:33:51.377 { 00:33:51.377 "name": "e17d636f-475f-4db3-9700-dea93db961a4", 00:33:51.377 "aliases": [ 00:33:51.377 "lvs0/lv0" 00:33:51.377 ], 00:33:51.377 "product_name": "Logical Volume", 00:33:51.377 "block_size": 512, 00:33:51.377 "num_blocks": 204800, 00:33:51.377 "uuid": "e17d636f-475f-4db3-9700-dea93db961a4", 00:33:51.377 "assigned_rate_limits": { 00:33:51.377 "rw_ios_per_sec": 0, 00:33:51.377 "rw_mbytes_per_sec": 0, 00:33:51.377 "r_mbytes_per_sec": 0, 00:33:51.377 "w_mbytes_per_sec": 0 00:33:51.377 }, 00:33:51.377 "claimed": false, 00:33:51.377 "zoned": false, 00:33:51.377 "supported_io_types": { 00:33:51.377 "read": true, 00:33:51.377 "write": true, 00:33:51.377 "unmap": true, 00:33:51.377 "flush": false, 00:33:51.377 "reset": true, 00:33:51.377 "nvme_admin": false, 00:33:51.377 "nvme_io": false, 00:33:51.377 "nvme_io_md": false, 00:33:51.377 "write_zeroes": true, 00:33:51.377 "zcopy": false, 00:33:51.377 "get_zone_info": false, 00:33:51.377 "zone_management": false, 00:33:51.377 "zone_append": false, 00:33:51.377 "compare": false, 00:33:51.377 "compare_and_write": false, 00:33:51.377 "abort": false, 00:33:51.377 "seek_hole": true, 00:33:51.377 "seek_data": true, 00:33:51.377 "copy": false, 00:33:51.377 "nvme_iov_md": false 00:33:51.377 }, 00:33:51.377 "driver_specific": { 00:33:51.377 "lvol": { 00:33:51.377 "lvol_store_uuid": "cc0bf0c2-5f81-4872-a7f4-7d46000a25b0", 00:33:51.377 "base_bdev": "Nvme0n1", 00:33:51.377 "thin_provision": true, 00:33:51.377 "num_allocated_clusters": 0, 00:33:51.377 "snapshot": false, 00:33:51.377 "clone": false, 00:33:51.377 "esnap_clone": false 00:33:51.377 } 00:33:51.377 } 00:33:51.377 } 00:33:51.377 ] 00:33:51.636 06:50:04 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:33:51.636 06:50:04 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:33:51.636 06:50:04 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:33:51.636 [2024-07-25 06:50:05.151288] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:33:51.636 COMP_lvs0/lv0 00:33:51.636 06:50:05 compress_isal -- compress/compress.sh@46 -- # waitforbdev 
COMP_lvs0/lv0 00:33:51.636 06:50:05 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:33:51.636 06:50:05 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:33:51.636 06:50:05 compress_isal -- common/autotest_common.sh@901 -- # local i 00:33:51.636 06:50:05 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:33:51.636 06:50:05 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:33:51.636 06:50:05 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:51.895 06:50:05 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:33:52.154 [ 00:33:52.154 { 00:33:52.154 "name": "COMP_lvs0/lv0", 00:33:52.154 "aliases": [ 00:33:52.154 "6dfd95c8-82db-5adc-be6c-1ce950292a70" 00:33:52.154 ], 00:33:52.154 "product_name": "compress", 00:33:52.154 "block_size": 512, 00:33:52.154 "num_blocks": 200704, 00:33:52.154 "uuid": "6dfd95c8-82db-5adc-be6c-1ce950292a70", 00:33:52.154 "assigned_rate_limits": { 00:33:52.154 "rw_ios_per_sec": 0, 00:33:52.154 "rw_mbytes_per_sec": 0, 00:33:52.154 "r_mbytes_per_sec": 0, 00:33:52.154 "w_mbytes_per_sec": 0 00:33:52.154 }, 00:33:52.154 "claimed": false, 00:33:52.154 "zoned": false, 00:33:52.154 "supported_io_types": { 00:33:52.154 "read": true, 00:33:52.154 "write": true, 00:33:52.154 "unmap": false, 00:33:52.154 "flush": false, 00:33:52.154 "reset": false, 00:33:52.154 "nvme_admin": false, 00:33:52.154 "nvme_io": false, 00:33:52.154 "nvme_io_md": false, 00:33:52.154 "write_zeroes": true, 00:33:52.154 "zcopy": false, 00:33:52.154 "get_zone_info": false, 00:33:52.154 "zone_management": false, 00:33:52.154 "zone_append": false, 00:33:52.154 "compare": false, 00:33:52.154 "compare_and_write": false, 00:33:52.154 "abort": false, 00:33:52.154 "seek_hole": false, 00:33:52.154 "seek_data": false, 00:33:52.154 "copy": false, 00:33:52.154 "nvme_iov_md": false 00:33:52.154 }, 00:33:52.154 "driver_specific": { 00:33:52.154 "compress": { 00:33:52.154 "name": "COMP_lvs0/lv0", 00:33:52.154 "base_bdev_name": "e17d636f-475f-4db3-9700-dea93db961a4", 00:33:52.154 "pm_path": "/tmp/pmem/86bc543e-393c-452d-98f9-c84c76108d5b" 00:33:52.154 } 00:33:52.154 } 00:33:52.154 } 00:33:52.154 ] 00:33:52.154 06:50:05 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:33:52.154 06:50:05 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:52.413 Running I/O for 3 seconds... 
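Stripped of the xtrace noise, the isal pass above is: start bdevperf idle, build the compress bdev stack over RPC, then trigger the run. A trimmed sketch follows (the pipe between gen_nvme.sh and load_subsystem_config is an assumption about how compress.sh wires the two commands shown at compress.sh@34; the rest is taken from this log):

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &   # -z: wait for RPC configuration

  $spdk/scripts/gen_nvme.sh | $spdk/scripts/rpc.py load_subsystem_config      # attach Nvme0n1
  $spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  $spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100                    # 100 MiB thin-provisioned lvol
  $spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem          # compress bdev on top of the lvol

  $spdk/examples/bdev/bdevperf/bdevperf.py perform_tests                      # 3 s verify workload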
00:33:55.693
00:33:55.693 Latency(us)
00:33:55.693 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:55.693 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:33:55.693 Verification LBA range: start 0x0 length 0x3100
00:33:55.693 COMP_lvs0/lv0 : 3.00 3521.51 13.76 0.00 0.00 9038.26 56.93 14050.92
00:33:55.694 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:33:55.694 Verification LBA range: start 0x3100 length 0x3100
00:33:55.694 COMP_lvs0/lv0 : 3.00 3548.45 13.86 0.00 0.00 8980.53 56.52 14470.35
00:33:55.694 ===================================================================================================================
00:33:55.694 Total : 7069.96 27.62 0.00 0.00 9009.28 56.52 14470.35
00:33:55.694 0
00:33:55.694 06:50:08 compress_isal -- compress/compress.sh@76 -- # destroy_vols
00:33:55.694 06:50:08 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:33:55.694 06:50:09 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:33:55.694 06:50:09 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:33:55.694 06:50:09 compress_isal -- compress/compress.sh@78 -- # killprocess 1316553
00:33:55.694 06:50:09 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1316553 ']'
00:33:55.694 06:50:09 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1316553
00:33:55.694 06:50:09 compress_isal -- common/autotest_common.sh@955 -- # uname
00:33:55.694 06:50:09 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:33:55.694 06:50:09 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1316553
00:33:55.952 06:50:09 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1
00:33:55.952 06:50:09 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']'
00:33:55.952 06:50:09 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1316553'
00:33:55.952 killing process with pid 1316553
00:33:55.952 06:50:09 compress_isal -- common/autotest_common.sh@969 -- # kill 1316553
00:33:55.952 Received shutdown signal, test time was about 3.000000 seconds
00:33:55.952
00:33:55.952 Latency(us)
00:33:55.952 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:55.952 ===================================================================================================================
00:33:55.952 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:33:55.952 06:50:09 compress_isal -- common/autotest_common.sh@974 -- # wait 1316553
00:33:58.479 06:50:11 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512
00:33:58.479 06:50:11 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]]
00:33:58.479 06:50:11 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1318690
00:33:58.479 06:50:11 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT
00:33:58.479 06:50:11 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6
00:33:58.479 06:50:11 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1318690
00:33:58.479 06:50:11 compress_isal -- common/autotest_common.sh@831 -- # '['
-z 1318690 ']' 00:33:58.479 06:50:11 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:58.479 06:50:11 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:58.479 06:50:11 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:58.479 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:58.479 06:50:11 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:58.479 06:50:11 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:58.479 [2024-07-25 06:50:11.637008] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:33:58.479 [2024-07-25 06:50:11.637071] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1318690 ] 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.0 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.1 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.2 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.3 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.4 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.5 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.6 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:01.7 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.0 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.1 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.2 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.3 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.4 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.5 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.6 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3d:02.7 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.0 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:33:58.479 EAL: Requested device 0000:3f:01.1 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.2 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.3 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.4 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.5 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.6 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:01.7 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.0 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.1 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.2 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.3 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.4 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.5 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.6 cannot be used 00:33:58.479 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:33:58.479 EAL: Requested device 0000:3f:02.7 cannot be used 00:33:58.479 [2024-07-25 06:50:11.759535] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:58.479 [2024-07-25 06:50:11.805757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:58.479 [2024-07-25 06:50:11.805763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:59.046 06:50:12 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:59.046 06:50:12 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:33:59.046 06:50:12 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:33:59.046 06:50:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:59.046 06:50:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:34:02.328 06:50:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:02.328 06:50:15 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:34:02.587 [ 00:34:02.587 { 00:34:02.587 "name": "Nvme0n1", 00:34:02.587 "aliases": [ 00:34:02.587 "49ea7c76-53f4-41ba-91d2-e8f149a68917" 00:34:02.587 ], 00:34:02.587 "product_name": "NVMe disk", 00:34:02.587 "block_size": 512, 00:34:02.587 "num_blocks": 3907029168, 00:34:02.587 "uuid": "49ea7c76-53f4-41ba-91d2-e8f149a68917", 00:34:02.587 "assigned_rate_limits": { 00:34:02.587 "rw_ios_per_sec": 0, 00:34:02.587 "rw_mbytes_per_sec": 0, 00:34:02.587 "r_mbytes_per_sec": 0, 00:34:02.587 "w_mbytes_per_sec": 0 00:34:02.587 }, 00:34:02.587 "claimed": false, 00:34:02.587 "zoned": false, 00:34:02.587 "supported_io_types": { 00:34:02.587 "read": true, 00:34:02.587 "write": true, 00:34:02.587 "unmap": true, 00:34:02.587 "flush": true, 00:34:02.587 "reset": true, 00:34:02.587 "nvme_admin": true, 00:34:02.588 "nvme_io": true, 00:34:02.588 "nvme_io_md": false, 00:34:02.588 "write_zeroes": true, 00:34:02.588 "zcopy": false, 00:34:02.588 "get_zone_info": false, 00:34:02.588 "zone_management": false, 00:34:02.588 "zone_append": false, 00:34:02.588 "compare": false, 00:34:02.588 "compare_and_write": false, 00:34:02.588 "abort": true, 00:34:02.588 "seek_hole": false, 00:34:02.588 "seek_data": false, 00:34:02.588 "copy": false, 00:34:02.588 "nvme_iov_md": false 00:34:02.588 }, 00:34:02.588 "driver_specific": { 00:34:02.588 "nvme": [ 00:34:02.588 { 00:34:02.588 "pci_address": "0000:d8:00.0", 00:34:02.588 "trid": { 00:34:02.588 "trtype": "PCIe", 00:34:02.588 "traddr": "0000:d8:00.0" 00:34:02.588 }, 00:34:02.588 "ctrlr_data": { 00:34:02.588 "cntlid": 0, 00:34:02.588 "vendor_id": "0x8086", 00:34:02.588 "model_number": "INTEL SSDPE2KX020T8", 00:34:02.588 "serial_number": "BTLJ125505KA2P0BGN", 00:34:02.588 "firmware_revision": "VDV10170", 00:34:02.588 "oacs": { 00:34:02.588 "security": 0, 00:34:02.588 "format": 1, 00:34:02.588 "firmware": 1, 00:34:02.588 "ns_manage": 1 00:34:02.588 }, 00:34:02.588 "multi_ctrlr": false, 00:34:02.588 "ana_reporting": false 00:34:02.588 }, 00:34:02.588 "vs": { 00:34:02.588 "nvme_version": "1.2" 00:34:02.588 }, 00:34:02.588 "ns_data": { 00:34:02.588 "id": 1, 00:34:02.588 "can_share": false 00:34:02.588 } 00:34:02.588 } 00:34:02.588 ], 00:34:02.588 "mp_policy": "active_passive" 00:34:02.588 } 00:34:02.588 } 00:34:02.588 ] 00:34:02.588 06:50:16 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:02.588 06:50:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:34:03.965 364561f3-8b95-45f3-b12b-15cb0d22f094 00:34:03.965 06:50:17 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:34:04.224 42cfbfea-1214-44ee-a4ca-bfe0e13a2052 00:34:04.224 06:50:17 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:34:04.224 06:50:17 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:34:04.224 06:50:17 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:04.224 06:50:17 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:04.224 06:50:17 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:04.224 06:50:17 compress_isal -- 
common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:04.224 06:50:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:04.483 06:50:17 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:34:04.741 [ 00:34:04.741 { 00:34:04.741 "name": "42cfbfea-1214-44ee-a4ca-bfe0e13a2052", 00:34:04.741 "aliases": [ 00:34:04.741 "lvs0/lv0" 00:34:04.741 ], 00:34:04.741 "product_name": "Logical Volume", 00:34:04.741 "block_size": 512, 00:34:04.741 "num_blocks": 204800, 00:34:04.741 "uuid": "42cfbfea-1214-44ee-a4ca-bfe0e13a2052", 00:34:04.741 "assigned_rate_limits": { 00:34:04.741 "rw_ios_per_sec": 0, 00:34:04.741 "rw_mbytes_per_sec": 0, 00:34:04.741 "r_mbytes_per_sec": 0, 00:34:04.741 "w_mbytes_per_sec": 0 00:34:04.741 }, 00:34:04.741 "claimed": false, 00:34:04.741 "zoned": false, 00:34:04.741 "supported_io_types": { 00:34:04.741 "read": true, 00:34:04.741 "write": true, 00:34:04.741 "unmap": true, 00:34:04.741 "flush": false, 00:34:04.741 "reset": true, 00:34:04.741 "nvme_admin": false, 00:34:04.742 "nvme_io": false, 00:34:04.742 "nvme_io_md": false, 00:34:04.742 "write_zeroes": true, 00:34:04.742 "zcopy": false, 00:34:04.742 "get_zone_info": false, 00:34:04.742 "zone_management": false, 00:34:04.742 "zone_append": false, 00:34:04.742 "compare": false, 00:34:04.742 "compare_and_write": false, 00:34:04.742 "abort": false, 00:34:04.742 "seek_hole": true, 00:34:04.742 "seek_data": true, 00:34:04.742 "copy": false, 00:34:04.742 "nvme_iov_md": false 00:34:04.742 }, 00:34:04.742 "driver_specific": { 00:34:04.742 "lvol": { 00:34:04.742 "lvol_store_uuid": "364561f3-8b95-45f3-b12b-15cb0d22f094", 00:34:04.742 "base_bdev": "Nvme0n1", 00:34:04.742 "thin_provision": true, 00:34:04.742 "num_allocated_clusters": 0, 00:34:04.742 "snapshot": false, 00:34:04.742 "clone": false, 00:34:04.742 "esnap_clone": false 00:34:04.742 } 00:34:04.742 } 00:34:04.742 } 00:34:04.742 ] 00:34:04.742 06:50:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:04.742 06:50:18 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:34:04.742 06:50:18 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:34:04.742 [2024-07-25 06:50:18.289997] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:34:04.742 COMP_lvs0/lv0 00:34:05.001 06:50:18 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:05.001 06:50:18 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:34:05.260 [ 00:34:05.260 { 00:34:05.260 "name": "COMP_lvs0/lv0", 
00:34:05.260 "aliases": [ 00:34:05.260 "e9545869-22c3-54e4-8c10-52801fd02ce3" 00:34:05.260 ], 00:34:05.260 "product_name": "compress", 00:34:05.260 "block_size": 512, 00:34:05.260 "num_blocks": 200704, 00:34:05.260 "uuid": "e9545869-22c3-54e4-8c10-52801fd02ce3", 00:34:05.260 "assigned_rate_limits": { 00:34:05.260 "rw_ios_per_sec": 0, 00:34:05.260 "rw_mbytes_per_sec": 0, 00:34:05.260 "r_mbytes_per_sec": 0, 00:34:05.260 "w_mbytes_per_sec": 0 00:34:05.260 }, 00:34:05.260 "claimed": false, 00:34:05.260 "zoned": false, 00:34:05.260 "supported_io_types": { 00:34:05.260 "read": true, 00:34:05.260 "write": true, 00:34:05.260 "unmap": false, 00:34:05.260 "flush": false, 00:34:05.260 "reset": false, 00:34:05.260 "nvme_admin": false, 00:34:05.260 "nvme_io": false, 00:34:05.260 "nvme_io_md": false, 00:34:05.260 "write_zeroes": true, 00:34:05.260 "zcopy": false, 00:34:05.260 "get_zone_info": false, 00:34:05.260 "zone_management": false, 00:34:05.260 "zone_append": false, 00:34:05.260 "compare": false, 00:34:05.260 "compare_and_write": false, 00:34:05.260 "abort": false, 00:34:05.260 "seek_hole": false, 00:34:05.260 "seek_data": false, 00:34:05.260 "copy": false, 00:34:05.260 "nvme_iov_md": false 00:34:05.260 }, 00:34:05.260 "driver_specific": { 00:34:05.260 "compress": { 00:34:05.260 "name": "COMP_lvs0/lv0", 00:34:05.260 "base_bdev_name": "42cfbfea-1214-44ee-a4ca-bfe0e13a2052", 00:34:05.260 "pm_path": "/tmp/pmem/36bf834d-83f8-497c-a7c7-59921bee7f69" 00:34:05.260 } 00:34:05.260 } 00:34:05.260 } 00:34:05.260 ] 00:34:05.260 06:50:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:05.260 06:50:18 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:05.550 Running I/O for 3 seconds... 
00:34:08.838 00:34:08.838 Latency(us) 00:34:08.838 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:08.838 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:34:08.838 Verification LBA range: start 0x0 length 0x3100 00:34:08.838 COMP_lvs0/lv0 : 3.00 3456.93 13.50 0.00 0.00 9208.13 58.98 14470.35 00:34:08.838 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:34:08.838 Verification LBA range: start 0x3100 length 0x3100 00:34:08.838 COMP_lvs0/lv0 : 3.01 3457.97 13.51 0.00 0.00 9211.57 56.12 14575.21 00:34:08.838 =================================================================================================================== 00:34:08.838 Total : 6914.90 27.01 0.00 0.00 9209.85 56.12 14575.21 00:34:08.838 0 00:34:08.838 06:50:21 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:34:08.838 06:50:21 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:34:08.838 06:50:22 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:34:08.838 06:50:22 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:34:08.838 06:50:22 compress_isal -- compress/compress.sh@78 -- # killprocess 1318690 00:34:08.838 06:50:22 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1318690 ']' 00:34:08.838 06:50:22 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1318690 00:34:08.838 06:50:22 compress_isal -- common/autotest_common.sh@955 -- # uname 00:34:08.838 06:50:22 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:08.838 06:50:22 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1318690 00:34:09.097 06:50:22 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:34:09.097 06:50:22 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:34:09.097 06:50:22 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1318690' 00:34:09.097 killing process with pid 1318690 00:34:09.097 06:50:22 compress_isal -- common/autotest_common.sh@969 -- # kill 1318690 00:34:09.097 Received shutdown signal, test time was about 3.000000 seconds 00:34:09.097 00:34:09.097 Latency(us) 00:34:09.097 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:09.097 =================================================================================================================== 00:34:09.097 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:09.097 06:50:22 compress_isal -- common/autotest_common.sh@974 -- # wait 1318690 00:34:11.629 06:50:24 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:34:11.629 06:50:24 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:34:11.629 06:50:24 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1320825 00:34:11.629 06:50:24 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:11.629 06:50:24 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:34:11.629 06:50:24 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1320825 00:34:11.629 06:50:24 compress_isal -- common/autotest_common.sh@831 -- # '[' 
-z 1320825 ']' 00:34:11.629 06:50:24 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:11.629 06:50:24 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:11.629 06:50:24 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:11.629 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:11.629 06:50:24 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:11.629 06:50:24 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:34:11.629 [2024-07-25 06:50:24.780463] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:34:11.629 [2024-07-25 06:50:24.780527] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1320825 ] 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:34:11.629 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:11.629 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:11.629 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:11.629 [2024-07-25 06:50:24.904426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:11.630 [2024-07-25 06:50:24.951010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:11.630 [2024-07-25 06:50:24.951012] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:12.197 06:50:25 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:12.197 06:50:25 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:34:12.197 06:50:25 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:34:12.197 06:50:25 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:34:12.197 06:50:25 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:34:15.482 06:50:28 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:34:15.482 06:50:28 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:34:15.482 06:50:28 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:15.482 06:50:28 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:15.482 06:50:28 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:15.482 06:50:28 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:15.482 06:50:28 compress_isal -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:15.741 06:50:29 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:34:15.741 [ 00:34:15.741 { 00:34:15.741 "name": "Nvme0n1", 00:34:15.741 "aliases": [ 00:34:15.741 "d8e3ca4f-d4fa-4d57-a2ec-fb671e895e26" 00:34:15.741 ], 00:34:15.741 "product_name": "NVMe disk", 00:34:15.741 "block_size": 512, 00:34:15.741 "num_blocks": 3907029168, 00:34:15.741 "uuid": "d8e3ca4f-d4fa-4d57-a2ec-fb671e895e26", 00:34:15.741 "assigned_rate_limits": { 00:34:15.741 "rw_ios_per_sec": 0, 00:34:15.741 "rw_mbytes_per_sec": 0, 00:34:15.741 "r_mbytes_per_sec": 0, 00:34:15.741 "w_mbytes_per_sec": 0 00:34:15.741 }, 00:34:15.741 "claimed": false, 00:34:15.741 "zoned": false, 00:34:15.741 "supported_io_types": { 00:34:15.741 "read": true, 00:34:15.741 "write": true, 00:34:15.741 "unmap": true, 00:34:15.741 "flush": true, 00:34:15.741 "reset": true, 00:34:15.741 "nvme_admin": true, 00:34:15.741 "nvme_io": true, 00:34:15.741 "nvme_io_md": false, 00:34:15.741 "write_zeroes": true, 00:34:15.741 "zcopy": false, 00:34:15.741 "get_zone_info": false, 00:34:15.741 "zone_management": false, 00:34:15.741 "zone_append": false, 00:34:15.741 "compare": false, 00:34:15.741 "compare_and_write": false, 00:34:15.741 "abort": true, 00:34:15.741 "seek_hole": false, 00:34:15.741 "seek_data": false, 00:34:15.741 "copy": false, 00:34:15.741 "nvme_iov_md": false 00:34:15.741 }, 00:34:15.741 "driver_specific": { 00:34:15.741 "nvme": [ 00:34:15.741 { 00:34:15.741 "pci_address": "0000:d8:00.0", 00:34:15.741 "trid": { 00:34:15.741 "trtype": "PCIe", 00:34:15.741 "traddr": "0000:d8:00.0" 00:34:15.741 }, 00:34:15.741 "ctrlr_data": { 00:34:15.741 "cntlid": 0, 00:34:15.741 "vendor_id": "0x8086", 00:34:15.741 "model_number": "INTEL SSDPE2KX020T8", 00:34:15.741 "serial_number": "BTLJ125505KA2P0BGN", 00:34:15.741 "firmware_revision": "VDV10170", 00:34:15.741 "oacs": { 00:34:15.741 "security": 0, 00:34:15.741 "format": 1, 00:34:15.741 "firmware": 1, 00:34:15.741 "ns_manage": 1 00:34:15.741 }, 00:34:15.741 "multi_ctrlr": false, 00:34:15.741 "ana_reporting": false 00:34:15.741 }, 00:34:15.741 "vs": { 00:34:15.741 "nvme_version": "1.2" 00:34:15.741 }, 00:34:15.741 "ns_data": { 00:34:15.741 "id": 1, 00:34:15.741 "can_share": false 00:34:15.741 } 00:34:15.741 } 00:34:15.741 ], 00:34:15.741 "mp_policy": "active_passive" 00:34:15.741 } 00:34:15.741 } 00:34:15.741 ] 00:34:15.741 06:50:29 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:15.741 06:50:29 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:34:17.119 0ab4e984-34eb-43d9-b8f5-f47c46338848 00:34:17.119 06:50:30 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:34:17.378 c1cf07e6-a15d-4e49-abe4-5b587b639658 00:34:17.378 06:50:30 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:34:17.378 06:50:30 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:34:17.378 06:50:30 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:17.378 06:50:30 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:17.378 06:50:30 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:17.378 06:50:30 compress_isal -- 
common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:17.378 06:50:30 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:17.637 06:50:30 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:34:17.637 [ 00:34:17.637 { 00:34:17.637 "name": "c1cf07e6-a15d-4e49-abe4-5b587b639658", 00:34:17.637 "aliases": [ 00:34:17.637 "lvs0/lv0" 00:34:17.637 ], 00:34:17.637 "product_name": "Logical Volume", 00:34:17.637 "block_size": 512, 00:34:17.637 "num_blocks": 204800, 00:34:17.637 "uuid": "c1cf07e6-a15d-4e49-abe4-5b587b639658", 00:34:17.637 "assigned_rate_limits": { 00:34:17.637 "rw_ios_per_sec": 0, 00:34:17.637 "rw_mbytes_per_sec": 0, 00:34:17.637 "r_mbytes_per_sec": 0, 00:34:17.637 "w_mbytes_per_sec": 0 00:34:17.637 }, 00:34:17.637 "claimed": false, 00:34:17.637 "zoned": false, 00:34:17.637 "supported_io_types": { 00:34:17.637 "read": true, 00:34:17.637 "write": true, 00:34:17.637 "unmap": true, 00:34:17.637 "flush": false, 00:34:17.637 "reset": true, 00:34:17.637 "nvme_admin": false, 00:34:17.637 "nvme_io": false, 00:34:17.637 "nvme_io_md": false, 00:34:17.637 "write_zeroes": true, 00:34:17.637 "zcopy": false, 00:34:17.637 "get_zone_info": false, 00:34:17.637 "zone_management": false, 00:34:17.637 "zone_append": false, 00:34:17.637 "compare": false, 00:34:17.637 "compare_and_write": false, 00:34:17.637 "abort": false, 00:34:17.637 "seek_hole": true, 00:34:17.637 "seek_data": true, 00:34:17.637 "copy": false, 00:34:17.637 "nvme_iov_md": false 00:34:17.637 }, 00:34:17.637 "driver_specific": { 00:34:17.637 "lvol": { 00:34:17.637 "lvol_store_uuid": "0ab4e984-34eb-43d9-b8f5-f47c46338848", 00:34:17.637 "base_bdev": "Nvme0n1", 00:34:17.637 "thin_provision": true, 00:34:17.637 "num_allocated_clusters": 0, 00:34:17.637 "snapshot": false, 00:34:17.637 "clone": false, 00:34:17.637 "esnap_clone": false 00:34:17.637 } 00:34:17.637 } 00:34:17.637 } 00:34:17.637 ] 00:34:17.896 06:50:31 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:17.896 06:50:31 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:34:17.896 06:50:31 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:34:17.896 [2024-07-25 06:50:31.429922] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:34:17.896 COMP_lvs0/lv0 00:34:18.155 06:50:31 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:18.155 06:50:31 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:34:18.415 [ 00:34:18.415 { 00:34:18.415 "name": "COMP_lvs0/lv0", 
00:34:18.415 "aliases": [ 00:34:18.415 "2eef1b13-47e8-5771-8dfb-57aa5f52f2a1" 00:34:18.415 ], 00:34:18.415 "product_name": "compress", 00:34:18.415 "block_size": 4096, 00:34:18.415 "num_blocks": 25088, 00:34:18.415 "uuid": "2eef1b13-47e8-5771-8dfb-57aa5f52f2a1", 00:34:18.415 "assigned_rate_limits": { 00:34:18.415 "rw_ios_per_sec": 0, 00:34:18.415 "rw_mbytes_per_sec": 0, 00:34:18.415 "r_mbytes_per_sec": 0, 00:34:18.415 "w_mbytes_per_sec": 0 00:34:18.415 }, 00:34:18.415 "claimed": false, 00:34:18.415 "zoned": false, 00:34:18.415 "supported_io_types": { 00:34:18.415 "read": true, 00:34:18.415 "write": true, 00:34:18.415 "unmap": false, 00:34:18.415 "flush": false, 00:34:18.415 "reset": false, 00:34:18.415 "nvme_admin": false, 00:34:18.415 "nvme_io": false, 00:34:18.415 "nvme_io_md": false, 00:34:18.415 "write_zeroes": true, 00:34:18.415 "zcopy": false, 00:34:18.415 "get_zone_info": false, 00:34:18.415 "zone_management": false, 00:34:18.415 "zone_append": false, 00:34:18.415 "compare": false, 00:34:18.415 "compare_and_write": false, 00:34:18.415 "abort": false, 00:34:18.415 "seek_hole": false, 00:34:18.415 "seek_data": false, 00:34:18.415 "copy": false, 00:34:18.415 "nvme_iov_md": false 00:34:18.415 }, 00:34:18.415 "driver_specific": { 00:34:18.415 "compress": { 00:34:18.415 "name": "COMP_lvs0/lv0", 00:34:18.415 "base_bdev_name": "c1cf07e6-a15d-4e49-abe4-5b587b639658", 00:34:18.415 "pm_path": "/tmp/pmem/b24e6a66-943f-449f-9ef8-f39b517be8ed" 00:34:18.415 } 00:34:18.415 } 00:34:18.415 } 00:34:18.415 ] 00:34:18.415 06:50:31 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:18.415 06:50:31 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:18.674 Running I/O for 3 seconds... 
00:34:21.963 00:34:21.963 Latency(us) 00:34:21.963 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:21.963 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:34:21.963 Verification LBA range: start 0x0 length 0x3100 00:34:21.963 COMP_lvs0/lv0 : 3.00 3416.68 13.35 0.00 0.00 9314.26 58.16 14784.92 00:34:21.963 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:34:21.963 Verification LBA range: start 0x3100 length 0x3100 00:34:21.963 COMP_lvs0/lv0 : 3.01 3439.08 13.43 0.00 0.00 9262.67 58.16 15099.49 00:34:21.963 =================================================================================================================== 00:34:21.963 Total : 6855.77 26.78 0.00 0.00 9288.38 58.16 15099.49 00:34:21.963 0 00:34:21.963 06:50:35 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:34:21.963 06:50:35 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:34:21.963 06:50:35 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:34:21.963 06:50:35 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:34:21.963 06:50:35 compress_isal -- compress/compress.sh@78 -- # killprocess 1320825 00:34:21.963 06:50:35 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1320825 ']' 00:34:21.963 06:50:35 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1320825 00:34:21.963 06:50:35 compress_isal -- common/autotest_common.sh@955 -- # uname 00:34:21.963 06:50:35 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:22.222 06:50:35 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1320825 00:34:22.222 06:50:35 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:34:22.222 06:50:35 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:34:22.222 06:50:35 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1320825' 00:34:22.222 killing process with pid 1320825 00:34:22.222 06:50:35 compress_isal -- common/autotest_common.sh@969 -- # kill 1320825 00:34:22.222 Received shutdown signal, test time was about 3.000000 seconds 00:34:22.222 00:34:22.222 Latency(us) 00:34:22.222 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:22.222 =================================================================================================================== 00:34:22.222 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:22.222 06:50:35 compress_isal -- common/autotest_common.sh@974 -- # wait 1320825 00:34:24.756 06:50:37 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:34:24.756 06:50:37 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:34:24.756 06:50:37 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1323010 00:34:24.756 06:50:37 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:24.756 06:50:37 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:34:24.756 06:50:37 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1323010 00:34:24.756 06:50:37 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1323010 ']' 00:34:24.756 06:50:37 compress_isal -- 
common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:24.756 06:50:37 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:24.756 06:50:37 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:24.756 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:24.756 06:50:37 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:24.756 06:50:37 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:34:24.756 [2024-07-25 06:50:38.035923] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:34:24.756 [2024-07-25 06:50:38.035986] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323010 ] 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.1 cannot be 
used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.4 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:24.756 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:24.756 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:24.757 [2024-07-25 06:50:38.173147] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:24.757 [2024-07-25 06:50:38.219623] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:24.757 [2024-07-25 06:50:38.219641] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:24.757 [2024-07-25 06:50:38.219644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:25.731 06:50:38 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:25.731 06:50:38 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:34:25.731 06:50:38 compress_isal -- compress/compress.sh@58 -- # create_vols 00:34:25.731 06:50:38 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:34:25.731 06:50:38 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:34:29.020 06:50:42 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:34:29.020 [ 00:34:29.020 { 00:34:29.020 "name": "Nvme0n1", 00:34:29.020 "aliases": [ 00:34:29.020 "e2376698-3045-45d8-93ec-143d5f347ded" 00:34:29.020 ], 00:34:29.020 "product_name": "NVMe disk", 00:34:29.020 "block_size": 512, 00:34:29.020 "num_blocks": 3907029168, 00:34:29.020 "uuid": "e2376698-3045-45d8-93ec-143d5f347ded", 00:34:29.020 "assigned_rate_limits": { 00:34:29.020 "rw_ios_per_sec": 0, 00:34:29.020 "rw_mbytes_per_sec": 0, 00:34:29.020 "r_mbytes_per_sec": 0, 00:34:29.020 "w_mbytes_per_sec": 0 00:34:29.020 }, 00:34:29.020 "claimed": false, 00:34:29.020 "zoned": false, 00:34:29.020 "supported_io_types": { 00:34:29.020 "read": true, 00:34:29.020 "write": true, 00:34:29.020 "unmap": true, 00:34:29.020 "flush": true, 00:34:29.020 "reset": true, 00:34:29.020 "nvme_admin": true, 00:34:29.020 "nvme_io": true, 00:34:29.020 "nvme_io_md": false, 00:34:29.020 "write_zeroes": true, 00:34:29.020 "zcopy": false, 00:34:29.020 "get_zone_info": false, 00:34:29.020 "zone_management": false, 00:34:29.020 "zone_append": false, 00:34:29.020 "compare": false, 00:34:29.020 "compare_and_write": false, 00:34:29.020 "abort": true, 00:34:29.020 "seek_hole": false, 00:34:29.020 "seek_data": false, 00:34:29.020 "copy": false, 00:34:29.020 "nvme_iov_md": false 00:34:29.020 }, 00:34:29.020 "driver_specific": { 00:34:29.020 "nvme": [ 00:34:29.020 { 00:34:29.020 "pci_address": "0000:d8:00.0", 00:34:29.020 "trid": { 00:34:29.020 "trtype": "PCIe", 00:34:29.020 "traddr": "0000:d8:00.0" 00:34:29.020 }, 00:34:29.020 "ctrlr_data": { 00:34:29.020 "cntlid": 0, 00:34:29.020 "vendor_id": "0x8086", 00:34:29.020 "model_number": "INTEL SSDPE2KX020T8", 00:34:29.020 "serial_number": "BTLJ125505KA2P0BGN", 00:34:29.020 "firmware_revision": "VDV10170", 00:34:29.020 "oacs": { 00:34:29.020 "security": 0, 00:34:29.020 "format": 1, 00:34:29.020 "firmware": 1, 00:34:29.020 "ns_manage": 1 00:34:29.020 }, 00:34:29.020 "multi_ctrlr": false, 00:34:29.020 "ana_reporting": false 00:34:29.020 }, 00:34:29.020 "vs": { 00:34:29.020 "nvme_version": "1.2" 00:34:29.020 }, 00:34:29.020 "ns_data": { 00:34:29.020 "id": 1, 00:34:29.020 "can_share": false 00:34:29.020 } 00:34:29.020 } 00:34:29.020 ], 00:34:29.020 "mp_policy": "active_passive" 00:34:29.020 } 00:34:29.020 } 00:34:29.020 ] 00:34:29.020 06:50:42 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:29.020 06:50:42 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:34:30.397 40920589-3a40-4412-baa4-b98fa3af1e73 00:34:30.397 06:50:43 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:34:30.656 66606464-1f6c-4a8f-b62f-255af1c43336 00:34:30.656 06:50:44 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:34:30.656 06:50:44 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:34:30.656 06:50:44 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:30.656 06:50:44 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:30.656 06:50:44 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:30.656 06:50:44 compress_isal -- 
common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:30.656 06:50:44 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:30.915 06:50:44 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:34:30.915 [ 00:34:30.915 { 00:34:30.915 "name": "66606464-1f6c-4a8f-b62f-255af1c43336", 00:34:30.915 "aliases": [ 00:34:30.915 "lvs0/lv0" 00:34:30.915 ], 00:34:30.915 "product_name": "Logical Volume", 00:34:30.915 "block_size": 512, 00:34:30.915 "num_blocks": 204800, 00:34:30.915 "uuid": "66606464-1f6c-4a8f-b62f-255af1c43336", 00:34:30.915 "assigned_rate_limits": { 00:34:30.915 "rw_ios_per_sec": 0, 00:34:30.915 "rw_mbytes_per_sec": 0, 00:34:30.915 "r_mbytes_per_sec": 0, 00:34:30.915 "w_mbytes_per_sec": 0 00:34:30.915 }, 00:34:30.915 "claimed": false, 00:34:30.915 "zoned": false, 00:34:30.915 "supported_io_types": { 00:34:30.915 "read": true, 00:34:30.915 "write": true, 00:34:30.915 "unmap": true, 00:34:30.915 "flush": false, 00:34:30.915 "reset": true, 00:34:30.915 "nvme_admin": false, 00:34:30.915 "nvme_io": false, 00:34:30.915 "nvme_io_md": false, 00:34:30.915 "write_zeroes": true, 00:34:30.915 "zcopy": false, 00:34:30.915 "get_zone_info": false, 00:34:30.915 "zone_management": false, 00:34:30.915 "zone_append": false, 00:34:30.915 "compare": false, 00:34:30.915 "compare_and_write": false, 00:34:30.915 "abort": false, 00:34:30.915 "seek_hole": true, 00:34:30.915 "seek_data": true, 00:34:30.915 "copy": false, 00:34:30.915 "nvme_iov_md": false 00:34:30.915 }, 00:34:30.915 "driver_specific": { 00:34:30.915 "lvol": { 00:34:30.915 "lvol_store_uuid": "40920589-3a40-4412-baa4-b98fa3af1e73", 00:34:30.915 "base_bdev": "Nvme0n1", 00:34:30.915 "thin_provision": true, 00:34:30.915 "num_allocated_clusters": 0, 00:34:30.915 "snapshot": false, 00:34:30.915 "clone": false, 00:34:30.915 "esnap_clone": false 00:34:30.915 } 00:34:30.915 } 00:34:30.915 } 00:34:30.915 ] 00:34:30.915 06:50:44 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:30.915 06:50:44 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:34:30.915 06:50:44 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:34:31.174 [2024-07-25 06:50:44.619450] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:34:31.174 COMP_lvs0/lv0 00:34:31.174 06:50:44 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:34:31.174 06:50:44 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:34:31.174 06:50:44 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:31.174 06:50:44 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:31.174 06:50:44 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:31.174 06:50:44 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:31.174 06:50:44 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:31.432 06:50:44 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:34:31.691 [ 00:34:31.691 { 00:34:31.691 "name": "COMP_lvs0/lv0", 
00:34:31.691 "aliases": [ 00:34:31.691 "e18654ff-3970-5443-a00b-ed9e20d0ea54" 00:34:31.691 ], 00:34:31.691 "product_name": "compress", 00:34:31.691 "block_size": 512, 00:34:31.691 "num_blocks": 200704, 00:34:31.691 "uuid": "e18654ff-3970-5443-a00b-ed9e20d0ea54", 00:34:31.691 "assigned_rate_limits": { 00:34:31.691 "rw_ios_per_sec": 0, 00:34:31.691 "rw_mbytes_per_sec": 0, 00:34:31.691 "r_mbytes_per_sec": 0, 00:34:31.691 "w_mbytes_per_sec": 0 00:34:31.691 }, 00:34:31.691 "claimed": false, 00:34:31.691 "zoned": false, 00:34:31.691 "supported_io_types": { 00:34:31.691 "read": true, 00:34:31.691 "write": true, 00:34:31.691 "unmap": false, 00:34:31.691 "flush": false, 00:34:31.691 "reset": false, 00:34:31.691 "nvme_admin": false, 00:34:31.691 "nvme_io": false, 00:34:31.691 "nvme_io_md": false, 00:34:31.691 "write_zeroes": true, 00:34:31.691 "zcopy": false, 00:34:31.691 "get_zone_info": false, 00:34:31.691 "zone_management": false, 00:34:31.691 "zone_append": false, 00:34:31.691 "compare": false, 00:34:31.691 "compare_and_write": false, 00:34:31.691 "abort": false, 00:34:31.691 "seek_hole": false, 00:34:31.691 "seek_data": false, 00:34:31.691 "copy": false, 00:34:31.691 "nvme_iov_md": false 00:34:31.691 }, 00:34:31.691 "driver_specific": { 00:34:31.691 "compress": { 00:34:31.691 "name": "COMP_lvs0/lv0", 00:34:31.691 "base_bdev_name": "66606464-1f6c-4a8f-b62f-255af1c43336", 00:34:31.691 "pm_path": "/tmp/pmem/784ad208-a545-46a6-a570-d537e3e3ba8c" 00:34:31.691 } 00:34:31.691 } 00:34:31.691 } 00:34:31.691 ] 00:34:31.691 06:50:45 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:31.691 06:50:45 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:31.691 I/O targets: 00:34:31.691 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:34:31.691 00:34:31.691 00:34:31.691 CUnit - A unit testing framework for C - Version 2.1-3 00:34:31.691 http://cunit.sourceforge.net/ 00:34:31.691 00:34:31.691 00:34:31.691 Suite: bdevio tests on: COMP_lvs0/lv0 00:34:31.691 Test: blockdev write read block ...passed 00:34:31.691 Test: blockdev write zeroes read block ...passed 00:34:31.692 Test: blockdev write zeroes read no split ...passed 00:34:31.692 Test: blockdev write zeroes read split ...passed 00:34:31.692 Test: blockdev write zeroes read split partial ...passed 00:34:31.692 Test: blockdev reset ...[2024-07-25 06:50:45.211728] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:34:31.692 passed 00:34:31.692 Test: blockdev write read 8 blocks ...passed 00:34:31.692 Test: blockdev write read size > 128k ...passed 00:34:31.692 Test: blockdev write read invalid size ...passed 00:34:31.692 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:31.692 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:31.692 Test: blockdev write read max offset ...passed 00:34:31.692 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:31.692 Test: blockdev writev readv 8 blocks ...passed 00:34:31.692 Test: blockdev writev readv 30 x 1block ...passed 00:34:31.692 Test: blockdev writev readv block ...passed 00:34:31.692 Test: blockdev writev readv size > 128k ...passed 00:34:31.692 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:31.692 Test: blockdev comparev and writev ...passed 00:34:31.692 Test: blockdev nvme passthru rw ...passed 00:34:31.692 Test: blockdev nvme passthru vendor specific ...passed 00:34:31.692 
Test: blockdev nvme admin passthru ...passed 00:34:31.692 Test: blockdev copy ...passed 00:34:31.692 00:34:31.692 Run Summary: Type Total Ran Passed Failed Inactive 00:34:31.692 suites 1 1 n/a 0 0 00:34:31.692 tests 23 23 23 0 0 00:34:31.692 asserts 130 130 130 0 n/a 00:34:31.692 00:34:31.692 Elapsed time = 0.164 seconds 00:34:31.692 0 00:34:31.692 06:50:45 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:34:31.692 06:50:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:34:32.007 06:50:45 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:34:32.265 06:50:45 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:34:32.265 06:50:45 compress_isal -- compress/compress.sh@62 -- # killprocess 1323010 00:34:32.265 06:50:45 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1323010 ']' 00:34:32.265 06:50:45 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1323010 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@955 -- # uname 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1323010 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1323010' 00:34:32.266 killing process with pid 1323010 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@969 -- # kill 1323010 00:34:32.266 06:50:45 compress_isal -- common/autotest_common.sh@974 -- # wait 1323010 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@91 -- # '[' 1 -eq 1 ']' 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@92 -- # run_bdevperf 64 16384 30 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1324823 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 64 -o 16384 -w verify -t 30 -C -m 0x6 00:34:34.797 06:50:48 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1324823 00:34:34.797 06:50:48 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1324823 ']' 00:34:34.797 06:50:48 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:34.797 06:50:48 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:34.797 06:50:48 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:34.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
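Around every run the trace shows the same two autotest_common.sh helpers: waitforlisten polls the freshly started process until its UNIX-domain RPC socket answers, and killprocess tears it down once the workload is done. The shape below is an approximation reconstructed from the '@950'..'@974' trace entries; the sudo branch and the exact return handling are assumptions, and this is not the autotest_common.sh source.

    # Approximation only -- reconstructed from the traced killprocess lines
    killprocess() {
        local pid=$1
        [ -z "$pid" ] && return 1
        kill -0 "$pid" || return 1                            # still running?
        if [ "$(uname)" = Linux ]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")   # e.g. reactor_1
        fi
        if [ "$process_name" = sudo ]; then
            :   # the real helper treats sudo-wrapped processes specially (assumption)
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }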
00:34:34.797 06:50:48 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:34.797 06:50:48 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:34:34.797 [2024-07-25 06:50:48.244772] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:34:34.797 [2024-07-25 06:50:48.244834] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324823 ] 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.0 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.1 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.2 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.3 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.4 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.5 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.6 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:01.7 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.0 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.1 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.2 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.3 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.4 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.5 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.6 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3d:02.7 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.797 EAL: Requested device 0000:3f:01.0 cannot be used 00:34:34.797 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.1 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.2 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.3 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.4 cannot be used 
00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.5 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.6 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:01.7 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.0 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.1 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.2 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.3 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.4 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.5 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.6 cannot be used 00:34:34.798 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:34:34.798 EAL: Requested device 0000:3f:02.7 cannot be used 00:34:35.056 [2024-07-25 06:50:48.369164] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:35.056 [2024-07-25 06:50:48.414000] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:35.056 [2024-07-25 06:50:48.414006] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:35.623 06:50:49 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:35.623 06:50:49 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:34:35.623 06:50:49 compress_isal -- compress/compress.sh@74 -- # create_vols 00:34:35.623 06:50:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:34:35.623 06:50:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:34:38.909 06:50:52 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:34:38.909 06:50:52 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:34:38.909 06:50:52 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:38.909 06:50:52 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:38.909 06:50:52 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:38.909 06:50:52 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:38.909 06:50:52 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:39.167 06:50:52 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:34:39.167 [ 00:34:39.167 { 00:34:39.167 "name": "Nvme0n1", 00:34:39.167 "aliases": [ 00:34:39.167 "865a3286-4108-4d95-bb68-f0e1848a0738" 00:34:39.167 ], 00:34:39.167 "product_name": "NVMe disk", 00:34:39.167 "block_size": 512, 00:34:39.167 "num_blocks": 3907029168, 
00:34:39.167 "uuid": "865a3286-4108-4d95-bb68-f0e1848a0738", 00:34:39.167 "assigned_rate_limits": { 00:34:39.167 "rw_ios_per_sec": 0, 00:34:39.167 "rw_mbytes_per_sec": 0, 00:34:39.167 "r_mbytes_per_sec": 0, 00:34:39.167 "w_mbytes_per_sec": 0 00:34:39.168 }, 00:34:39.168 "claimed": false, 00:34:39.168 "zoned": false, 00:34:39.168 "supported_io_types": { 00:34:39.168 "read": true, 00:34:39.168 "write": true, 00:34:39.168 "unmap": true, 00:34:39.168 "flush": true, 00:34:39.168 "reset": true, 00:34:39.168 "nvme_admin": true, 00:34:39.168 "nvme_io": true, 00:34:39.168 "nvme_io_md": false, 00:34:39.168 "write_zeroes": true, 00:34:39.168 "zcopy": false, 00:34:39.168 "get_zone_info": false, 00:34:39.168 "zone_management": false, 00:34:39.168 "zone_append": false, 00:34:39.168 "compare": false, 00:34:39.168 "compare_and_write": false, 00:34:39.168 "abort": true, 00:34:39.168 "seek_hole": false, 00:34:39.168 "seek_data": false, 00:34:39.168 "copy": false, 00:34:39.168 "nvme_iov_md": false 00:34:39.168 }, 00:34:39.168 "driver_specific": { 00:34:39.168 "nvme": [ 00:34:39.168 { 00:34:39.168 "pci_address": "0000:d8:00.0", 00:34:39.168 "trid": { 00:34:39.168 "trtype": "PCIe", 00:34:39.168 "traddr": "0000:d8:00.0" 00:34:39.168 }, 00:34:39.168 "ctrlr_data": { 00:34:39.168 "cntlid": 0, 00:34:39.168 "vendor_id": "0x8086", 00:34:39.168 "model_number": "INTEL SSDPE2KX020T8", 00:34:39.168 "serial_number": "BTLJ125505KA2P0BGN", 00:34:39.168 "firmware_revision": "VDV10170", 00:34:39.168 "oacs": { 00:34:39.168 "security": 0, 00:34:39.168 "format": 1, 00:34:39.168 "firmware": 1, 00:34:39.168 "ns_manage": 1 00:34:39.168 }, 00:34:39.168 "multi_ctrlr": false, 00:34:39.168 "ana_reporting": false 00:34:39.168 }, 00:34:39.168 "vs": { 00:34:39.168 "nvme_version": "1.2" 00:34:39.168 }, 00:34:39.168 "ns_data": { 00:34:39.168 "id": 1, 00:34:39.168 "can_share": false 00:34:39.168 } 00:34:39.168 } 00:34:39.168 ], 00:34:39.168 "mp_policy": "active_passive" 00:34:39.168 } 00:34:39.168 } 00:34:39.168 ] 00:34:39.168 06:50:52 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:39.168 06:50:52 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:34:40.544 885f96e9-f212-4061-8f17-34ecf6c7ce2a 00:34:40.544 06:50:53 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:34:40.544 a564a90d-c175-4474-a018-e945b8dca260 00:34:40.544 06:50:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:34:40.544 06:50:54 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:34:40.544 06:50:54 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:40.544 06:50:54 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:40.544 06:50:54 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:40.544 06:50:54 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:40.544 06:50:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:40.803 06:50:54 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:34:41.062 [ 00:34:41.062 { 00:34:41.062 "name": "a564a90d-c175-4474-a018-e945b8dca260", 00:34:41.062 "aliases": [ 00:34:41.062 "lvs0/lv0" 
00:34:41.062 ], 00:34:41.062 "product_name": "Logical Volume", 00:34:41.062 "block_size": 512, 00:34:41.062 "num_blocks": 204800, 00:34:41.062 "uuid": "a564a90d-c175-4474-a018-e945b8dca260", 00:34:41.062 "assigned_rate_limits": { 00:34:41.062 "rw_ios_per_sec": 0, 00:34:41.062 "rw_mbytes_per_sec": 0, 00:34:41.062 "r_mbytes_per_sec": 0, 00:34:41.062 "w_mbytes_per_sec": 0 00:34:41.062 }, 00:34:41.062 "claimed": false, 00:34:41.062 "zoned": false, 00:34:41.062 "supported_io_types": { 00:34:41.062 "read": true, 00:34:41.062 "write": true, 00:34:41.062 "unmap": true, 00:34:41.062 "flush": false, 00:34:41.062 "reset": true, 00:34:41.062 "nvme_admin": false, 00:34:41.062 "nvme_io": false, 00:34:41.062 "nvme_io_md": false, 00:34:41.062 "write_zeroes": true, 00:34:41.062 "zcopy": false, 00:34:41.062 "get_zone_info": false, 00:34:41.062 "zone_management": false, 00:34:41.062 "zone_append": false, 00:34:41.062 "compare": false, 00:34:41.062 "compare_and_write": false, 00:34:41.062 "abort": false, 00:34:41.062 "seek_hole": true, 00:34:41.062 "seek_data": true, 00:34:41.062 "copy": false, 00:34:41.062 "nvme_iov_md": false 00:34:41.062 }, 00:34:41.062 "driver_specific": { 00:34:41.062 "lvol": { 00:34:41.062 "lvol_store_uuid": "885f96e9-f212-4061-8f17-34ecf6c7ce2a", 00:34:41.062 "base_bdev": "Nvme0n1", 00:34:41.062 "thin_provision": true, 00:34:41.062 "num_allocated_clusters": 0, 00:34:41.062 "snapshot": false, 00:34:41.062 "clone": false, 00:34:41.062 "esnap_clone": false 00:34:41.062 } 00:34:41.062 } 00:34:41.062 } 00:34:41.062 ] 00:34:41.062 06:50:54 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:41.062 06:50:54 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:34:41.062 06:50:54 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:34:41.321 [2024-07-25 06:50:54.750335] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:34:41.321 COMP_lvs0/lv0 00:34:41.321 06:50:54 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:34:41.321 06:50:54 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:34:41.321 06:50:54 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:41.321 06:50:54 compress_isal -- common/autotest_common.sh@901 -- # local i 00:34:41.321 06:50:54 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:41.321 06:50:54 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:41.321 06:50:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:41.580 06:50:54 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:34:41.838 [ 00:34:41.838 { 00:34:41.838 "name": "COMP_lvs0/lv0", 00:34:41.838 "aliases": [ 00:34:41.838 "7a45b62f-08f4-54b8-8fab-fe865954d9a9" 00:34:41.838 ], 00:34:41.838 "product_name": "compress", 00:34:41.838 "block_size": 512, 00:34:41.838 "num_blocks": 200704, 00:34:41.838 "uuid": "7a45b62f-08f4-54b8-8fab-fe865954d9a9", 00:34:41.838 "assigned_rate_limits": { 00:34:41.838 "rw_ios_per_sec": 0, 00:34:41.838 "rw_mbytes_per_sec": 0, 00:34:41.838 "r_mbytes_per_sec": 0, 00:34:41.838 "w_mbytes_per_sec": 0 00:34:41.838 }, 00:34:41.838 "claimed": false, 00:34:41.838 "zoned": false, 00:34:41.838 
"supported_io_types": { 00:34:41.838 "read": true, 00:34:41.838 "write": true, 00:34:41.838 "unmap": false, 00:34:41.838 "flush": false, 00:34:41.838 "reset": false, 00:34:41.838 "nvme_admin": false, 00:34:41.838 "nvme_io": false, 00:34:41.838 "nvme_io_md": false, 00:34:41.838 "write_zeroes": true, 00:34:41.838 "zcopy": false, 00:34:41.839 "get_zone_info": false, 00:34:41.839 "zone_management": false, 00:34:41.839 "zone_append": false, 00:34:41.839 "compare": false, 00:34:41.839 "compare_and_write": false, 00:34:41.839 "abort": false, 00:34:41.839 "seek_hole": false, 00:34:41.839 "seek_data": false, 00:34:41.839 "copy": false, 00:34:41.839 "nvme_iov_md": false 00:34:41.839 }, 00:34:41.839 "driver_specific": { 00:34:41.839 "compress": { 00:34:41.839 "name": "COMP_lvs0/lv0", 00:34:41.839 "base_bdev_name": "a564a90d-c175-4474-a018-e945b8dca260", 00:34:41.839 "pm_path": "/tmp/pmem/996aab8e-4bfa-4a24-aaa4-a66cdcc3597b" 00:34:41.839 } 00:34:41.839 } 00:34:41.839 } 00:34:41.839 ] 00:34:41.839 06:50:55 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:34:41.839 06:50:55 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:41.839 Running I/O for 30 seconds... 00:35:13.982 00:35:13.982 Latency(us) 00:35:13.982 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:13.982 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 64, IO size: 16384) 00:35:13.982 Verification LBA range: start 0x0 length 0xc40 00:35:13.982 COMP_lvs0/lv0 : 30.01 1512.51 23.63 0.00 0.00 42074.48 231.01 39007.03 00:35:13.982 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 64, IO size: 16384) 00:35:13.982 Verification LBA range: start 0xc40 length 0xc40 00:35:13.982 COMP_lvs0/lv0 : 30.01 4763.63 74.43 0.00 0.00 13322.65 383.39 26214.40 00:35:13.982 =================================================================================================================== 00:35:13.982 Total : 6276.14 98.06 0.00 0.00 20252.04 231.01 39007.03 00:35:13.982 0 00:35:13.982 06:51:25 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:35:13.982 06:51:25 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:35:13.982 06:51:25 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:35:13.982 06:51:25 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:35:13.982 06:51:25 compress_isal -- compress/compress.sh@78 -- # killprocess 1324823 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1324823 ']' 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1324823 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@955 -- # uname 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1324823 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1324823' 00:35:13.982 killing process with pid 1324823 00:35:13.982 06:51:25 compress_isal -- 
common/autotest_common.sh@969 -- # kill 1324823 00:35:13.982 Received shutdown signal, test time was about 30.000000 seconds 00:35:13.982 00:35:13.982 Latency(us) 00:35:13.982 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:13.982 =================================================================================================================== 00:35:13.982 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:13.982 06:51:25 compress_isal -- common/autotest_common.sh@974 -- # wait 1324823 00:35:14.919 06:51:28 compress_isal -- compress/compress.sh@95 -- # export TEST_TRANSPORT=tcp 00:35:14.919 06:51:28 compress_isal -- compress/compress.sh@95 -- # TEST_TRANSPORT=tcp 00:35:14.919 06:51:28 compress_isal -- compress/compress.sh@96 -- # NET_TYPE=virt 00:35:14.919 06:51:28 compress_isal -- compress/compress.sh@96 -- # nvmftestinit 00:35:14.919 06:51:28 compress_isal -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:14.919 06:51:28 compress_isal -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:14.919 06:51:28 compress_isal -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:14.919 06:51:28 compress_isal -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:14.920 06:51:28 compress_isal -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:14.920 06:51:28 compress_isal -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@414 -- # [[ virt != virt ]] 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@423 -- # [[ virt == phy ]] 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@426 -- # [[ virt == phy-fallback ]] 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:14.920 Cannot find device "nvmf_tgt_br" 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@155 -- # true 
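For orientation: the nvmf_veth_init steps that follow build a small virtual test network in which the NVMe-oF target runs inside the nvmf_tgt_ns_spdk network namespace while the initiator stays in the root namespace, the two sides joined by veth pairs hanging off one Linux bridge. A minimal sketch of that topology, condensed from the ip/iptables calls logged below (the second target interface nvmf_tgt_if2/nvmf_tgt_br2 at 10.0.0.3 and the stale-link cleanup are left out for brevity):

  ip netns add nvmf_tgt_ns_spdk                                       # namespace for the target
  ip link add nvmf_init_if type veth peer name nvmf_init_br           # initiator-side veth pair
  ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br             # target-side veth pair
  ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk
  ip addr add 10.0.0.1/24 dev nvmf_init_if                            # initiator address
  ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if   # target address
  ip link set nvmf_init_if up && ip link set nvmf_init_br up && ip link set nvmf_tgt_br up
  ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
  ip link add nvmf_br type bridge && ip link set nvmf_br up           # bridge joining both sides
  ip link set nvmf_init_br master nvmf_br
  ip link set nvmf_tgt_br master nvmf_br
  iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT   # admit NVMe/TCP traffic
  ping -c 1 10.0.0.2                                                  # initiator -> target sanity check

Only 10.0.0.2:4420 is advertised as an NVMe/TCP listener later in this run; the 10.0.0.3 interface is set up by the common init code but is not exercised here.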
00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:14.920 Cannot find device "nvmf_tgt_br2" 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@156 -- # true 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:14.920 Cannot find device "nvmf_tgt_br" 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@158 -- # true 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:14.920 Cannot find device "nvmf_tgt_br2" 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@159 -- # true 00:35:14.920 06:51:28 compress_isal -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:15.180 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@162 -- # true 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:15.180 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@163 -- # true 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:15.180 06:51:28 compress_isal -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:15.440 06:51:28 compress_isal -- 
nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:15.440 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:15.440 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.072 ms 00:35:15.440 00:35:15.440 --- 10.0.0.2 ping statistics --- 00:35:15.440 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:15.440 rtt min/avg/max/mdev = 0.072/0.072/0.072/0.000 ms 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:15.440 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:15.440 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.073 ms 00:35:15.440 00:35:15.440 --- 10.0.0.3 ping statistics --- 00:35:15.440 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:15.440 rtt min/avg/max/mdev = 0.073/0.073/0.073/0.000 ms 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:15.440 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:35:15.440 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.034 ms 00:35:15.440 00:35:15.440 --- 10.0.0.1 ping statistics --- 00:35:15.440 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:15.440 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@433 -- # return 0 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:15.440 06:51:28 compress_isal -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:15.699 06:51:29 compress_isal -- compress/compress.sh@97 -- # nvmfappstart -m 0x7 00:35:15.699 06:51:29 compress_isal -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@724 -- # xtrace_disable 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:35:15.699 06:51:29 compress_isal -- nvmf/common.sh@481 -- # nvmfpid=1332295 00:35:15.699 06:51:29 compress_isal -- nvmf/common.sh@482 -- # waitforlisten 1332295 00:35:15.699 06:51:29 compress_isal -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1332295 ']' 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up 
and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:15.699 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:15.699 06:51:29 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:35:15.699 [2024-07-25 06:51:29.077982] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:35:15.699 [2024-07-25 06:51:29.078041] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:15.699 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:15.699 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:15.699 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:15.699 [2024-07-25 06:51:29.222648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:35:15.958 [2024-07-25 06:51:29.266811] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:15.958 [2024-07-25 06:51:29.266857] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:15.958 [2024-07-25 06:51:29.266870] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:15.958 [2024-07-25 06:51:29.266882] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:15.958 [2024-07-25 06:51:29.266892] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
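With nvmf_tgt now running inside the namespace, the test wires the compressed volume up to an NVMe-oF subsystem entirely through RPC. A condensed sketch of that sequence, assembled from the calls visible in this log (rpc.py abbreviates the full /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py path; the lvstore/lvol UUIDs and the /tmp/pmem file are regenerated on every run):

  rpc.py nvmf_create_transport -t tcp -u 8192                         # TCP transport; -u sets the in-capsule data size
  scripts/gen_nvme.sh | rpc.py load_subsystem_config                  # attach the local NVMe disk as Nvme0n1
  rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  rpc.py bdev_lvol_create -t -l lvs0 lv0 100                          # 100 MiB thin-provisioned volume
  rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem                # exposes COMP_lvs0/lv0
  rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0
  rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0
  rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420

Teardown mirrors this in reverse: bdev_compress_delete COMP_lvs0/lv0 followed by bdev_lvol_delete_lvstore -l lvs0, as at the end of the previous bdevperf pass.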
00:35:15.958 [2024-07-25 06:51:29.266952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:15.958 [2024-07-25 06:51:29.267044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:15.958 [2024-07-25 06:51:29.267048] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:16.522 06:51:29 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:16.522 06:51:29 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:35:16.522 06:51:29 compress_isal -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:16.522 06:51:29 compress_isal -- common/autotest_common.sh@730 -- # xtrace_disable 00:35:16.522 06:51:29 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:35:16.522 06:51:30 compress_isal -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:16.522 06:51:30 compress_isal -- compress/compress.sh@98 -- # trap 'nvmftestfini; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:35:16.522 06:51:30 compress_isal -- compress/compress.sh@101 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t tcp -u 8192 00:35:16.780 [2024-07-25 06:51:30.226427] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:16.780 06:51:30 compress_isal -- compress/compress.sh@102 -- # create_vols 00:35:16.780 06:51:30 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:35:16.780 06:51:30 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:35:20.062 06:51:33 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@901 -- # local i 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:20.062 06:51:33 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:35:20.319 [ 00:35:20.319 { 00:35:20.319 "name": "Nvme0n1", 00:35:20.319 "aliases": [ 00:35:20.319 "d377acae-bd18-425c-bebc-105bc289915c" 00:35:20.319 ], 00:35:20.319 "product_name": "NVMe disk", 00:35:20.319 "block_size": 512, 00:35:20.319 "num_blocks": 3907029168, 00:35:20.319 "uuid": "d377acae-bd18-425c-bebc-105bc289915c", 00:35:20.319 "assigned_rate_limits": { 00:35:20.319 "rw_ios_per_sec": 0, 00:35:20.319 "rw_mbytes_per_sec": 0, 00:35:20.319 "r_mbytes_per_sec": 0, 00:35:20.319 "w_mbytes_per_sec": 0 00:35:20.319 }, 00:35:20.319 "claimed": false, 00:35:20.319 "zoned": false, 00:35:20.319 "supported_io_types": { 00:35:20.319 "read": true, 00:35:20.319 "write": true, 00:35:20.319 "unmap": true, 00:35:20.319 "flush": true, 00:35:20.319 "reset": true, 00:35:20.319 "nvme_admin": true, 00:35:20.319 "nvme_io": true, 00:35:20.319 "nvme_io_md": false, 00:35:20.319 "write_zeroes": true, 00:35:20.319 "zcopy": false, 00:35:20.319 "get_zone_info": false, 00:35:20.319 "zone_management": false, 00:35:20.319 
"zone_append": false, 00:35:20.319 "compare": false, 00:35:20.319 "compare_and_write": false, 00:35:20.319 "abort": true, 00:35:20.319 "seek_hole": false, 00:35:20.319 "seek_data": false, 00:35:20.319 "copy": false, 00:35:20.319 "nvme_iov_md": false 00:35:20.319 }, 00:35:20.319 "driver_specific": { 00:35:20.319 "nvme": [ 00:35:20.319 { 00:35:20.319 "pci_address": "0000:d8:00.0", 00:35:20.319 "trid": { 00:35:20.319 "trtype": "PCIe", 00:35:20.319 "traddr": "0000:d8:00.0" 00:35:20.319 }, 00:35:20.319 "ctrlr_data": { 00:35:20.319 "cntlid": 0, 00:35:20.319 "vendor_id": "0x8086", 00:35:20.319 "model_number": "INTEL SSDPE2KX020T8", 00:35:20.319 "serial_number": "BTLJ125505KA2P0BGN", 00:35:20.319 "firmware_revision": "VDV10170", 00:35:20.319 "oacs": { 00:35:20.319 "security": 0, 00:35:20.319 "format": 1, 00:35:20.319 "firmware": 1, 00:35:20.319 "ns_manage": 1 00:35:20.319 }, 00:35:20.319 "multi_ctrlr": false, 00:35:20.319 "ana_reporting": false 00:35:20.319 }, 00:35:20.319 "vs": { 00:35:20.319 "nvme_version": "1.2" 00:35:20.319 }, 00:35:20.319 "ns_data": { 00:35:20.319 "id": 1, 00:35:20.319 "can_share": false 00:35:20.319 } 00:35:20.319 } 00:35:20.319 ], 00:35:20.319 "mp_policy": "active_passive" 00:35:20.319 } 00:35:20.319 } 00:35:20.319 ] 00:35:20.319 06:51:33 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:35:20.319 06:51:33 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:35:21.692 97b84b40-eb45-40a4-b317-8ac9423273f7 00:35:21.692 06:51:35 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:35:21.950 d3d815c7-a456-4ff0-a5fc-d82ec03bb854 00:35:21.950 06:51:35 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:35:21.950 06:51:35 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:35:21.950 06:51:35 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:35:21.950 06:51:35 compress_isal -- common/autotest_common.sh@901 -- # local i 00:35:21.950 06:51:35 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:35:21.950 06:51:35 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:35:21.950 06:51:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:22.209 06:51:35 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:35:22.209 [ 00:35:22.209 { 00:35:22.209 "name": "d3d815c7-a456-4ff0-a5fc-d82ec03bb854", 00:35:22.209 "aliases": [ 00:35:22.209 "lvs0/lv0" 00:35:22.209 ], 00:35:22.209 "product_name": "Logical Volume", 00:35:22.209 "block_size": 512, 00:35:22.209 "num_blocks": 204800, 00:35:22.209 "uuid": "d3d815c7-a456-4ff0-a5fc-d82ec03bb854", 00:35:22.209 "assigned_rate_limits": { 00:35:22.209 "rw_ios_per_sec": 0, 00:35:22.209 "rw_mbytes_per_sec": 0, 00:35:22.209 "r_mbytes_per_sec": 0, 00:35:22.209 "w_mbytes_per_sec": 0 00:35:22.209 }, 00:35:22.209 "claimed": false, 00:35:22.209 "zoned": false, 00:35:22.209 "supported_io_types": { 00:35:22.209 "read": true, 00:35:22.209 "write": true, 00:35:22.209 "unmap": true, 00:35:22.209 "flush": false, 00:35:22.209 "reset": true, 00:35:22.209 "nvme_admin": false, 00:35:22.209 "nvme_io": false, 00:35:22.209 "nvme_io_md": false, 00:35:22.209 
"write_zeroes": true, 00:35:22.209 "zcopy": false, 00:35:22.209 "get_zone_info": false, 00:35:22.209 "zone_management": false, 00:35:22.209 "zone_append": false, 00:35:22.209 "compare": false, 00:35:22.209 "compare_and_write": false, 00:35:22.209 "abort": false, 00:35:22.209 "seek_hole": true, 00:35:22.209 "seek_data": true, 00:35:22.209 "copy": false, 00:35:22.209 "nvme_iov_md": false 00:35:22.209 }, 00:35:22.209 "driver_specific": { 00:35:22.209 "lvol": { 00:35:22.209 "lvol_store_uuid": "97b84b40-eb45-40a4-b317-8ac9423273f7", 00:35:22.209 "base_bdev": "Nvme0n1", 00:35:22.209 "thin_provision": true, 00:35:22.209 "num_allocated_clusters": 0, 00:35:22.209 "snapshot": false, 00:35:22.209 "clone": false, 00:35:22.209 "esnap_clone": false 00:35:22.209 } 00:35:22.209 } 00:35:22.209 } 00:35:22.209 ] 00:35:22.209 06:51:35 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:35:22.209 06:51:35 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:35:22.209 06:51:35 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:35:22.468 [2024-07-25 06:51:35.951652] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:35:22.468 COMP_lvs0/lv0 00:35:22.468 06:51:35 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:35:22.468 06:51:35 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:35:22.468 06:51:35 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:35:22.468 06:51:35 compress_isal -- common/autotest_common.sh@901 -- # local i 00:35:22.468 06:51:35 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:35:22.468 06:51:35 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:35:22.468 06:51:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:22.727 06:51:36 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:35:22.986 [ 00:35:22.986 { 00:35:22.986 "name": "COMP_lvs0/lv0", 00:35:22.986 "aliases": [ 00:35:22.986 "ffe553e0-65fc-57cf-9890-c539aa1b26fe" 00:35:22.986 ], 00:35:22.986 "product_name": "compress", 00:35:22.986 "block_size": 512, 00:35:22.986 "num_blocks": 200704, 00:35:22.986 "uuid": "ffe553e0-65fc-57cf-9890-c539aa1b26fe", 00:35:22.986 "assigned_rate_limits": { 00:35:22.986 "rw_ios_per_sec": 0, 00:35:22.986 "rw_mbytes_per_sec": 0, 00:35:22.986 "r_mbytes_per_sec": 0, 00:35:22.986 "w_mbytes_per_sec": 0 00:35:22.986 }, 00:35:22.986 "claimed": false, 00:35:22.986 "zoned": false, 00:35:22.986 "supported_io_types": { 00:35:22.986 "read": true, 00:35:22.986 "write": true, 00:35:22.986 "unmap": false, 00:35:22.986 "flush": false, 00:35:22.986 "reset": false, 00:35:22.986 "nvme_admin": false, 00:35:22.986 "nvme_io": false, 00:35:22.986 "nvme_io_md": false, 00:35:22.986 "write_zeroes": true, 00:35:22.986 "zcopy": false, 00:35:22.986 "get_zone_info": false, 00:35:22.986 "zone_management": false, 00:35:22.986 "zone_append": false, 00:35:22.986 "compare": false, 00:35:22.986 "compare_and_write": false, 00:35:22.986 "abort": false, 00:35:22.986 "seek_hole": false, 00:35:22.986 "seek_data": false, 00:35:22.986 "copy": false, 00:35:22.987 "nvme_iov_md": false 00:35:22.987 }, 00:35:22.987 "driver_specific": { 00:35:22.987 
"compress": { 00:35:22.987 "name": "COMP_lvs0/lv0", 00:35:22.987 "base_bdev_name": "d3d815c7-a456-4ff0-a5fc-d82ec03bb854", 00:35:22.987 "pm_path": "/tmp/pmem/c1e1fa9a-710a-47f7-a373-754bb6cfabfd" 00:35:22.987 } 00:35:22.987 } 00:35:22.987 } 00:35:22.987 ] 00:35:22.987 06:51:36 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:35:22.987 06:51:36 compress_isal -- compress/compress.sh@103 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:35:23.246 06:51:36 compress_isal -- compress/compress.sh@104 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0 00:35:23.505 06:51:36 compress_isal -- compress/compress.sh@105 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420 00:35:23.505 [2024-07-25 06:51:37.032492] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:23.505 06:51:37 compress_isal -- compress/compress.sh@109 -- # perf_pid=1333613 00:35:23.505 06:51:37 compress_isal -- compress/compress.sh@112 -- # trap 'killprocess $perf_pid; compress_err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:35:23.505 06:51:37 compress_isal -- compress/compress.sh@108 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 64 -s 512 -w randrw -t 30 -c 0x18 -M 50 00:35:23.505 06:51:37 compress_isal -- compress/compress.sh@113 -- # wait 1333613 00:35:23.764 [2024-07-25 06:51:37.219845] subsystem.c:1572:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on TCP/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:35:55.877 Initializing NVMe Controllers 00:35:55.877 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:35:55.877 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:35:55.877 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:35:55.877 Initialization complete. Launching workers. 
00:35:55.877 ======================================================== 00:35:55.877 Latency(us) 00:35:55.877 Device Information : IOPS MiB/s Average min max 00:35:55.877 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 5037.64 19.68 12707.05 1811.36 30398.49 00:35:55.877 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 3165.36 12.36 20222.88 2250.02 38965.81 00:35:55.877 ======================================================== 00:35:55.877 Total : 8203.00 32.04 15607.25 1811.36 38965.81 00:35:55.877 00:35:55.877 06:52:07 compress_isal -- compress/compress.sh@114 -- # destroy_vols 00:35:55.877 06:52:07 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:35:55.877 06:52:07 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:35:55.877 06:52:07 compress_isal -- compress/compress.sh@116 -- # trap - SIGINT SIGTERM EXIT 00:35:55.878 06:52:07 compress_isal -- compress/compress.sh@117 -- # nvmftestfini 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@117 -- # sync 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@120 -- # set +e 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:55.878 rmmod nvme_tcp 00:35:55.878 rmmod nvme_fabrics 00:35:55.878 rmmod nvme_keyring 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@124 -- # set -e 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@125 -- # return 0 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@489 -- # '[' -n 1332295 ']' 00:35:55.878 06:52:07 compress_isal -- nvmf/common.sh@490 -- # killprocess 1332295 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1332295 ']' 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1332295 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@955 -- # uname 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1332295 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1332295' 00:35:55.878 killing process with pid 1332295 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@969 -- # kill 1332295 00:35:55.878 06:52:07 compress_isal -- common/autotest_common.sh@974 -- # wait 1332295 00:35:56.816 06:52:10 compress_isal -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:56.816 06:52:10 compress_isal -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:56.816 06:52:10 compress_isal -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:56.816 06:52:10 compress_isal -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:56.816 06:52:10 compress_isal -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:56.816 06:52:10 
compress_isal -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:56.816 06:52:10 compress_isal -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:56.816 06:52:10 compress_isal -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:57.076 06:52:10 compress_isal -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:57.076 06:52:10 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:35:57.076 00:35:57.076 real 2m12.097s 00:35:57.076 user 6m4.658s 00:35:57.076 sys 0m20.134s 00:35:57.076 06:52:10 compress_isal -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:57.076 06:52:10 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:35:57.076 ************************************ 00:35:57.076 END TEST compress_isal 00:35:57.076 ************************************ 00:35:57.076 06:52:10 -- spdk/autotest.sh@356 -- # '[' 0 -eq 1 ']' 00:35:57.076 06:52:10 -- spdk/autotest.sh@360 -- # '[' 1 -eq 1 ']' 00:35:57.076 06:52:10 -- spdk/autotest.sh@361 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:35:57.076 06:52:10 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:35:57.076 06:52:10 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:57.076 06:52:10 -- common/autotest_common.sh@10 -- # set +x 00:35:57.076 ************************************ 00:35:57.076 START TEST blockdev_crypto_aesni 00:35:57.076 ************************************ 00:35:57.076 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:35:57.076 * Looking for test storage... 00:35:57.335 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:57.335 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:35:57.335 06:52:10 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:35:57.336 06:52:10 
blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1338978 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1338978 00:35:57.336 06:52:10 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:35:57.336 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # '[' -z 1338978 ']' 00:35:57.336 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:57.336 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:57.336 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:57.336 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:57.336 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:57.336 06:52:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:57.336 [2024-07-25 06:52:10.717845] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
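The crypto_aesni pass starts spdk_tgt with --wait-for-rpc, which brings the application up but defers subsystem initialization until the test has pushed its accel configuration over the RPC socket via rpc_cmd. Judging from the notices further down (driver crypto_aesni_mb selected, encrypt/decrypt assigned to dpdk_cryptodev, four malloc bdevs wrapped by crypto_ram..crypto_ram4 with keys test_dek_aesni_cbc_1..4), that deferred configuration amounts to something like the sketch below; only one of the four malloc/crypto pairs is shown, and the exact flag spellings depend on the SPDK revision under test and should be read as assumptions, not a transcript:

  # sketch of the deferred accel/crypto configuration; flag names are assumed here,
  # and the malloc size matches the 65536 x 512-byte geometry reported for crypto_ram below
  rpc.py dpdk_cryptodev_scan_accel_module
  rpc.py dpdk_cryptodev_set_driver -d crypto_aesni_mb
  rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev
  rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
  rpc.py framework_start_init
  rpc.py bdev_malloc_create -b Malloc0 32 512
  rpc.py bdev_crypto_create Malloc0 crypto_ram -n test_dek_aesni_cbc_1   # key assumed registered beforehand (accel_crypto_key_create)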
00:35:57.336 [2024-07-25 06:52:10.717891] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338978 ] 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.0 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.1 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.2 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.3 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.4 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.5 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.6 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:01.7 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.0 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.1 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.2 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.3 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.4 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.5 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.6 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3d:02.7 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.0 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.1 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.2 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.3 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.4 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.5 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.6 cannot be used 00:35:57.336 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:01.7 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.0 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.1 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.2 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.3 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.4 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.5 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.6 cannot be used 00:35:57.336 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:35:57.336 EAL: Requested device 0000:3f:02.7 cannot be used 00:35:57.336 [2024-07-25 06:52:10.838758] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:57.336 [2024-07-25 06:52:10.882894] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:58.275 06:52:11 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:58.275 06:52:11 blockdev_crypto_aesni -- common/autotest_common.sh@864 -- # return 0 00:35:58.275 06:52:11 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:35:58.275 06:52:11 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:35:58.275 06:52:11 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:35:58.275 06:52:11 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:58.275 06:52:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:58.275 [2024-07-25 06:52:11.573040] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:35:58.275 [2024-07-25 06:52:11.581074] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:58.275 [2024-07-25 06:52:11.589090] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:58.275 [2024-07-25 06:52:11.665022] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:36:00.811 true 00:36:00.811 true 00:36:00.811 true 00:36:00.811 true 00:36:00.811 Malloc0 00:36:00.811 Malloc1 00:36:00.811 Malloc2 00:36:00.811 Malloc3 00:36:00.811 [2024-07-25 06:52:14.133533] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:36:00.812 crypto_ram 00:36:00.812 [2024-07-25 06:52:14.141553] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:36:00.812 crypto_ram2 00:36:00.812 [2024-07-25 06:52:14.149573] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:36:00.812 crypto_ram3 00:36:00.812 [2024-07-25 06:52:14.157597] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:36:00.812 crypto_ram4 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # 
[[ 0 == 0 ]] 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:00.812 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:36:00.812 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c3798b95-15ca-5f09-ae04-56c05ca8f5c9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c3798b95-15ca-5f09-ae04-56c05ca8f5c9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": 
"crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "45f0f0c1-df77-58dc-accd-477a1370ac4e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "45f0f0c1-df77-58dc-accd-477a1370ac4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "3af39778-50f4-5282-8d65-5da263583e4a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3af39778-50f4-5282-8d65-5da263583e4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "4e3b3b23-372f-5e77-9d0b-ca45e52a0740"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4e3b3b23-372f-5e77-9d0b-ca45e52a0740",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:36:01.072 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # 
bdev_list=("${bdevs_name[@]}") 00:36:01.072 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:36:01.072 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:36:01.072 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 1338978 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # '[' -z 1338978 ']' 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # kill -0 1338978 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # uname 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1338978 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1338978' 00:36:01.072 killing process with pid 1338978 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@969 -- # kill 1338978 00:36:01.072 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@974 -- # wait 1338978 00:36:01.331 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:36:01.331 06:52:14 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:36:01.331 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:36:01.331 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:01.331 06:52:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:01.591 ************************************ 00:36:01.591 START TEST bdev_hello_world 00:36:01.591 ************************************ 00:36:01.591 06:52:14 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:36:01.591 [2024-07-25 06:52:14.975298] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
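The first sub-test, bdev_hello_world, points the stock hello_bdev example at the freshly created crypto bdev:

  # paths shortened; the log uses the absolute /var/jenkins/workspace/crypto-phy-autotest/spdk prefix
  build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram

Here --json supplies the bdev configuration assembled above and -b names the bdev to open; as shipped with SPDK, the example writes a short greeting to the start of the bdev and reads it back, so a pass indicates the AESNI encrypt/decrypt path round-trips data through crypto_ram correctly.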
00:36:01.591 [2024-07-25 06:52:14.975353] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339786 ] 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.4 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.6 cannot be used 
00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:01.591 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:01.591 EAL: Requested device 0000:3f:02.7 cannot be used 00:36:01.591 [2024-07-25 06:52:15.110443] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:01.850 [2024-07-25 06:52:15.154661] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:01.850 [2024-07-25 06:52:15.175913] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:36:01.850 [2024-07-25 06:52:15.183939] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:01.850 [2024-07-25 06:52:15.191958] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:01.850 [2024-07-25 06:52:15.302442] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:36:04.386 [2024-07-25 06:52:17.628610] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:36:04.386 [2024-07-25 06:52:17.628678] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:04.386 [2024-07-25 06:52:17.628691] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:04.386 [2024-07-25 06:52:17.636628] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:36:04.386 [2024-07-25 06:52:17.636647] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:04.386 [2024-07-25 06:52:17.636658] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:04.386 [2024-07-25 06:52:17.644648] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:36:04.386 [2024-07-25 06:52:17.644668] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:04.386 [2024-07-25 06:52:17.644679] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:04.386 [2024-07-25 06:52:17.652669] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:36:04.386 [2024-07-25 06:52:17.652685] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:04.386 [2024-07-25 
06:52:17.652695] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:04.386 [2024-07-25 06:52:17.724400] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:36:04.386 [2024-07-25 06:52:17.724439] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:36:04.386 [2024-07-25 06:52:17.724456] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:36:04.386 [2024-07-25 06:52:17.725614] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:36:04.386 [2024-07-25 06:52:17.725691] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:36:04.386 [2024-07-25 06:52:17.725706] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:36:04.386 [2024-07-25 06:52:17.725747] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:36:04.386 00:36:04.386 [2024-07-25 06:52:17.725764] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:36:04.645 00:36:04.645 real 0m3.103s 00:36:04.645 user 0m2.593s 00:36:04.645 sys 0m0.472s 00:36:04.645 06:52:18 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:04.645 06:52:18 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:36:04.645 ************************************ 00:36:04.645 END TEST bdev_hello_world 00:36:04.645 ************************************ 00:36:04.645 06:52:18 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:36:04.645 06:52:18 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:36:04.645 06:52:18 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:04.645 06:52:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:04.645 ************************************ 00:36:04.645 START TEST bdev_bounds 00:36:04.645 ************************************ 00:36:04.645 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1340328 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1340328' 00:36:04.646 Process bdevio pid: 1340328 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1340328 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1340328 ']' 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:04.646 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
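[Editor's note] The bdev_hello_world stage above runs the stock SPDK example binary against the first crypto vbdev. Roughly, the invocation and the expected flow, matching the NOTICE lines in the trace (paths as used in this workspace):

    # run the example against crypto_ram as defined in test/bdev/bdev.json
    ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b crypto_ram
    # the example opens the bdev, opens an I/O channel, writes one block,
    # reads it back, prints "Read string from bdev : Hello World!",
    # and then stops the app.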
00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:04.646 06:52:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:36:04.646 [2024-07-25 06:52:18.167676] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:36:04.646 [2024-07-25 06:52:18.167735] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1340328 ] 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: 
Requested device 0000:3f:01.4 cannot be used 00:36:04.905 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.905 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:01.6 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:04.906 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:04.906 EAL: Requested device 0000:3f:02.7 cannot be used 00:36:04.906 [2024-07-25 06:52:18.304432] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:36:04.906 [2024-07-25 06:52:18.351777] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:04.906 [2024-07-25 06:52:18.351869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:36:04.906 [2024-07-25 06:52:18.351872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:04.906 [2024-07-25 06:52:18.373157] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:36:04.906 [2024-07-25 06:52:18.381182] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:04.906 [2024-07-25 06:52:18.389203] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:05.165 [2024-07-25 06:52:18.486173] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:36:07.703 [2024-07-25 06:52:20.814084] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:36:07.703 [2024-07-25 06:52:20.814166] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:07.703 [2024-07-25 06:52:20.814180] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:07.703 [2024-07-25 06:52:20.822094] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:36:07.703 [2024-07-25 06:52:20.822114] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:07.703 [2024-07-25 06:52:20.822125] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:07.703 [2024-07-25 06:52:20.830118] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:36:07.704 
[2024-07-25 06:52:20.830135] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:07.704 [2024-07-25 06:52:20.830150] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:07.704 [2024-07-25 06:52:20.838150] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:36:07.704 [2024-07-25 06:52:20.838166] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:07.704 [2024-07-25 06:52:20.838176] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:07.704 06:52:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:07.704 06:52:20 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:36:07.704 06:52:20 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:36:07.704 I/O targets: 00:36:07.704 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:36:07.704 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:36:07.704 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:36:07.704 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:36:07.704 00:36:07.704 00:36:07.704 CUnit - A unit testing framework for C - Version 2.1-3 00:36:07.704 http://cunit.sourceforge.net/ 00:36:07.704 00:36:07.704 00:36:07.704 Suite: bdevio tests on: crypto_ram4 00:36:07.704 Test: blockdev write read block ...passed 00:36:07.704 Test: blockdev write zeroes read block ...passed 00:36:07.704 Test: blockdev write zeroes read no split ...passed 00:36:07.704 Test: blockdev write zeroes read split ...passed 00:36:07.704 Test: blockdev write zeroes read split partial ...passed 00:36:07.704 Test: blockdev reset ...passed 00:36:07.704 Test: blockdev write read 8 blocks ...passed 00:36:07.704 Test: blockdev write read size > 128k ...passed 00:36:07.704 Test: blockdev write read invalid size ...passed 00:36:07.704 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:07.704 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:07.704 Test: blockdev write read max offset ...passed 00:36:07.704 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:07.704 Test: blockdev writev readv 8 blocks ...passed 00:36:07.704 Test: blockdev writev readv 30 x 1block ...passed 00:36:07.704 Test: blockdev writev readv block ...passed 00:36:07.704 Test: blockdev writev readv size > 128k ...passed 00:36:07.704 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:07.704 Test: blockdev comparev and writev ...passed 00:36:07.704 Test: blockdev nvme passthru rw ...passed 00:36:07.704 Test: blockdev nvme passthru vendor specific ...passed 00:36:07.704 Test: blockdev nvme admin passthru ...passed 00:36:07.704 Test: blockdev copy ...passed 00:36:07.704 Suite: bdevio tests on: crypto_ram3 00:36:07.704 Test: blockdev write read block ...passed 00:36:07.704 Test: blockdev write zeroes read block ...passed 00:36:07.704 Test: blockdev write zeroes read no split ...passed 00:36:07.704 Test: blockdev write zeroes read split ...passed 00:36:07.704 Test: blockdev write zeroes read split partial ...passed 00:36:07.704 Test: blockdev reset ...passed 00:36:07.704 Test: blockdev write read 8 blocks ...passed 00:36:07.704 Test: blockdev write read size > 128k ...passed 00:36:07.704 Test: blockdev write read invalid 
size ...passed 00:36:07.704 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:07.704 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:07.704 Test: blockdev write read max offset ...passed 00:36:07.704 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:07.704 Test: blockdev writev readv 8 blocks ...passed 00:36:07.704 Test: blockdev writev readv 30 x 1block ...passed 00:36:07.704 Test: blockdev writev readv block ...passed 00:36:07.704 Test: blockdev writev readv size > 128k ...passed 00:36:07.704 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:07.704 Test: blockdev comparev and writev ...passed 00:36:07.704 Test: blockdev nvme passthru rw ...passed 00:36:07.704 Test: blockdev nvme passthru vendor specific ...passed 00:36:07.704 Test: blockdev nvme admin passthru ...passed 00:36:07.704 Test: blockdev copy ...passed 00:36:07.704 Suite: bdevio tests on: crypto_ram2 00:36:07.704 Test: blockdev write read block ...passed 00:36:07.704 Test: blockdev write zeroes read block ...passed 00:36:07.704 Test: blockdev write zeroes read no split ...passed 00:36:07.704 Test: blockdev write zeroes read split ...passed 00:36:07.704 Test: blockdev write zeroes read split partial ...passed 00:36:07.704 Test: blockdev reset ...passed 00:36:07.704 Test: blockdev write read 8 blocks ...passed 00:36:07.704 Test: blockdev write read size > 128k ...passed 00:36:07.704 Test: blockdev write read invalid size ...passed 00:36:07.704 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:07.704 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:07.704 Test: blockdev write read max offset ...passed 00:36:07.704 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:07.704 Test: blockdev writev readv 8 blocks ...passed 00:36:07.704 Test: blockdev writev readv 30 x 1block ...passed 00:36:07.704 Test: blockdev writev readv block ...passed 00:36:07.704 Test: blockdev writev readv size > 128k ...passed 00:36:07.704 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:07.704 Test: blockdev comparev and writev ...passed 00:36:07.704 Test: blockdev nvme passthru rw ...passed 00:36:07.704 Test: blockdev nvme passthru vendor specific ...passed 00:36:07.704 Test: blockdev nvme admin passthru ...passed 00:36:07.704 Test: blockdev copy ...passed 00:36:07.704 Suite: bdevio tests on: crypto_ram 00:36:07.704 Test: blockdev write read block ...passed 00:36:07.704 Test: blockdev write zeroes read block ...passed 00:36:07.704 Test: blockdev write zeroes read no split ...passed 00:36:07.704 Test: blockdev write zeroes read split ...passed 00:36:08.040 Test: blockdev write zeroes read split partial ...passed 00:36:08.040 Test: blockdev reset ...passed 00:36:08.040 Test: blockdev write read 8 blocks ...passed 00:36:08.040 Test: blockdev write read size > 128k ...passed 00:36:08.040 Test: blockdev write read invalid size ...passed 00:36:08.040 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:08.040 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:08.040 Test: blockdev write read max offset ...passed 00:36:08.040 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:08.040 Test: blockdev writev readv 8 blocks ...passed 00:36:08.040 Test: blockdev writev readv 30 x 1block ...passed 00:36:08.040 Test: blockdev writev readv block ...passed 00:36:08.040 Test: blockdev writev 
readv size > 128k ...passed 00:36:08.040 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:08.040 Test: blockdev comparev and writev ...passed 00:36:08.040 Test: blockdev nvme passthru rw ...passed 00:36:08.040 Test: blockdev nvme passthru vendor specific ...passed 00:36:08.040 Test: blockdev nvme admin passthru ...passed 00:36:08.040 Test: blockdev copy ...passed 00:36:08.040 00:36:08.040 Run Summary: Type Total Ran Passed Failed Inactive 00:36:08.040 suites 4 4 n/a 0 0 00:36:08.040 tests 92 92 92 0 0 00:36:08.040 asserts 520 520 520 0 n/a 00:36:08.040 00:36:08.040 Elapsed time = 0.493 seconds 00:36:08.040 0 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1340328 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1340328 ']' 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1340328 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1340328 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1340328' 00:36:08.040 killing process with pid 1340328 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1340328 00:36:08.040 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1340328 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:36:08.301 00:36:08.301 real 0m3.551s 00:36:08.301 user 0m9.903s 00:36:08.301 sys 0m0.693s 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:36:08.301 ************************************ 00:36:08.301 END TEST bdev_bounds 00:36:08.301 ************************************ 00:36:08.301 06:52:21 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:36:08.301 06:52:21 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:36:08.301 06:52:21 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:08.301 06:52:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:08.301 ************************************ 00:36:08.301 START TEST bdev_nbd 00:36:08.301 ************************************ 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux 
]] 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1340891 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1340891 /var/tmp/spdk-nbd.sock 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1340891 ']' 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:36:08.301 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:08.301 06:52:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:36:08.301 [2024-07-25 06:52:21.812382] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
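[Editor's note] The bdev_nbd stage that starts here exports each crypto vbdev through the kernel NBD driver via the spdk-nbd RPC socket. A sketch of the per-device loop the nbd_common.sh helpers perform (device and bdev names taken from this run; the dd output path is shortened here):

    # attach a bdev to a kernel NBD device via the bdev_svc RPC socket
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
    # waitfornbd-style sanity read of one block through the kernel block layer
    dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    # list attached devices, then detach again
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0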
00:36:08.301 [2024-07-25 06:52:21.812439] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.4 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.6 cannot be used 00:36:08.562 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:08.562 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:08.562 EAL: Requested device 0000:3f:02.7 cannot be used 00:36:08.562 [2024-07-25 06:52:21.948582] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:08.562 [2024-07-25 06:52:21.993161] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:08.562 [2024-07-25 06:52:22.014384] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:36:08.562 [2024-07-25 06:52:22.022411] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:08.562 [2024-07-25 06:52:22.030430] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:08.822 [2024-07-25 06:52:22.139495] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:36:11.361 [2024-07-25 06:52:24.472628] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:36:11.361 [2024-07-25 06:52:24.472884] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:11.361 [2024-07-25 06:52:24.472899] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.361 [2024-07-25 06:52:24.480647] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:36:11.361 [2024-07-25 06:52:24.480665] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:11.361 [2024-07-25 06:52:24.480676] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.361 [2024-07-25 06:52:24.488667] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:36:11.361 [2024-07-25 06:52:24.488683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:11.361 [2024-07-25 06:52:24.488693] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.361 [2024-07-25 06:52:24.496687] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:36:11.361 [2024-07-25 06:52:24.496703] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:11.361 [2024-07-25 06:52:24.496713] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:11.361 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:11.362 1+0 records in 00:36:11.362 1+0 records out 00:36:11.362 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272254 s, 15.0 MB/s 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:11.362 06:52:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:11.621 1+0 records in 00:36:11.621 1+0 records out 00:36:11.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319757 s, 12.8 MB/s 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:11.621 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:11.880 1+0 records in 00:36:11.880 1+0 records out 00:36:11.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339313 s, 12.1 MB/s 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:11.880 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:12.140 06:52:25 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:12.140 1+0 records in 00:36:12.140 1+0 records out 00:36:12.140 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359201 s, 11.4 MB/s 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:12.140 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd0", 00:36:12.400 "bdev_name": "crypto_ram" 00:36:12.400 }, 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd1", 00:36:12.400 "bdev_name": "crypto_ram2" 00:36:12.400 }, 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd2", 00:36:12.400 "bdev_name": "crypto_ram3" 00:36:12.400 }, 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd3", 00:36:12.400 "bdev_name": "crypto_ram4" 00:36:12.400 } 00:36:12.400 ]' 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd0", 00:36:12.400 "bdev_name": "crypto_ram" 00:36:12.400 }, 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd1", 00:36:12.400 "bdev_name": "crypto_ram2" 00:36:12.400 }, 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd2", 00:36:12.400 "bdev_name": "crypto_ram3" 00:36:12.400 }, 00:36:12.400 { 00:36:12.400 "nbd_device": "/dev/nbd3", 00:36:12.400 "bdev_name": "crypto_ram4" 00:36:12.400 } 00:36:12.400 ]' 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:36:12.400 06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:12.400 
06:52:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:12.660 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:12.920 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:13.181 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:13.440 06:52:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:13.700 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:36:13.960 /dev/nbd0 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:13.960 1+0 records in 00:36:13.960 1+0 records out 00:36:13.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281421 s, 14.6 MB/s 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:13.960 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:36:14.220 /dev/nbd1 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:14.220 1+0 records in 00:36:14.220 1+0 records out 00:36:14.220 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285835 s, 14.3 MB/s 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.220 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:36:14.480 /dev/nbd10 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:14.480 06:52:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:36:14.480 1+0 records in 00:36:14.480 1+0 records out 00:36:14.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261869 s, 15.6 MB/s 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.481 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:36:14.740 /dev/nbd11 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:14.740 1+0 records in 00:36:14.740 1+0 records out 00:36:14.740 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000357814 s, 11.4 MB/s 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:14.740 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:14.741 
06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:14.741 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:15.000 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd0", 00:36:15.000 "bdev_name": "crypto_ram" 00:36:15.000 }, 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd1", 00:36:15.000 "bdev_name": "crypto_ram2" 00:36:15.000 }, 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd10", 00:36:15.000 "bdev_name": "crypto_ram3" 00:36:15.000 }, 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd11", 00:36:15.000 "bdev_name": "crypto_ram4" 00:36:15.000 } 00:36:15.000 ]' 00:36:15.000 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd0", 00:36:15.000 "bdev_name": "crypto_ram" 00:36:15.000 }, 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd1", 00:36:15.000 "bdev_name": "crypto_ram2" 00:36:15.000 }, 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd10", 00:36:15.000 "bdev_name": "crypto_ram3" 00:36:15.000 }, 00:36:15.000 { 00:36:15.000 "nbd_device": "/dev/nbd11", 00:36:15.000 "bdev_name": "crypto_ram4" 00:36:15.000 } 00:36:15.000 ]' 00:36:15.000 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:36:15.259 /dev/nbd1 00:36:15.259 /dev/nbd10 00:36:15.259 /dev/nbd11' 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:36:15.259 /dev/nbd1 00:36:15.259 /dev/nbd10 00:36:15.259 /dev/nbd11' 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:36:15.259 256+0 records in 00:36:15.259 256+0 records out 00:36:15.259 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104328 s, 101 MB/s 00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:36:15.259 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:36:15.260 256+0 records in 00:36:15.260 256+0 records out 00:36:15.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0561145 s, 18.7 MB/s 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:36:15.260 256+0 records in 00:36:15.260 256+0 records out 00:36:15.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0592901 s, 17.7 MB/s 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:36:15.260 256+0 records in 00:36:15.260 256+0 records out 00:36:15.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0356571 s, 29.4 MB/s 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:36:15.260 256+0 records in 00:36:15.260 256+0 records out 00:36:15.260 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.034353 s, 30.5 MB/s 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.260 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:15.519 
06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:15.519 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:36:15.520 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.520 06:52:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:36:15.520 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:15.520 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:15.520 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:15.520 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:15.520 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:15.520 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:15.779 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename 
/dev/nbd10 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:16.038 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:36:16.297 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:16.298 06:52:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:16.557 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:36:16.557 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:36:16.557 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 
/dev/nbd11' 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:36:16.817 malloc_lvol_verify 00:36:16.817 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:36:17.076 531d08ad-0373-4868-8444-3174a34ba832 00:36:17.076 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:36:17.335 e4f2c927-e0be-4e81-9776-69fad37f8d65 00:36:17.335 06:52:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:36:17.595 /dev/nbd0 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:36:17.595 mke2fs 1.46.5 (30-Dec-2021) 00:36:17.595 Discarding device blocks: 0/4096 done 00:36:17.595 Creating filesystem with 4096 1k blocks and 1024 inodes 00:36:17.595 00:36:17.595 Allocating group tables: 0/1 done 00:36:17.595 Writing inode tables: 0/1 done 00:36:17.595 Creating journal (1024 blocks): done 00:36:17.595 Writing superblocks and filesystem accounting information: 0/1 done 00:36:17.595 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:17.595 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 
/proc/partitions 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1340891 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1340891 ']' 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1340891 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1340891 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1340891' 00:36:17.853 killing process with pid 1340891 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1340891 00:36:17.853 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1340891 00:36:18.111 06:52:31 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:36:18.111 00:36:18.111 real 0m9.920s 00:36:18.111 user 0m12.769s 00:36:18.111 sys 0m4.043s 00:36:18.111 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:18.111 06:52:31 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:36:18.111 ************************************ 00:36:18.111 END TEST bdev_nbd 00:36:18.111 ************************************ 00:36:18.370 06:52:31 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:36:18.370 06:52:31 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:36:18.370 06:52:31 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:18.370 06:52:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:18.370 ************************************ 00:36:18.370 START TEST bdev_fio 00:36:18.370 ************************************ 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:36:18.370 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:36:18.370 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:18.371 ************************************ 00:36:18.371 START TEST bdev_fio_rw_verify 00:36:18.371 ************************************ 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:36:18.371 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:18.655 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:36:18.655 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:36:18.655 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:36:18.655 06:52:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:18.924 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:18.924 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:18.924 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:18.924 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:18.924 fio-3.35 00:36:18.924 Starting 4 threads 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:36:18.924 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.4 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.6 cannot be used 00:36:18.924 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.924 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:18.925 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:18.925 EAL: Requested 
device 0000:3f:02.7 cannot be used 00:36:33.844 00:36:33.844 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1343339: Thu Jul 25 06:52:45 2024 00:36:33.844 read: IOPS=23.3k, BW=91.0MiB/s (95.4MB/s)(910MiB/10001msec) 00:36:33.844 slat (usec): min=15, max=1493, avg=57.70, stdev=37.05 00:36:33.844 clat (usec): min=11, max=1947, avg=303.78, stdev=210.94 00:36:33.844 lat (usec): min=47, max=2191, avg=361.47, stdev=232.92 00:36:33.844 clat percentiles (usec): 00:36:33.844 | 50.000th=[ 251], 99.000th=[ 1029], 99.900th=[ 1270], 99.990th=[ 1401], 00:36:33.844 | 99.999th=[ 1860] 00:36:33.844 write: IOPS=25.6k, BW=100.0MiB/s (105MB/s)(978MiB/9780msec); 0 zone resets 00:36:33.844 slat (usec): min=17, max=397, avg=69.84, stdev=36.24 00:36:33.844 clat (usec): min=31, max=1815, avg=371.76, stdev=244.24 00:36:33.844 lat (usec): min=72, max=2012, avg=441.59, stdev=265.31 00:36:33.844 clat percentiles (usec): 00:36:33.844 | 50.000th=[ 322], 99.000th=[ 1205], 99.900th=[ 1663], 99.990th=[ 1762], 00:36:33.844 | 99.999th=[ 1811] 00:36:33.844 bw ( KiB/s): min=86272, max=127912, per=97.99%, avg=100309.05, stdev=2790.06, samples=76 00:36:33.844 iops : min=21568, max=31978, avg=25077.26, stdev=697.52, samples=76 00:36:33.844 lat (usec) : 20=0.01%, 50=0.01%, 100=8.10%, 250=34.03%, 500=40.11% 00:36:33.844 lat (usec) : 750=10.74%, 1000=4.88% 00:36:33.844 lat (msec) : 2=2.13% 00:36:33.844 cpu : usr=99.62%, sys=0.01%, ctx=60, majf=0, minf=327 00:36:33.844 IO depths : 1=10.2%, 2=25.4%, 4=51.1%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:33.844 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:33.844 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:33.844 issued rwts: total=232861,250278,0,0 short=0,0,0,0 dropped=0,0,0,0 00:36:33.844 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:33.844 00:36:33.844 Run status group 0 (all jobs): 00:36:33.844 READ: bw=91.0MiB/s (95.4MB/s), 91.0MiB/s-91.0MiB/s (95.4MB/s-95.4MB/s), io=910MiB (954MB), run=10001-10001msec 00:36:33.844 WRITE: bw=100.0MiB/s (105MB/s), 100.0MiB/s-100.0MiB/s (105MB/s-105MB/s), io=978MiB (1025MB), run=9780-9780msec 00:36:33.844 00:36:33.844 real 0m13.579s 00:36:33.844 user 0m54.117s 00:36:33.844 sys 0m0.681s 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:36:33.844 ************************************ 00:36:33.844 END TEST bdev_fio_rw_verify 00:36:33.844 ************************************ 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:36:33.844 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c3798b95-15ca-5f09-ae04-56c05ca8f5c9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c3798b95-15ca-5f09-ae04-56c05ca8f5c9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "45f0f0c1-df77-58dc-accd-477a1370ac4e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "45f0f0c1-df77-58dc-accd-477a1370ac4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' 
' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "3af39778-50f4-5282-8d65-5da263583e4a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3af39778-50f4-5282-8d65-5da263583e4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "4e3b3b23-372f-5e77-9d0b-ca45e52a0740"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4e3b3b23-372f-5e77-9d0b-ca45e52a0740",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:36:33.845 crypto_ram2 00:36:33.845 crypto_ram3 00:36:33.845 crypto_ram4 ]] 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c3798b95-15ca-5f09-ae04-56c05ca8f5c9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c3798b95-15ca-5f09-ae04-56c05ca8f5c9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "45f0f0c1-df77-58dc-accd-477a1370ac4e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "45f0f0c1-df77-58dc-accd-477a1370ac4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "3af39778-50f4-5282-8d65-5da263583e4a"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3af39778-50f4-5282-8d65-5da263583e4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "4e3b3b23-372f-5e77-9d0b-ca45e52a0740"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4e3b3b23-372f-5e77-9d0b-ca45e52a0740",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' 
"driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:33.845 ************************************ 00:36:33.845 START TEST bdev_fio_trim 00:36:33.845 ************************************ 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 
00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:33.845 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:36:33.846 06:52:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:33.846 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:33.846 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:33.846 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:33.846 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:33.846 fio-3.35 00:36:33.846 Starting 4 threads 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.4 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.6 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:36:33.846 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:33.846 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:33.846 EAL: Requested device 0000:3f:02.7 cannot be used 00:36:46.061 00:36:46.061 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1345802: Thu Jul 25 06:52:58 2024 00:36:46.061 write: IOPS=35.8k, BW=140MiB/s (147MB/s)(1400MiB/10001msec); 0 zone resets 00:36:46.061 slat (usec): min=17, max=491, avg=61.78, stdev=27.70 00:36:46.061 clat (usec): min=73, max=2356, avg=512.82, stdev=174.27 00:36:46.061 lat (usec): min=106, max=2471, avg=574.60, stdev=164.65 00:36:46.061 clat percentiles (usec): 00:36:46.061 | 50.000th=[ 586], 99.000th=[ 766], 99.900th=[ 906], 99.990th=[ 1045], 00:36:46.061 | 99.999th=[ 1876] 00:36:46.061 bw ( KiB/s): min=132240, max=156512, per=100.00%, avg=143709.89, stdev=2954.52, samples=76 00:36:46.061 iops : min=33060, max=39128, avg=35927.47, stdev=738.63, samples=76 00:36:46.061 trim: IOPS=35.8k, BW=140MiB/s (147MB/s)(1400MiB/10001msec); 0 zone resets 00:36:46.061 slat (usec): min=5, max=1609, avg=17.27, stdev= 6.03 00:36:46.061 clat (usec): min=24, max=2070, avg=160.01, stdev=119.13 00:36:46.061 lat (usec): min=34, max=2091, avg=177.28, stdev=119.34 00:36:46.061 clat percentiles (usec): 00:36:46.061 | 50.000th=[ 94], 99.000th=[ 515], 99.900th=[ 635], 99.990th=[ 717], 00:36:46.061 | 99.999th=[ 1172] 00:36:46.061 bw ( KiB/s): min=132288, max=156512, per=100.00%, avg=143715.79, stdev=2954.05, samples=76 00:36:46.061 iops : min=33072, max=39128, avg=35928.95, stdev=738.51, samples=76 00:36:46.061 lat (usec) : 50=0.11%, 100=27.48%, 250=17.48%, 500=21.68%, 750=32.45% 00:36:46.061 lat (usec) : 1000=0.77% 00:36:46.061 lat (msec) : 2=0.01%, 4=0.01% 00:36:46.061 cpu : usr=99.63%, sys=0.00%, ctx=104, majf=0, minf=143 00:36:46.061 IO depths : 1=0.1%, 2=12.5%, 4=53.1%, 8=34.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:46.061 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:46.061 complete : 0=0.0%, 4=95.3%, 8=4.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:46.061 issued rwts: total=0,358365,358366,0 short=0,0,0,0 dropped=0,0,0,0 00:36:46.061 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:46.061 00:36:46.061 Run status group 0 (all jobs): 00:36:46.061 WRITE: bw=140MiB/s (147MB/s), 140MiB/s-140MiB/s (147MB/s-147MB/s), io=1400MiB (1468MB), run=10001-10001msec 00:36:46.061 TRIM: bw=140MiB/s (147MB/s), 140MiB/s-140MiB/s (147MB/s-147MB/s), io=1400MiB (1468MB), run=10001-10001msec 00:36:46.061 00:36:46.061 real 0m13.602s 00:36:46.061 user 0m54.274s 00:36:46.061 sys 0m0.642s 00:36:46.061 06:52:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:36:46.061 06:52:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:36:46.061 ************************************ 00:36:46.061 END TEST bdev_fio_trim 00:36:46.061 ************************************ 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:36:46.062 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:36:46.062 00:36:46.062 real 0m27.559s 00:36:46.062 user 1m48.577s 00:36:46.062 sys 0m1.539s 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:46.062 ************************************ 00:36:46.062 END TEST bdev_fio 00:36:46.062 ************************************ 00:36:46.062 06:52:59 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:36:46.062 06:52:59 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:46.062 06:52:59 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:36:46.062 06:52:59 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:46.062 06:52:59 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:46.062 ************************************ 00:36:46.062 START TEST bdev_verify 00:36:46.062 ************************************ 00:36:46.062 06:52:59 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:46.062 [2024-07-25 06:52:59.448004] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
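The bdevperf verify run that starts here consumes the same --json configuration the fio stages used: per crypto vbdev, a RAM-backed base bdev, an AES-CBC DEK registered with the accel framework, and a crypto vbdev bound to that key (the notices below show keys test_dek_aesni_cbc_1..4 being found and vbdev creation deferred until the Malloc base bdevs arrive, after the aesni_mb driver is selected for the dpdk_cryptodev module). A hedged sketch of one such entry is below; the key bytes are placeholders, the geometry (Malloc0, 65536 blocks of 512 B) is copied from the JSON dump above, and the exact parameter set may differ between SPDK versions.

cat > /tmp/bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        {
          "method": "accel_crypto_key_create",
          "params": {
            "cipher": "AES_CBC",
            "key": "00112233445566778899aabbccddeeff",
            "name": "test_dek_aesni_cbc_1"
          }
        }
      ]
    },
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 }
        },
        {
          "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram", "key_name": "test_dek_aesni_cbc_1" }
        }
      ]
    }
  ]
}
EOF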
00:36:46.062 [2024-07-25 06:52:59.448058] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347461 ] 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.4 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.6 cannot be used 00:36:46.062 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:46.062 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:46.062 EAL: Requested device 0000:3f:02.7 cannot be used 00:36:46.062 [2024-07-25 06:52:59.583198] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:46.323 [2024-07-25 06:52:59.629426] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:46.323 [2024-07-25 06:52:59.629431] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:46.323 [2024-07-25 06:52:59.650730] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:36:46.323 [2024-07-25 06:52:59.658761] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:46.323 [2024-07-25 06:52:59.666780] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:46.323 [2024-07-25 06:52:59.765872] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:36:48.856 [2024-07-25 06:53:02.087341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:36:48.856 [2024-07-25 06:53:02.087414] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:48.856 [2024-07-25 06:53:02.087428] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:48.856 [2024-07-25 06:53:02.095358] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:36:48.856 [2024-07-25 06:53:02.095375] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:48.857 [2024-07-25 06:53:02.095385] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:48.857 [2024-07-25 06:53:02.103380] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:36:48.857 [2024-07-25 06:53:02.103396] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:48.857 [2024-07-25 06:53:02.103406] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:48.857 [2024-07-25 06:53:02.111402] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:36:48.857 [2024-07-25 06:53:02.111418] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:48.857 [2024-07-25 06:53:02.111429] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:48.857 Running I/O for 5 seconds... 00:36:54.127 00:36:54.127 Latency(us) 00:36:54.127 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:54.127 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x0 length 0x1000 00:36:54.127 crypto_ram : 5.06 1032.14 4.03 0.00 0.00 123452.67 7602.18 107374.18 00:36:54.127 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x1000 length 0x1000 00:36:54.127 crypto_ram : 5.06 1036.72 4.05 0.00 0.00 123151.41 9279.90 107374.18 00:36:54.127 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x0 length 0x1000 00:36:54.127 crypto_ram2 : 5.06 1036.41 4.05 0.00 0.00 122781.02 7130.32 89338.68 00:36:54.127 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x1000 length 0x1000 00:36:54.127 crypto_ram2 : 5.06 1036.46 4.05 0.00 0.00 122742.80 9856.61 88919.24 00:36:54.127 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x0 length 0x1000 00:36:54.127 crypto_ram3 : 5.05 3271.52 12.78 0.00 0.00 38706.74 6606.03 43830.48 00:36:54.127 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x1000 length 0x1000 00:36:54.127 crypto_ram3 : 5.06 3290.43 12.85 0.00 0.00 38526.05 2215.12 43830.48 00:36:54.127 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x0 length 0x1000 00:36:54.127 crypto_ram4 : 5.06 3289.35 12.85 0.00 0.00 38423.95 1291.06 39845.89 00:36:54.127 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:54.127 Verification LBA range: start 0x1000 length 0x1000 00:36:54.127 crypto_ram4 : 5.06 3289.84 12.85 0.00 0.00 38401.13 2713.19 39426.46 00:36:54.127 =================================================================================================================== 00:36:54.127 Total : 17282.88 67.51 0.00 0.00 58790.54 1291.06 107374.18 00:36:54.127 00:36:54.127 real 0m8.200s 00:36:54.127 user 0m15.507s 00:36:54.128 sys 0m0.490s 00:36:54.128 06:53:07 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:54.128 06:53:07 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:36:54.128 ************************************ 00:36:54.128 END TEST bdev_verify 00:36:54.128 ************************************ 00:36:54.128 06:53:07 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:36:54.128 06:53:07 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:36:54.128 06:53:07 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:36:54.128 06:53:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:36:54.128 ************************************ 00:36:54.128 
START TEST bdev_verify_big_io 00:36:54.128 ************************************ 00:36:54.128 06:53:07 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:36:54.387 [2024-07-25 06:53:07.720671] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:36:54.387 [2024-07-25 06:53:07.720725] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348793 ] 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.0 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.1 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.2 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.3 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.4 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.5 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.6 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:01.7 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.0 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.1 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.2 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.3 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.4 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.5 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.6 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3d:02.7 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.0 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.1 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.2 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 
00:36:54.387 EAL: Requested device 0000:3f:01.3 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.4 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.5 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.6 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:01.7 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.0 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.1 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.2 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.3 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.4 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.5 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.6 cannot be used 00:36:54.387 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:36:54.387 EAL: Requested device 0000:3f:02.7 cannot be used 00:36:54.387 [2024-07-25 06:53:07.856373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:54.387 [2024-07-25 06:53:07.902198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:54.387 [2024-07-25 06:53:07.902204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:54.387 [2024-07-25 06:53:07.923522] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:36:54.387 [2024-07-25 06:53:07.931556] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:54.387 [2024-07-25 06:53:07.939574] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:54.646 [2024-07-25 06:53:08.048944] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:36:57.217 [2024-07-25 06:53:10.371073] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:36:57.217 [2024-07-25 06:53:10.371150] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:57.217 [2024-07-25 06:53:10.371168] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:57.217 [2024-07-25 06:53:10.379101] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:36:57.217 [2024-07-25 06:53:10.379118] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:57.217 [2024-07-25 06:53:10.379129] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:57.217 [2024-07-25 06:53:10.387112] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_3" 00:36:57.217 [2024-07-25 06:53:10.387128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:57.217 [2024-07-25 06:53:10.387142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:57.217 [2024-07-25 06:53:10.395131] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:36:57.217 [2024-07-25 06:53:10.395150] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:57.217 [2024-07-25 06:53:10.395160] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:57.217 Running I/O for 5 seconds... 00:36:57.787 [2024-07-25 06:53:11.257260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.257661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.257822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.257871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.257910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.257948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.258396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.258413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.259841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.259916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.259966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.260957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.261912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.787 [2024-07-25 06:53:11.261960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.261998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.262980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.264851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.265178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.265195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.787 [2024-07-25 06:53:11.266797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.266913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.267274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.267289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.268980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.269017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.269292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.269308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.270998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.787 [2024-07-25 06:53:11.271342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.788 [2024-07-25 06:53:11.271357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.272510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.272585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.272634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.272673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.273088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.273147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.273186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.273223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.273595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.273610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.274540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.274588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.274626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.274663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.275098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.275165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.275205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.275242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.275577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.275591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.276700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.276769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.276819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.788 [2024-07-25 06:53:11.276868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.277337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.277384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.277422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.277460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.277865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.277880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.278804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.278853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.278890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.278942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.279472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.279528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.279567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.279605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.279915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.279930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.280989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.788 [2024-07-25 06:53:11.281651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.281970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.282916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.282964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.283003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.283052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.283606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.283680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.283720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.283783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.284119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.284134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.285431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.285478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.285515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.285553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.285963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.286007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.286045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.286083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.286388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.286403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.287301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.788 [2024-07-25 06:53:11.287349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.287387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.287425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.287900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.287942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.287986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.288033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.288349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.288365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.289725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.289785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.788 [2024-07-25 06:53:11.289841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.289880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.290314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.290370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.290408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.290446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.290762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.290776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.291584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.291634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.291673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.291711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.292101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.789 [2024-07-25 06:53:11.292151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.292204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.292242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.292624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.292639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.293763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.293821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.293876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.293931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.294431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.294474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.294516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.294554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.294876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.294891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.295829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.295878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.295916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.295954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.296420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.296464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.296520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.296557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.296953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.789 [2024-07-25 06:53:11.296967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.297967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.298937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.300833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.301158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.301173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.789 [2024-07-25 06:53:11.302415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.302953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.303258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.303273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.304958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.305293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.305309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.306381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.306430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.789 [2024-07-25 06:53:11.306479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.306520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.307034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.307078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.307128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.790 [2024-07-25 06:53:11.307207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.307566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.307582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.308606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.308666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.308705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.308744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.309207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.309253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.309291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.309328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.309688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.309704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.310662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.310708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.310746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.310785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.311267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.311312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.311351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.311388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.311701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.311716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.312721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.790 [2024-07-25 06:53:11.312815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.312867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.312907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.313364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.313408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.313446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.313483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.313900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.313917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.314902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.314960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.315013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.315072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.315531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.315575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.315616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.315655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.316012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.316027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.316864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.316912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.316976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.317017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.317366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.790 [2024-07-25 06:53:11.317410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.317448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.317485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.317754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.317769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.318595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.318650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.318688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.318726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.319064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.319105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.319150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.319187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.319533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.319552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.321964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.322013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.322281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:57.790 [2024-07-25 06:53:11.322300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.323310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.323357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.323395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.790 [2024-07-25 06:53:11.323417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.323431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.323924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.323970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.324008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.324045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.324440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.324455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.327327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.328661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.330215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.331764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.333642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.335053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.336603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.338225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.338657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:57.791 [2024-07-25 06:53:11.338672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.341158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.342823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.052 [2024-07-25 06:53:11.344518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.346099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.348047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.349681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.351388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.353017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.353349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.353364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.355453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.357018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.358569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.359636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.361220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.362778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.364340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.365081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.365508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.365526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.368112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.369702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.371372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.372061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.373976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.375545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.052 [2024-07-25 06:53:11.376940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.377301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.377651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.377670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.380021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.381574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.382413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.384011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.385941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.387503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.387973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.388330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.388641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.388656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.391057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.392699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.393495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.394801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.396667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.397955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.398313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.398666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.398895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.398910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.052 [2024-07-25 06:53:11.401262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.402037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.403696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.405186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.407061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.407478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.407833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.408603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.408837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.408856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.411131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.412073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.413389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.414956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.416385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.416756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.417109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.418794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.419032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.419047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.420349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.421875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.423530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.052 [2024-07-25 06:53:11.425089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.053 [2024-07-25 06:53:11.425746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.426105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.427125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.428423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.428652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.428667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.430466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.431789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.433344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.434914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.435712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.436068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.437634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.439380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.439612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.439626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.441950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.443517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.445084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.446430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.447226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.448428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.449727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.451282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.053 [2024-07-25 06:53:11.451512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.451527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.453593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.455159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.456722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.457195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.458080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.459435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.460993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.462548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.462778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.462793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.465207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.466779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.467913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.468274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.470199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.471511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.473042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.474571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.475005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.475020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.477353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.478962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.053 [2024-07-25 06:53:11.479331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.479684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.481289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.482845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.484407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.485511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.485743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.485757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.488099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.489055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.489429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.489783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.491423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.492982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.494531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.495057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.495293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.495308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.497666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.498036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.498394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.499423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.501326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.502883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.053 [2024-07-25 06:53:11.503935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.505338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.505644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.505658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.507273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.507641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.508031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.509479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.511358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.513070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.053 [2024-07-25 06:53:11.513774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.515069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.515304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.515319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.516414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.516776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.517955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.519237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.521103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.522045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.523522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.524820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.525050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.525065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.054 [2024-07-25 06:53:11.526237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.526633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.528077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.529682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.531707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.532488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.533796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.535364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.535596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.535610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.536781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.538149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.539458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.541002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.541996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.543643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.545269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.546995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.547234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.547249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.548974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.550271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.551813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.553355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.054 [2024-07-25 06:53:11.554708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.556012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.557553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.559094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.559385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.559401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.562051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.563491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.565050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.566659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.568636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.570272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.571832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.573296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.573650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.573665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.575694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.577228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.578798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.579750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.581343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.582894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.584436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.054 [2024-07-25 06:53:11.585083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.323 [2024-07-25 06:53:11.738110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.740277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.741591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.743157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.744716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.745033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.746381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.747693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.749248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.750818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.751189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.751204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.754195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.755851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.757592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.759210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.759522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.760835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.762405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.763970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.765360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.765704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.765719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.767769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.323 [2024-07-25 06:53:11.769337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.770902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.771773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.772007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.773398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.774958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.776511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.776977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.777321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.777337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.779831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.781475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.782994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.783881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.784155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.785813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.787377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.788491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.788842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.789259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.789278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.791593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.793157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.793720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.323 [2024-07-25 06:53:11.795274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.795503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.797146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.798815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.799179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.799531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.799763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.799778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.802065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.803306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.804545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.805837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.806066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.807721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.808531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.808886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.809308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.809537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.809552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.811828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.812375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.813690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.815253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.815483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.323 [2024-07-25 06:53:11.817044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.817408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.817761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.819154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.819473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.819487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.821178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.822686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.824025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.825592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.323 [2024-07-25 06:53:11.825822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.826450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.826808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.827563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.828869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.829097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.829115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.830637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.831948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.833516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.835093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.835401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.835840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.836209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.324 [2024-07-25 06:53:11.837872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.839439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.839669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.839683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.842207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.843818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.845505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.847155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.847479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.847915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.848949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.850248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.851809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.852037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.852052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.854092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.855666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.857229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.857905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.858307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.858750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.860211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.861808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.863380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.324 [2024-07-25 06:53:11.863609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.863623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.866032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.867603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.868863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.869222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.324 [2024-07-25 06:53:11.869612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.871087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.872402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.873964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.875517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.875920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.875934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.878276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.879843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.880208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.880562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.880813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.882184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.883738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.885295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.886405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.886634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.886649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.587 [2024-07-25 06:53:11.888958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.889980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.890340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.890693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.890922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.892450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.894011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.895664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.896325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.896555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.896571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.898966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.899343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.899698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.900869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.901174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.902827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.904401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.905227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.906843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.907143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.907158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.908647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.909010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.587 [2024-07-25 06:53:11.909542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.910852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.911082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.912772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.914249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.915203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.916510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.916741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.916755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.917813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.918179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.919615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.920909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.921142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.922793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.923396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.924975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.926632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.926864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.926878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.928066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.928776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.930082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.931641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.587 [2024-07-25 06:53:11.931870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.933317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.934441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.935751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.587 [2024-07-25 06:53:11.937308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.937545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.937559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.938773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.940442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.941946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.943538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.943768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.944314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.945664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.947229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.948799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.949029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.949043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.950806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.952124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.953659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.955214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.955497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.956888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.588 [2024-07-25 06:53:11.958195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.959756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.961305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.961634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.961649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.964773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.966397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.968094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.969741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.970053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.971435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.972992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.974555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.975793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.976168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.976183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.978252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.979831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.981393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.982107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.982339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.983753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.985316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.986879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.588 [2024-07-25 06:53:11.987243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.987568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.987583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.989940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.991514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.992865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.993915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.994188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.995848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.997422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.998345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.998716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.999115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:11.999130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.001446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.003018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.003488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.004876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.005105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.006921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.008487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.008841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.009197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.009428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.588 [2024-07-25 06:53:12.009443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.011074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.012273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.013451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.013808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.014192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.014632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.014992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.015357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.015709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.016009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.016024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.017241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.017605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.017963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.018325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.018674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.019114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.019482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.019837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.020194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.020469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.020484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.021836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.588 [2024-07-25 06:53:12.022210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.588 [2024-07-25 06:53:12.022571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.022932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.023233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.023676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.024041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.024409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.024766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.025037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.025052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.026416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.026784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.027151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.027525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.027885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.028329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.028692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.029048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.029407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.029735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.029750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.030994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.031358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.031720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.589 [2024-07-25 06:53:12.032078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.032493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.032947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.033322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.033682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.034053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.034474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.034490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.035706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.036068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.036431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.036791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.037100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.037543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.037904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.038269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.038622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.038986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.039001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.040275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.040333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.040688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.040727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.589 [2024-07-25 06:53:12.041063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.589 [2024-07-25 06:53:12.041512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously between 06:53:12.041512 and 06:53:12.270534; duplicate entries elided]
00:36:58.856 [2024-07-25 06:53:12.270534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:58.856 [2024-07-25 06:53:12.272092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.272360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.274019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.275568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.276061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.277505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.277733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.277748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.278844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.279209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.280160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.281447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.281680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.283324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.284416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.285765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.287060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.287292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.287307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.288488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.288882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.290338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.291940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.292173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.856 [2024-07-25 06:53:12.293935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.294761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.296087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.297646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.297876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.297891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.299066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.300535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.301858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.303418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.303646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.304321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.305878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.307594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.309209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.309436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.309451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.311173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.312487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.314053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.315618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.315896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.317264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.318574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.856 [2024-07-25 06:53:12.320133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.321696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.322037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.322052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.325162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.326723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.328364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.330110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.330461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.331883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.856 [2024-07-25 06:53:12.333445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.335023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.336280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.336647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.336662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.338725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.340304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.341869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.342652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.342882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.344297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.345849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.347417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.347780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.857 [2024-07-25 06:53:12.348100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.348115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.350550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.352121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.353560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.354538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.354820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.356476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.358056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.359036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.359407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.359817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.359834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.362150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.362886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.364388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.365706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.365937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.366678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.367042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.367442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.368889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.369317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.369334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.857 [2024-07-25 06:53:12.370492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.370872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.371237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.371598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.371966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.372413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.373545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.375083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.375710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.375977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.375992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.377048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.377440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.377801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.378166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.378511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.378950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.379322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.379682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.380042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.380393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.380409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.381655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.382022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.857 [2024-07-25 06:53:12.382388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.382749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.383201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.383641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.384002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.384369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.384730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.385129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.385152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.386617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.386983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.387350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.387710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.388044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.388492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.388858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.389223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.389581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.389907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.389922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.391235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.391605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.391964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.392328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.857 [2024-07-25 06:53:12.392651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.393094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.393478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.393838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.394200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.394481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.394496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.395639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.857 [2024-07-25 06:53:12.396005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.396371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.396731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.396994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.397443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.397820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.398184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.398546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.398972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.398987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.400506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.400874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.401241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.401625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.402006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.402454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:58.858 [2024-07-25 06:53:12.402816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.403181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.403541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.403816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.403832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.405160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.405526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.405886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.406272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.406629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.407065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.407433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.407794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:58.858 [2024-07-25 06:53:12.408159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.408580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.408598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.409799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.410172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.410532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.410889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.411209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.411647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.412009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.412374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.121 [2024-07-25 06:53:12.412735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.413039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.413054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.414377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.414439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.414795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.414837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.415123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.415569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.415932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.416296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.416659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.416983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.416997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.418770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.419052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.121 [2024-07-25 06:53:12.419067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.420835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.421182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.421198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.422979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.423330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.423346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.424309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.121 [2024-07-25 06:53:12.424375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.424413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.121 [2024-07-25 06:53:12.424472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.424702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.424832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.424887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.424926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.424964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.425224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.425239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.426794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.427020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.427035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.427835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.427896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.427933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.122 [2024-07-25 06:53:12.427970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.428694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.429501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.429550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.429587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.429624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.429887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.430013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.430060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.430109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.430155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.430506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.430521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.431511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.431558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.431599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.431636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.431862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.122 [2024-07-25 06:53:12.431993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.432034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.432071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.432115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.432348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.432363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.433936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.434169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.434184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.122 [2024-07-25 06:53:12.435700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.122 [2024-07-25 06:53:12.435737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:59.494 [2024-07-25 06:53:12.712913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:59.494 [2024-07-25 06:53:12.713358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.713721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.714078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.714443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.714711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.714726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.716169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.716538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.716899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.717287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.717687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.718133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.718498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.718855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.720252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.720628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.720642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.721884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.722260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.723663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.724111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.724344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.724788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.725157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.494 [2024-07-25 06:53:12.725516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.727004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.727395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.727410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.729227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.729591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.729948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.730315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.730651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.731089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.731451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.731803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.732158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.732473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.732489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.733782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.734156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.734510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.734867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.735257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.735702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.736061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.736419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.736776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.494 [2024-07-25 06:53:12.737168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.737184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.738442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.738804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.739167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.739521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.739816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.740263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.740629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.740984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.741343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.741749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.741764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.743092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.743464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.743817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.744176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.744561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.744998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.745368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.745727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.746082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.746439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.746455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.494 [2024-07-25 06:53:12.747901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.748277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.748651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.749003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.749422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.749854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.750221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.750580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.750935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.751326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.494 [2024-07-25 06:53:12.751343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.752695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.752755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.753112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.753169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.753525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.753962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.754324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.754677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.755035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.755435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.755450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.756619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.756667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.756706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.756744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.757732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.758661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.758709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.758747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.758784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.759836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.761549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.761834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.762193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.762208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.763977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.764014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.764248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.764263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.765781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.765855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.766105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.766120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.766965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.767794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.768126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.768147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.769853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.769891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.770116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.770131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.771873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.772785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.772833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.772872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.772910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.773279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.773407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.773448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.773486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.773525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.773912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.773928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.774761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.774808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.774845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.774882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.775607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.776472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.776543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.776582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.776620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.777682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.778680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.778734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.778776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.778818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.779538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.780956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.781189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.781208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.782333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.782382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.782423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.495 [2024-07-25 06:53:12.782462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.782808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.782932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.782974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.783014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.783052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.783442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.783462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.495 [2024-07-25 06:53:12.784930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.784966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.785198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.785214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.788735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.788784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.789270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.789314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.789352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.496 [2024-07-25 06:53:12.789743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.965904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.972888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.972949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.974460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.974511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.976069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.976115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.977585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.977815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.977830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.977842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.985809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.987529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.989159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.989389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.989403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.993875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.994244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.994598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.994950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.996224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.997537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:12.999090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.496 [2024-07-25 06:53:13.000648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.000898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.000913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.005492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.005859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.006217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.006570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.008054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.009376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.010938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.012503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.012807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.012825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.017351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.017713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.018066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.018423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.020443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.021886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.023422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.025003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.025350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.025366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:59.496 [2024-07-25 06:53:13.029493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:59.496 [2024-07-25 06:53:13.029856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.370450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (the same error line is emitted several hundred times between these two timestamps, differing only in the per-message timestamp)
00:37:00.024 [2024-07-25 06:53:13.372613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.373069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.374462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.374505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.376060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.376293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.376566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.377098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.378623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.378669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.378897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.378911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.381010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.381376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.382108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.383424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.383654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.385394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.386990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.387840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.389157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.389385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.389401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.391490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.024 [2024-07-25 06:53:13.391849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.392896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.394197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.394426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.396011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.397277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.398438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.399737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.399967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.399981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.402191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.024 [2024-07-25 06:53:13.402551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.404186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.405644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.405873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.407529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.408010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.409392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.410958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.411191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.411205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.413447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.414341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.415539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.025 [2024-07-25 06:53:13.416762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.417054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.418331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.419546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.419902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.420261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.420687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.420703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.422997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.423362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.423716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.424399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.424629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.425065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.426519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.426872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.427227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.427591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.427605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.430319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.430374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.430726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.431077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.431528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.025 [2024-07-25 06:53:13.431988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.432360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.432411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.432771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.433233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.433248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.435091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.436749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.438438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.438490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.438853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.439281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.439330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.439685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.440041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.440386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.440401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.443539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.443909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.443951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.444311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.444547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.444716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.445182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.025 [2024-07-25 06:53:13.446701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.446748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.447083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.447099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.449574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.449625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.449987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.451398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.451628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.452345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.453722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.453766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.455365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.455711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.455726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.457735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.458099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.458465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.458518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.458867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.459299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.459349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.459701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.460057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.025 [2024-07-25 06:53:13.460372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.460388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.462670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.463038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.463081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.463444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.463736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.025 [2024-07-25 06:53:13.463901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.464270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.464626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.464666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.464973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.464987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.467223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.467277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.467629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.467670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.468053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.468479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.468843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.468887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.469244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.469661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.469676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.026 [2024-07-25 06:53:13.471946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.472000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.472363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.472409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.472784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.473208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.473255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.473614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.473974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.474383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.474399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.476672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.476724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.477079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.477123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.477432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.477581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.477950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.478317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.478363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.478743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.478758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.481322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.481375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.026 [2024-07-25 06:53:13.481742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.481781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.482152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.482595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.482661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.483021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.483065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.483431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.483447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.485772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.485831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.486208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.486259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.486583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.487021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.487068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.487436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.487483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.487771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.487786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.490197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.490251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.490603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.490642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.026 [2024-07-25 06:53:13.490926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.491378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.495823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.495877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.496949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.496998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.497298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.504006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.504067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.504429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.504468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.504861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.511317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.511370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.512914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.512955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.513327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.516659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.516713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.518345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.518395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.518623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.524222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.524280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.026 [2024-07-25 06:53:13.525596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.525638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.525865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.026 [2024-07-25 06:53:13.530977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.531037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.531398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.531440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.531768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.537377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.537437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.539065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.539106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.539339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.543546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.543609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.545186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.545228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.545455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.549134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.549193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.550178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.550219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.550449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.555611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.027 [2024-07-25 06:53:13.555664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.557214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.557256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.557485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.561001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.561057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.561416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.561455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.561682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.565056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.565110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.566254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.566295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.566698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.571468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.571539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.573080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.573122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.027 [2024-07-25 06:53:13.573353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.577846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.577900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.578504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.578861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.579137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.288 [2024-07-25 06:53:13.581509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.581558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.583109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.583157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.583440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.587889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.587943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.587980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.588017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.588249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.589712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.589760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.589801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.589839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.590231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.591449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.591497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.591534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.591571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.591899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.593102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.593156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.593220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.593260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.288 [2024-07-25 06:53:13.593487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.594944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.594991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.595035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.595073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.595489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.596843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.598386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.598430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.598467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.598695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.602553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.602606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.602659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.604282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.604729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.606610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.606659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.607687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.607729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.608008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.611575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.612953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.612999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.288 [2024-07-25 06:53:13.613037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.613536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.616753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.616807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.616845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.617205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.617433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.620201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.620248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.621795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.621836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.622063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.625742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.626770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.626813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.627461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.627847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.630210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.631208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.288 [2024-07-25 06:53:13.631251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.632899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.633128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.636084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.636455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.289 [2024-07-25 06:53:13.636497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.637190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.637419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.639193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.640756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.640799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.641467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.641695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.646014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.646387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.646429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.647878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.648259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.650842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.652158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.652202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.653517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.653791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.659410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.659775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.659817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.660270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.660499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.667854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.289 [2024-07-25 06:53:13.668234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.668277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.668684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.668912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.671721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.673033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.673076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.674622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.674850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.680403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.682159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.682209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.683629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.683903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.687921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.688411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.688455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.688804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.689032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.693649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.695024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.695067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.696447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.696675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.289 [2024-07-25 06:53:13.702099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.702469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.702509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.704154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.704384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.713066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.713436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.713490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.715071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.715514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.722450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.723484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.723532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.725211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.725658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.731695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.733010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.733053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.734617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.734847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.740189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.741746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.741789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.742774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.289 [2024-07-25 06:53:13.743003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.749838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.750394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.750437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.751750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.751978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.759213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.759961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.760003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.760945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.761338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.768550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.769425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.769470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.289 [2024-07-25 06:53:13.769507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.769843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.777198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.778776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.778819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.780376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.780677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.788053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.789426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.790739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.290 [2024-07-25 06:53:13.792303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.792530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.800089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.801582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.803200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.804766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.804994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.813052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.814355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.815904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.816621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.816852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.824585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.825462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.826648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.827316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.827546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.840691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.290 [2024-07-25 06:53:13.841268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.549 [2024-07-25 06:53:13.842652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.549 [2024-07-25 06:53:13.844193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.246788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.247186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.247542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.809 [2024-07-25 06:53:14.249174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.254934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.256414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.256459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.256503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.256541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.256893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.262073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.262454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.262501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.262859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.263305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.809 [2024-07-25 06:53:14.272367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.272879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.272923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.274075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.279113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.279171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.279530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.279897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.285253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.286239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.287215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.287259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.810 [2024-07-25 06:53:14.291661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.292680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.292728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.293364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.299047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.299103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.299470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.299835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.302156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.302776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.304343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.304397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.309299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.310067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.310111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.311022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.313841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.313895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.314860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.314901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.319364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.319420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.319778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.319821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.810 [2024-07-25 06:53:14.323819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.324209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.324570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.324613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.329531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.329598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.329954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.329997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.332558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.332622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.333290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.333334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.341922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.341976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.343159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.343203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.348544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.348598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.350156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.350197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.356591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.356647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.357447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.357488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:00.810 [2024-07-25 06:53:14.361871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.361924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.362757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:00.810 [2024-07-25 06:53:14.362799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.371266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.371320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.372667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.372709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.378470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.378524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.378876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.378931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.387066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.387127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.387946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.387987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.393555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.393608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.394921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.394962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.398921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.398972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.400670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.400722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.071 [2024-07-25 06:53:14.404960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.405012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.406688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.406730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.414901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.414955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.416180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.416222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.421351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.421406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.422947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.422991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.427865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.427919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.429592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.429636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.434562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.435361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.435407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.435445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.438850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.438899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.438936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.438979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.071 [2024-07-25 06:53:14.443694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.443747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.443785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.443822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.445341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.445714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.447117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.447168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.448839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.448887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.448924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.448961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.454328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.454383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.454426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.454458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.459068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.459119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.460671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.460724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.466059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.467197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.467241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.467284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.071 [2024-07-25 06:53:14.473657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.473712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.473750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.475396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.481313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.481361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.482906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.482949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.489857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.071 [2024-07-25 06:53:14.491282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.491326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.491363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.497151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.497205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.497247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.498677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.502866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.502915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.504460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.504502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.509082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.510631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.510675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.510712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.072 [2024-07-25 06:53:14.519252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.519305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.520845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.520887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.525963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.526018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.527561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.529128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.533642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.533691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.535364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.535407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.544614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.544669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.546220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.546262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.553257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.553311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.554876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.554919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.562112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.562176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.563719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.563761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.072 [2024-07-25 06:53:14.569618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.569671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.570969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.571010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.576391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.576445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.577224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.577263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.581302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.581357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.582844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.582888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.590255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.590309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.591037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.591077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.598229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.598284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.599840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.599882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.606193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.606250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.607648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.607690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.072 [2024-07-25 06:53:14.614368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.614425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.615981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.616023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.622706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.622762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.624292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.072 [2024-07-25 06:53:14.624333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.632446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.632505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.634122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.634178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.641434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.641487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.643051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.643093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.649742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.649795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.651336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.651377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.658198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.658252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.659443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.660218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.333 [2024-07-25 06:53:14.667027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.668597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.670024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.670518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.677071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.677666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.678904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.679396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.687410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.687778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.688133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.688495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.720206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.720576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.720622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.721248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.727468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.727520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.729131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.729496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.733255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.733622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.733986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.333 [2024-07-25 06:53:14.734030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.333 [2024-07-25 06:53:14.742357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.333 [... identical *ERROR* line repeated for every allocation attempt between 06:53:14.742 and 06:53:15.390; duplicate log lines collapsed ...] 
00:37:01.861 [2024-07-25 06:53:15.390996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:01.861 [2024-07-25 06:53:15.393254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.393616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.393668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.395350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.395670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.398960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.399607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.399652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.400664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.401058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.403396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.403763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.403807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.405420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.405649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.408903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.409279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.409326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.409680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.410070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:01.861 [2024-07-25 06:53:15.412435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.412798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.412841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.413197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.122 [2024-07-25 06:53:15.413426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.416954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.417333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.417387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.417435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.417772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.421224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.422935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.422988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.423349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.423687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.427394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.427817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.429229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.429584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.429927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.435221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.435592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.435636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.435991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.436354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.442494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.443813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.443855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.122 [2024-07-25 06:53:15.444219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.444584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.449119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.450137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.450957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.451315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.451724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.455077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.122 [2024-07-25 06:53:15.455451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.455812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.455869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.456260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.458657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.458710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.460247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.461658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.461984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.464821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.465201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.465560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.465602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.465927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.468603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.468968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.123 [2024-07-25 06:53:15.469012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.469374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.469649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.472060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.472111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.472472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.472830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.473278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.475867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.476239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.477401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.477444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.477811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.483764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.485337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.485382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.487033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.487434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.493895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.493955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.495510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.495550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.495777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.501155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.123 [2024-07-25 06:53:15.501227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.501579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.501617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.501922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.506827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.506880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.507642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.507683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.507911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.513827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.513880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.515430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.515473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.515707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.519301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.519352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.520712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.520753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.521073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.526067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.526120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.527566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.527607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.527835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.123 [2024-07-25 06:53:15.532956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.533010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.533369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.533413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.533780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.539484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.539538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.541087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.541127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.541358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.545734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.545793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.546154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.546198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.546428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.551161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.551214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.552749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.552789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.553043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.557674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.557724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.558192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.558234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.123 [2024-07-25 06:53:15.558462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.123 [2024-07-25 06:53:15.561875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.561926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.562799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.562841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.563104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.568940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.568993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.570720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.570766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.571072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.575163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.575217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.576878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.576934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.577167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.582864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.582918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.584436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.584477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.584840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.590319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.590391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.591944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.124 [2024-07-25 06:53:15.591984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.592215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.597192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.597246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.597616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.597672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.598097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.603300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.603361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.603714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.603754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.604120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.610044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.610099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.611650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.611696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.611926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.616356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.616410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.617947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.618758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.618985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.623334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.623384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.124 [2024-07-25 06:53:15.623842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.623885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.624112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.627043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.627094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.627132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.627175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.627460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.628771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.628817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.630421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.630471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.630797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.633566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.633629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.633983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.634021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.634351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.637899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.637947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.637984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.638025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.638418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.641136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.124 [2024-07-25 06:53:15.641189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.641227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.642538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.642767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.647309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.648868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.648913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.648950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.649182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.651966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.652024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.652066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.653492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.653770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.655321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.655374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.655733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.655783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.656010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.659172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.660723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.124 [2024-07-25 06:53:15.660767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.660804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.661030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.125 [2024-07-25 06:53:15.664455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.664514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.664553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.664909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.665160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.666911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.666968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.667333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.667385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.667613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.671707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.673253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.673296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.674846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.125 [2024-07-25 06:53:15.675075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.677212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.678763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.678807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.679195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.679633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.681342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.682642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.682687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.684042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.386 [2024-07-25 06:53:15.684279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.688859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.690548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.690615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.692157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.692386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.695962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.696970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.697013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.697377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.697804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.702037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.703598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.703645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.705187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.705416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.708603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.710178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.710223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.711876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.712317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.716440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.716804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.716848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.386 [2024-07-25 06:53:15.717822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.718118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.722409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.723647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.723694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.724046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.724476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.730131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.730502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.730547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.731756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.732027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.734731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.735608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.735654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.736415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.736836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.739988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.740391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.740439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.740793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.741130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.743227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.386 [2024-07-25 06:53:15.743830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:02.386 [2024-07-25 06:53:15.743887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:02.386 [the same accel_dpdk_cryptodev.c:468 'Failed to get src_mbufs!' error repeats 142 more times between 06:53:15.745407 and 06:53:15.905458]
00:37:02.388 [2024-07-25 06:53:15.907353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
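The burst of src_mbufs errors above comes from the accel dpdk_cryptodev module failing to allocate source mbufs for its crypto tasks while the big-I/O verify job is in flight. The burst is bounded (it ends at 06:53:15.910 just below and the verify job still completes), which suggests transient mempool pressure rather than a leak. A quick way to size such a burst when triaging the full console output, assuming it has been saved to a file named console.log (the filename is an assumption), is:
  grep -c 'Failed to get src_mbufs' console.log
  grep 'Failed to get src_mbufs' console.log | head -1
  grep 'Failed to get src_mbufs' console.log | tail -1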
00:37:02.388 [2024-07-25 06:53:15.907420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.388 [2024-07-25 06:53:15.910351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:02.959 00:37:02.959 Latency(us) 00:37:02.959 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:02.960 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x0 length 0x100 00:37:02.960 crypto_ram : 5.78 44.65 2.79 0.00 0.00 2774136.64 29150.41 2657511.01 00:37:02.960 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x100 length 0x100 00:37:02.960 crypto_ram : 5.82 44.01 2.75 0.00 0.00 2816992.05 32296.14 2778306.97 00:37:02.960 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x0 length 0x100 00:37:02.960 crypto_ram2 : 5.78 44.81 2.80 0.00 0.00 2669730.77 27262.98 2657511.01 00:37:02.960 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x100 length 0x100 00:37:02.960 crypto_ram2 : 5.82 44.34 2.77 0.00 0.00 2704526.03 18769.51 2778306.97 00:37:02.960 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x0 length 0x100 00:37:02.960 crypto_ram3 : 5.55 305.86 19.12 0.00 0.00 375271.46 30828.13 567069.90 00:37:02.960 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x100 length 0x100 00:37:02.960 crypto_ram3 : 5.56 294.25 18.39 0.00 0.00 390370.67 51589.94 580491.67 00:37:02.960 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x0 length 0x100 00:37:02.960 crypto_ram4 : 5.67 318.77 19.92 0.00 0.00 349829.66 16357.79 427819.01 00:37:02.960 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:37:02.960 Verification LBA range: start 0x100 length 0x100 00:37:02.960 crypto_ram4 : 5.70 309.80 19.36 0.00 0.00 359910.79 31457.28 449629.39 00:37:02.960 =================================================================================================================== 00:37:02.961 Total : 1406.49 87.91 0.00 0.00 676573.04 16357.79 2778306.97 00:37:03.223 00:37:03.223 real 0m8.964s 00:37:03.223 user 0m16.991s 00:37:03.223 sys 0m0.531s 00:37:03.223 06:53:16 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:03.223 06:53:16 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:37:03.223 ************************************ 00:37:03.223 END TEST bdev_verify_big_io 00:37:03.223 ************************************ 00:37:03.223 06:53:16 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:03.223 06:53:16 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:37:03.223 06:53:16 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:03.223 06:53:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
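The table above summarizes a bdevperf verify pass over the aesni crypto bdevs, with the job parameters visible in the row headers (queue depth 128, 64 KiB I/Os). As a rough sketch of how a comparable pass could be reproduced by hand, only the -q/-o/-w values are taken from the table; the runtime value and the reuse of the suite's bdev.json are assumptions:
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
      --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
      -q 128 -o 65536 -w verify -t 5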
00:37:03.223 ************************************ 00:37:03.223 START TEST bdev_write_zeroes 00:37:03.223 ************************************ 00:37:03.223 06:53:16 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:03.482 [2024-07-25 06:53:16.785205] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:37:03.482 [2024-07-25 06:53:16.785259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350380 ] 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:03.482 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:03.482 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:03.482 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:03.482 [2024-07-25 06:53:16.918787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:03.482 [2024-07-25 06:53:16.962254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:03.482 [2024-07-25 06:53:16.983484] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:37:03.482 [2024-07-25 06:53:16.991510] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:37:03.482 [2024-07-25 06:53:16.999529] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:37:03.741 [2024-07-25 06:53:17.100352] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:37:06.279 [2024-07-25 06:53:19.426246] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:37:06.279 [2024-07-25 06:53:19.426295] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:06.279 [2024-07-25 06:53:19.426308] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:06.279 [2024-07-25 06:53:19.434265] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:37:06.279 [2024-07-25 06:53:19.434283] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:06.279 [2024-07-25 06:53:19.434293] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:06.279 [2024-07-25 06:53:19.442286] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 
00:37:06.279 [2024-07-25 06:53:19.442302] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:37:06.279 [2024-07-25 06:53:19.442312] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:06.279 [2024-07-25 06:53:19.450306] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:37:06.280 [2024-07-25 06:53:19.450322] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:37:06.280 [2024-07-25 06:53:19.450332] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:06.280 Running I/O for 1 seconds... 00:37:07.223 00:37:07.223 Latency(us) 00:37:07.223 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:07.223 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:37:07.223 crypto_ram : 1.02 2143.23 8.37 0.00 0.00 59304.21 5006.95 71303.17 00:37:07.223 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:37:07.223 crypto_ram2 : 1.02 2156.96 8.43 0.00 0.00 58685.56 4954.52 66270.00 00:37:07.223 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:37:07.223 crypto_ram3 : 1.02 16511.16 64.50 0.00 0.00 7635.40 2267.55 9961.47 00:37:07.224 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:37:07.224 crypto_ram4 : 1.02 16555.99 64.67 0.00 0.00 7592.03 1808.79 7969.18 00:37:07.224 =================================================================================================================== 00:37:07.224 Total : 37367.34 145.97 0.00 0.00 13542.98 1808.79 71303.17 00:37:07.482 00:37:07.482 real 0m4.141s 00:37:07.482 user 0m3.631s 00:37:07.482 sys 0m0.469s 00:37:07.482 06:53:20 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:07.482 06:53:20 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:37:07.482 ************************************ 00:37:07.482 END TEST bdev_write_zeroes 00:37:07.482 ************************************ 00:37:07.482 06:53:20 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:07.482 06:53:20 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:37:07.482 06:53:20 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:07.482 06:53:20 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:37:07.482 ************************************ 00:37:07.482 START TEST bdev_json_nonenclosed 00:37:07.482 ************************************ 00:37:07.482 06:53:20 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:07.482 [2024-07-25 06:53:21.013181] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
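As a quick consistency check on the write_zeroes table above, the MiB/s column is simply IOPS multiplied by the 4 KiB I/O size: crypto_ram3 at 16511.16 IOPS works out to roughly 64.5 MiB/s, matching the reported 64.50, and likewise 2143.23 IOPS gives the 8.37 MiB/s shown for crypto_ram. A one-liner for the arithmetic, assuming bc is available on the box:
  echo 'scale=2; 16511.16 * 4096 / 1048576' | bc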
00:37:07.482 [2024-07-25 06:53:21.013236] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350938 ] 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:07.741 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:07.741 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:07.741 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:07.741 [2024-07-25 06:53:21.147982] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:07.742 [2024-07-25 06:53:21.192395] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:07.742 [2024-07-25 06:53:21.192457] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:37:07.742 [2024-07-25 06:53:21.192473] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:37:07.742 [2024-07-25 06:53:21.192484] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:37:07.742 00:37:07.742 real 0m0.310s 00:37:07.742 user 0m0.161s 00:37:07.742 sys 0m0.148s 00:37:07.742 06:53:21 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:07.742 06:53:21 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:37:07.742 ************************************ 00:37:07.742 END TEST bdev_json_nonenclosed 00:37:07.742 ************************************ 00:37:08.001 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:08.001 06:53:21 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:37:08.001 06:53:21 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:08.001 06:53:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:37:08.001 ************************************ 00:37:08.001 START TEST bdev_json_nonarray 00:37:08.001 ************************************ 00:37:08.001 06:53:21 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:08.001 [2024-07-25 06:53:21.409394] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
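The failure just logged, 'Invalid JSON configuration: not enclosed in {}', and the "'subsystems' should be an array" failure that the nonarray run started above hits shortly, are both negative tests of the top-level shape bdevperf expects from a --json config file. For reference, a minimal well-formed shape, sketched here from those two error messages (the malloc entry is purely an illustrative assumption), is:
  {
    "subsystems": [
      {
        "subsystem": "bdev",
        "config": [
          { "method": "bdev_malloc_create",
            "params": { "name": "Malloc0", "num_blocks": 8192, "block_size": 512 } }
        ]
      }
    ]
  }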
00:37:08.001 [2024-07-25 06:53:21.409449] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351127 ] 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:08.001 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:08.001 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.001 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:08.001 [2024-07-25 06:53:21.543332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:08.260 [2024-07-25 06:53:21.587606] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:08.260 [2024-07-25 06:53:21.587672] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:37:08.260 [2024-07-25 06:53:21.587688] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:37:08.260 [2024-07-25 06:53:21.587698] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:37:08.260 00:37:08.260 real 0m0.307s 00:37:08.260 user 0m0.161s 00:37:08.260 sys 0m0.144s 00:37:08.260 06:53:21 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:08.260 06:53:21 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:37:08.260 ************************************ 00:37:08.260 END TEST bdev_json_nonarray 00:37:08.260 ************************************ 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:37:08.260 06:53:21 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:37:08.260 00:37:08.260 real 1m11.186s 00:37:08.260 user 2m54.723s 
00:37:08.260 sys 0m9.939s 00:37:08.260 06:53:21 blockdev_crypto_aesni -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:08.260 06:53:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:37:08.260 ************************************ 00:37:08.260 END TEST blockdev_crypto_aesni 00:37:08.260 ************************************ 00:37:08.260 06:53:21 -- spdk/autotest.sh@362 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:37:08.260 06:53:21 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:37:08.260 06:53:21 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:08.260 06:53:21 -- common/autotest_common.sh@10 -- # set +x 00:37:08.260 ************************************ 00:37:08.260 START TEST blockdev_crypto_sw 00:37:08.260 ************************************ 00:37:08.260 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:37:08.519 * Looking for test storage... 00:37:08.519 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:37:08.519 06:53:21 
blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1351272 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:37:08.519 06:53:21 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1351272 00:37:08.519 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # '[' -z 1351272 ']' 00:37:08.519 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:08.519 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # local max_retries=100 00:37:08.519 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:08.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:08.519 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@840 -- # xtrace_disable 00:37:08.519 06:53:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:08.519 [2024-07-25 06:53:21.986089] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:37:08.519 [2024-07-25 06:53:21.986165] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351272 ] 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.4 cannot be used 
00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.519 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:08.519 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:08.520 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:08.520 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:08.778 [2024-07-25 06:53:22.124084] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:08.778 [2024-07-25 06:53:22.169322] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:09.345 06:53:22 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:37:09.345 06:53:22 blockdev_crypto_sw -- common/autotest_common.sh@864 -- # return 0 00:37:09.345 06:53:22 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:37:09.345 06:53:22 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:37:09.345 06:53:22 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:37:09.345 06:53:22 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:09.345 06:53:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:09.603 Malloc0 00:37:09.603 Malloc1 
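With the two malloc base bdevs in place, the records that follow stack software-crypto vbdevs (crypto_ram, crypto_ram2, crypto_ram3) on top of them using the test_dek_sw* keys. Once that has happened, the resulting vbdev can be checked with the same RPC the suite itself calls a little further down; a manual sketch, where the scripts/rpc.py path is an assumption and the jq filter simply follows the driver_specific layout shown in the bdev dump below:
  scripts/rpc.py bdev_get_bdevs -b crypto_ram | jq -r '.[0].driver_specific.crypto.key_name'
  # expected output: test_dek_sw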
00:37:09.603 true 00:37:09.603 true 00:37:09.603 true 00:37:09.603 [2024-07-25 06:53:23.117259] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:09.603 crypto_ram 00:37:09.603 [2024-07-25 06:53:23.125285] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:09.603 crypto_ram2 00:37:09.603 [2024-07-25 06:53:23.133307] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:09.603 crypto_ram3 00:37:09.603 [ 00:37:09.603 { 00:37:09.603 "name": "Malloc1", 00:37:09.603 "aliases": [ 00:37:09.603 "37426488-5aa2-4937-b082-7c531659ab19" 00:37:09.603 ], 00:37:09.603 "product_name": "Malloc disk", 00:37:09.603 "block_size": 4096, 00:37:09.603 "num_blocks": 4096, 00:37:09.603 "uuid": "37426488-5aa2-4937-b082-7c531659ab19", 00:37:09.603 "assigned_rate_limits": { 00:37:09.603 "rw_ios_per_sec": 0, 00:37:09.603 "rw_mbytes_per_sec": 0, 00:37:09.603 "r_mbytes_per_sec": 0, 00:37:09.603 "w_mbytes_per_sec": 0 00:37:09.603 }, 00:37:09.603 "claimed": true, 00:37:09.603 "claim_type": "exclusive_write", 00:37:09.603 "zoned": false, 00:37:09.603 "supported_io_types": { 00:37:09.603 "read": true, 00:37:09.603 "write": true, 00:37:09.603 "unmap": true, 00:37:09.603 "flush": true, 00:37:09.603 "reset": true, 00:37:09.603 "nvme_admin": false, 00:37:09.603 "nvme_io": false, 00:37:09.603 "nvme_io_md": false, 00:37:09.603 "write_zeroes": true, 00:37:09.603 "zcopy": true, 00:37:09.603 "get_zone_info": false, 00:37:09.603 "zone_management": false, 00:37:09.603 "zone_append": false, 00:37:09.603 "compare": false, 00:37:09.603 "compare_and_write": false, 00:37:09.603 "abort": true, 00:37:09.603 "seek_hole": false, 00:37:09.603 "seek_data": false, 00:37:09.603 "copy": true, 00:37:09.603 "nvme_iov_md": false 00:37:09.603 }, 00:37:09.603 "memory_domains": [ 00:37:09.603 { 00:37:09.603 "dma_device_id": "system", 00:37:09.603 "dma_device_type": 1 00:37:09.603 }, 00:37:09.603 { 00:37:09.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:09.861 "dma_device_type": 2 00:37:09.861 } 00:37:09.861 ], 00:37:09.861 "driver_specific": {} 00:37:09.861 } 00:37:09.861 ] 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd 
save_subsystem_config -n iobuf 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f71d6c3c-7b57-56d8-8e08-90fb3c295732"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f71d6c3c-7b57-56d8-8e08-90fb3c295732",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "62012f7c-3a4b-5896-83a6-c62b9ad856bf"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "62012f7c-3a4b-5896-83a6-c62b9ad856bf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # 
hello_world_bdev=crypto_ram 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:37:09.861 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 1351272 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # '[' -z 1351272 ']' 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # kill -0 1351272 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # uname 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1351272 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1351272' 00:37:09.861 killing process with pid 1351272 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@969 -- # kill 1351272 00:37:09.861 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@974 -- # wait 1351272 00:37:10.119 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:37:10.119 06:53:23 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:37:10.119 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:37:10.377 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:10.377 06:53:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:10.377 ************************************ 00:37:10.377 START TEST bdev_hello_world 00:37:10.377 ************************************ 00:37:10.377 06:53:23 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:37:10.377 [2024-07-25 06:53:23.774624] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
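The stack this hello_world run (and the bdevio and nbd runs after it) loads comes from test/bdev/bdev.json: crypto_ram sits on Malloc0 with key "test_dek_sw", crypto_ram2 on Malloc1 with "test_dek_sw2", and crypto_ram3 on crypto_ram2 with "test_dek_sw3", matching the bdev_get_bdevs output above. A rough rpc.py equivalent is sketched below; the bdev and key names and sizes are taken from this log, but the exact bdev_crypto_create options differ between SPDK releases and the keys must already be registered with the accel layer (the saved accel subsystem config covers that here), so treat the flags as assumptions rather than the test's literal config.
  # assumed flags; recent SPDK releases reference a pre-registered accel key via -n
  ./scripts/rpc.py bdev_malloc_create -b Malloc0 16 512
  ./scripts/rpc.py bdev_malloc_create -b Malloc1 16 4096
  ./scripts/rpc.py bdev_crypto_create -n test_dek_sw Malloc0 crypto_ram
  ./scripts/rpc.py bdev_crypto_create -n test_dek_sw2 Malloc1 crypto_ram2
  ./scripts/rpc.py bdev_crypto_create -n test_dek_sw3 crypto_ram2 crypto_ram3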
00:37:10.377 [2024-07-25 06:53:23.774678] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351563 ] 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.377 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:10.377 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.6 cannot be used 
00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:10.378 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:10.378 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:10.378 [2024-07-25 06:53:23.909605] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:10.636 [2024-07-25 06:53:23.953405] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:10.636 [2024-07-25 06:53:24.112516] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:10.636 [2024-07-25 06:53:24.112576] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:10.636 [2024-07-25 06:53:24.112590] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:10.636 [2024-07-25 06:53:24.120534] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:10.636 [2024-07-25 06:53:24.120551] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:10.636 [2024-07-25 06:53:24.120562] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:10.636 [2024-07-25 06:53:24.128554] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:10.636 [2024-07-25 06:53:24.128570] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:37:10.636 [2024-07-25 06:53:24.128580] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:10.636 [2024-07-25 06:53:24.168244] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:37:10.636 [2024-07-25 06:53:24.168275] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:37:10.636 [2024-07-25 06:53:24.168291] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:37:10.636 [2024-07-25 06:53:24.169581] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:37:10.636 [2024-07-25 06:53:24.169642] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:37:10.636 [2024-07-25 06:53:24.169656] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:37:10.636 [2024-07-25 06:53:24.169687] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
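The write/read round trip above can be repeated by hand with the same example binary and config the harness uses (run from the spdk checkout in this workspace); on success it prints the same "Hello World!" read-back:
  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram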
00:37:10.636 00:37:10.636 [2024-07-25 06:53:24.169703] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:37:10.894 00:37:10.894 real 0m0.627s 00:37:10.894 user 0m0.398s 00:37:10.894 sys 0m0.216s 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:37:10.894 ************************************ 00:37:10.894 END TEST bdev_hello_world 00:37:10.894 ************************************ 00:37:10.894 06:53:24 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:37:10.894 06:53:24 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:37:10.894 06:53:24 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:10.894 06:53:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:10.894 ************************************ 00:37:10.894 START TEST bdev_bounds 00:37:10.894 ************************************ 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1351678 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1351678' 00:37:10.894 Process bdevio pid: 1351678 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1351678 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1351678 ']' 00:37:10.894 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:10.895 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:37:10.895 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:10.895 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:10.895 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:37:10.895 06:53:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:37:11.154 [2024-07-25 06:53:24.484746] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:37:11.154 [2024-07-25 06:53:24.484805] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351678 ] 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.6 cannot be used 
00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:11.154 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:11.154 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:11.154 [2024-07-25 06:53:24.608721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:37:11.154 [2024-07-25 06:53:24.656934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:11.154 [2024-07-25 06:53:24.657029] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:37:11.154 [2024-07-25 06:53:24.657033] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:11.414 [2024-07-25 06:53:24.811739] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:11.414 [2024-07-25 06:53:24.811803] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:11.414 [2024-07-25 06:53:24.811817] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:11.414 [2024-07-25 06:53:24.819759] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:11.414 [2024-07-25 06:53:24.819776] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:11.414 [2024-07-25 06:53:24.819786] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:11.414 [2024-07-25 06:53:24.827781] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:11.414 [2024-07-25 06:53:24.827797] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:37:11.414 [2024-07-25 06:53:24.827807] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:11.982 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:37:11.982 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:37:11.982 06:53:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:37:11.982 I/O targets: 00:37:11.982 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:37:11.982 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:37:11.982 00:37:11.982 00:37:11.982 CUnit - A unit testing framework for C - Version 2.1-3 00:37:11.982 http://cunit.sourceforge.net/ 00:37:11.982 
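The bdevio run whose CUnit output follows is driven by two pieces visible in this log: the bdevio app loads the same bdev.json and waits for an RPC trigger (-w), and tests.py then fires perform_tests against it. Run manually from the spdk checkout this is roughly the sequence below (backgrounding bdevio is only for illustration; the harness manages the process itself):
  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json &
  ./test/bdev/bdevio/tests.py perform_tests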
00:37:11.982 00:37:11.982 Suite: bdevio tests on: crypto_ram3 00:37:11.982 Test: blockdev write read block ...passed 00:37:11.982 Test: blockdev write zeroes read block ...passed 00:37:11.982 Test: blockdev write zeroes read no split ...passed 00:37:11.982 Test: blockdev write zeroes read split ...passed 00:37:11.982 Test: blockdev write zeroes read split partial ...passed 00:37:11.982 Test: blockdev reset ...passed 00:37:11.982 Test: blockdev write read 8 blocks ...passed 00:37:11.982 Test: blockdev write read size > 128k ...passed 00:37:11.982 Test: blockdev write read invalid size ...passed 00:37:11.982 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:37:11.982 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:37:11.982 Test: blockdev write read max offset ...passed 00:37:11.982 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:37:11.982 Test: blockdev writev readv 8 blocks ...passed 00:37:11.982 Test: blockdev writev readv 30 x 1block ...passed 00:37:11.983 Test: blockdev writev readv block ...passed 00:37:11.983 Test: blockdev writev readv size > 128k ...passed 00:37:11.983 Test: blockdev writev readv size > 128k in two iovs ...passed 00:37:11.983 Test: blockdev comparev and writev ...passed 00:37:11.983 Test: blockdev nvme passthru rw ...passed 00:37:11.983 Test: blockdev nvme passthru vendor specific ...passed 00:37:11.983 Test: blockdev nvme admin passthru ...passed 00:37:11.983 Test: blockdev copy ...passed 00:37:11.983 Suite: bdevio tests on: crypto_ram 00:37:11.983 Test: blockdev write read block ...passed 00:37:11.983 Test: blockdev write zeroes read block ...passed 00:37:11.983 Test: blockdev write zeroes read no split ...passed 00:37:11.983 Test: blockdev write zeroes read split ...passed 00:37:11.983 Test: blockdev write zeroes read split partial ...passed 00:37:11.983 Test: blockdev reset ...passed 00:37:11.983 Test: blockdev write read 8 blocks ...passed 00:37:11.983 Test: blockdev write read size > 128k ...passed 00:37:11.983 Test: blockdev write read invalid size ...passed 00:37:11.983 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:37:11.983 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:37:11.983 Test: blockdev write read max offset ...passed 00:37:11.983 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:37:11.983 Test: blockdev writev readv 8 blocks ...passed 00:37:11.983 Test: blockdev writev readv 30 x 1block ...passed 00:37:11.983 Test: blockdev writev readv block ...passed 00:37:11.983 Test: blockdev writev readv size > 128k ...passed 00:37:11.983 Test: blockdev writev readv size > 128k in two iovs ...passed 00:37:11.983 Test: blockdev comparev and writev ...passed 00:37:11.983 Test: blockdev nvme passthru rw ...passed 00:37:11.983 Test: blockdev nvme passthru vendor specific ...passed 00:37:11.983 Test: blockdev nvme admin passthru ...passed 00:37:11.983 Test: blockdev copy ...passed 00:37:11.983 00:37:11.983 Run Summary: Type Total Ran Passed Failed Inactive 00:37:11.983 suites 2 2 n/a 0 0 00:37:11.983 tests 46 46 46 0 0 00:37:11.983 asserts 260 260 260 0 n/a 00:37:11.983 00:37:11.983 Elapsed time = 0.078 seconds 00:37:11.983 0 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1351678 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1351678 ']' 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@954 -- # kill -0 1351678 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1351678 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1351678' 00:37:12.242 killing process with pid 1351678 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1351678 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1351678 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:37:12.242 00:37:12.242 real 0m1.356s 00:37:12.242 user 0m3.656s 00:37:12.242 sys 0m0.370s 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:12.242 06:53:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:37:12.242 ************************************ 00:37:12.242 END TEST bdev_bounds 00:37:12.242 ************************************ 00:37:12.502 06:53:25 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:37:12.502 06:53:25 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:37:12.502 06:53:25 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:12.502 06:53:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:12.502 ************************************ 00:37:12.502 START TEST bdev_nbd 00:37:12.502 ************************************ 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:37:12.502 06:53:25 
blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1351897 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1351897 /var/tmp/spdk-nbd.sock 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1351897 ']' 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:37:12.502 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:37:12.502 06:53:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:37:12.502 [2024-07-25 06:53:25.931796] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
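The nbd test starting here exports each crypto bdev as a kernel block device over the dedicated /var/tmp/spdk-nbd.sock RPC socket and then does raw dd/cmp I/O against /dev/nbd0 and /dev/nbd1. Stripped of the harness plumbing, the sequence recorded below amounts to roughly this (bdev_svc is assumed to be already running with -r /var/tmp/spdk-nbd.sock and the same bdev.json):
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1
  dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
  dd if=nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
  cmp -b -n 1M nbdrandtest /dev/nbd0
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1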
00:37:12.502 [2024-07-25 06:53:25.931851] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:12.502 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.502 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:12.502 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.503 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:12.503 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.503 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:12.503 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.503 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:12.503 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.503 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:12.503 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:12.503 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:12.762 [2024-07-25 06:53:26.066726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:12.762 [2024-07-25 06:53:26.111728] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:12.762 [2024-07-25 06:53:26.271234] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:12.762 [2024-07-25 06:53:26.271291] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:12.762 [2024-07-25 06:53:26.271305] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:12.762 [2024-07-25 06:53:26.279253] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:12.762 [2024-07-25 06:53:26.279271] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:12.762 [2024-07-25 06:53:26.279282] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:12.762 [2024-07-25 06:53:26.287276] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:12.762 [2024-07-25 06:53:26.287294] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:37:12.762 [2024-07-25 06:53:26.287304] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:37:13.329 06:53:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:13.587 1+0 records in 00:37:13.587 1+0 records out 00:37:13.587 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266914 s, 15.3 MB/s 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:37:13.587 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:37:13.588 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:37:13.588 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:37:13.588 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:37:13.846 06:53:27 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:13.846 1+0 records in 00:37:13.846 1+0 records out 00:37:13.846 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329355 s, 12.4 MB/s 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:37:13.846 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:37:14.104 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:37:14.104 { 00:37:14.104 "nbd_device": "/dev/nbd0", 00:37:14.104 "bdev_name": "crypto_ram" 00:37:14.104 }, 00:37:14.104 { 00:37:14.104 "nbd_device": "/dev/nbd1", 00:37:14.104 "bdev_name": "crypto_ram3" 00:37:14.104 } 00:37:14.104 ]' 00:37:14.104 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:37:14.104 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:37:14.104 { 00:37:14.105 "nbd_device": "/dev/nbd0", 00:37:14.105 "bdev_name": "crypto_ram" 00:37:14.105 }, 00:37:14.105 { 00:37:14.105 "nbd_device": "/dev/nbd1", 00:37:14.105 "bdev_name": "crypto_ram3" 00:37:14.105 } 00:37:14.105 ]' 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:14.105 06:53:27 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:14.105 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:14.363 06:53:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:14.622 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 
-- # echo 0 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:14.880 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:37:15.138 /dev/nbd0 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:15.138 1+0 records in 00:37:15.138 1+0 records out 00:37:15.138 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237473 s, 
17.2 MB/s 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:37:15.138 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:37:15.139 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:15.139 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:15.139 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:37:15.397 /dev/nbd1 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:37:15.397 1+0 records in 00:37:15.397 1+0 records out 00:37:15.397 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310588 s, 13.2 MB/s 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:15.397 06:53:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:37:15.655 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:37:15.655 { 00:37:15.655 "nbd_device": "/dev/nbd0", 00:37:15.655 "bdev_name": "crypto_ram" 00:37:15.655 }, 00:37:15.655 { 00:37:15.655 "nbd_device": "/dev/nbd1", 00:37:15.655 "bdev_name": "crypto_ram3" 00:37:15.655 } 00:37:15.655 ]' 00:37:15.655 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:37:15.655 { 00:37:15.655 "nbd_device": "/dev/nbd0", 00:37:15.655 "bdev_name": "crypto_ram" 00:37:15.655 }, 00:37:15.655 { 00:37:15.655 "nbd_device": "/dev/nbd1", 00:37:15.655 "bdev_name": "crypto_ram3" 00:37:15.655 } 00:37:15.656 ]' 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:37:15.656 /dev/nbd1' 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:37:15.656 /dev/nbd1' 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:37:15.656 256+0 records in 00:37:15.656 256+0 records out 00:37:15.656 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109677 s, 95.6 MB/s 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:37:15.656 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:37:15.914 256+0 records in 00:37:15.914 256+0 records out 00:37:15.914 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0181753 s, 57.7 MB/s 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:37:15.914 256+0 records in 00:37:15.914 256+0 records out 00:37:15.914 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.032837 s, 31.9 MB/s 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:15.914 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:16.173 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:37:16.433 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:37:16.693 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:37:16.693 06:53:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:37:16.693 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:37:16.951 malloc_lvol_verify 00:37:16.952 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:37:16.952 996d8783-876d-40af-ae3b-575eb9b707cb 00:37:16.952 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:37:17.211 a1b45dc5-72f4-4198-bf86-0c99dd3a8ac1 00:37:17.211 06:53:30 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:37:17.470 /dev/nbd0 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:37:17.470 mke2fs 1.46.5 (30-Dec-2021) 00:37:17.470 Discarding device blocks: 0/4096 done 00:37:17.470 Creating filesystem with 4096 1k blocks and 1024 inodes 00:37:17.470 00:37:17.470 Allocating group tables: 0/1 done 00:37:17.470 Writing inode tables: 0/1 done 00:37:17.470 Creating journal (1024 blocks): done 00:37:17.470 Writing superblocks and filesystem accounting information: 0/1 done 00:37:17.470 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:17.470 06:53:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:37:17.729 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1351897 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1351897 ']' 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1351897 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1351897 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 
1351897' 00:37:17.730 killing process with pid 1351897 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1351897 00:37:17.730 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1351897 00:37:17.989 06:53:31 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:37:17.990 00:37:17.990 real 0m5.585s 00:37:17.990 user 0m7.954s 00:37:17.990 sys 0m2.249s 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:37:17.990 ************************************ 00:37:17.990 END TEST bdev_nbd 00:37:17.990 ************************************ 00:37:17.990 06:53:31 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:37:17.990 06:53:31 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:37:17.990 06:53:31 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:37:17.990 06:53:31 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:37:17.990 06:53:31 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:37:17.990 06:53:31 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:17.990 06:53:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:17.990 ************************************ 00:37:17.990 START TEST bdev_fio 00:37:17.990 ************************************ 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:37:17.990 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:37:17.990 06:53:31 
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:37:17.990 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:37:18.250 ************************************ 00:37:18.250 START TEST bdev_fio_rw_verify 00:37:18.250 ************************************ 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:37:18.250 06:53:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.509 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:18.509 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 
4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:18.509 fio-3.35 00:37:18.509 Starting 2 threads 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:18.769 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:18.769 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:31.008 00:37:31.008 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1353249: Thu Jul 25 06:53:42 2024 00:37:31.008 read: IOPS=27.8k, BW=109MiB/s (114MB/s)(1086MiB/10001msec) 00:37:31.008 slat (usec): min=8, max=1977, avg=15.93, stdev= 6.32 00:37:31.008 clat (usec): min=4, max=2154, avg=114.96, stdev=59.69 00:37:31.008 lat (usec): min=13, max=2167, avg=130.89, stdev=63.07 00:37:31.008 clat percentiles (usec): 00:37:31.008 | 50.000th=[ 105], 99.000th=[ 255], 99.900th=[ 277], 99.990th=[ 326], 00:37:31.008 | 99.999th=[ 2114] 00:37:31.008 write: IOPS=33.5k, BW=131MiB/s (137MB/s)(1242MiB/9489msec); 0 zone resets 00:37:31.008 slat (usec): min=9, max=250, avg=26.50, stdev= 7.85 00:37:31.008 clat (usec): min=16, max=1205, avg=154.06, stdev=88.79 00:37:31.008 lat (usec): min=33, max=1363, avg=180.56, stdev=94.00 00:37:31.008 clat percentiles (usec): 00:37:31.008 | 50.000th=[ 139], 99.000th=[ 375], 99.900th=[ 469], 99.990th=[ 783], 00:37:31.008 | 99.999th=[ 1156] 00:37:31.008 bw ( KiB/s): min=101704, max=134760, per=94.73%, avg=127009.26, stdev=3854.44, samples=38 00:37:31.008 iops : min=25426, max=33690, avg=31752.32, stdev=963.61, samples=38 00:37:31.008 lat (usec) : 10=0.01%, 20=0.01%, 50=10.72%, 100=27.43%, 250=52.68% 00:37:31.008 lat (usec) : 500=9.13%, 750=0.02%, 1000=0.01% 00:37:31.008 lat (msec) : 2=0.01%, 4=0.01% 00:37:31.008 cpu : usr=99.66%, sys=0.01%, ctx=35, majf=0, minf=441 00:37:31.008 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:37:31.008 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:31.008 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:31.008 issued rwts: total=277931,318061,0,0 short=0,0,0,0 dropped=0,0,0,0 00:37:31.008 latency : target=0, window=0, percentile=100.00%, depth=8 00:37:31.008 00:37:31.008 Run status group 0 (all jobs): 00:37:31.008 READ: bw=109MiB/s (114MB/s), 109MiB/s-109MiB/s (114MB/s-114MB/s), io=1086MiB (1138MB), run=10001-10001msec 00:37:31.008 WRITE: bw=131MiB/s (137MB/s), 131MiB/s-131MiB/s (137MB/s-137MB/s), io=1242MiB (1303MB), run=9489-9489msec 00:37:31.008 00:37:31.008 real 0m11.143s 00:37:31.008 user 0m32.253s 00:37:31.008 sys 0m0.381s 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:37:31.008 
************************************ 00:37:31.008 END TEST bdev_fio_rw_verify 00:37:31.008 ************************************ 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f71d6c3c-7b57-56d8-8e08-90fb3c295732"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f71d6c3c-7b57-56d8-8e08-90fb3c295732",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "62012f7c-3a4b-5896-83a6-c62b9ad856bf"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "62012f7c-3a4b-5896-83a6-c62b9ad856bf",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:37:31.008 crypto_ram3 ]] 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f71d6c3c-7b57-56d8-8e08-90fb3c295732"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f71d6c3c-7b57-56d8-8e08-90fb3c295732",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "62012f7c-3a4b-5896-83a6-c62b9ad856bf"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "62012f7c-3a4b-5896-83a6-c62b9ad856bf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:31.008 06:53:42 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:37:31.008 ************************************ 00:37:31.008 START TEST bdev_fio_trim 00:37:31.008 ************************************ 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:31.009 06:53:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:31.009 06:53:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:31.009 06:53:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:31.009 06:53:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:37:31.009 06:53:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:31.009 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:31.009 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:31.009 fio-3.35 00:37:31.009 Starting 2 threads 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 
0000:3d:01.7 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:31.009 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:31.009 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:40.978 00:37:40.978 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1355238: Thu Jul 25 06:53:53 2024 00:37:40.978 write: IOPS=41.9k, 
BW=164MiB/s (172MB/s)(1637MiB/10001msec); 0 zone resets 00:37:40.978 slat (usec): min=12, max=1532, avg=20.70, stdev= 4.55 00:37:40.978 clat (usec): min=22, max=1811, avg=157.38, stdev=87.29 00:37:40.978 lat (usec): min=42, max=1876, avg=178.09, stdev=90.36 00:37:40.978 clat percentiles (usec): 00:37:40.978 | 50.000th=[ 125], 99.000th=[ 326], 99.900th=[ 343], 99.990th=[ 465], 00:37:40.978 | 99.999th=[ 734] 00:37:40.978 bw ( KiB/s): min=165680, max=169096, per=100.00%, avg=167763.58, stdev=380.77, samples=38 00:37:40.978 iops : min=41420, max=42274, avg=41940.79, stdev=95.21, samples=38 00:37:40.978 trim: IOPS=41.9k, BW=164MiB/s (172MB/s)(1637MiB/10001msec); 0 zone resets 00:37:40.978 slat (usec): min=5, max=201, avg= 9.45, stdev= 2.29 00:37:40.978 clat (usec): min=43, max=567, avg=105.05, stdev=31.23 00:37:40.978 lat (usec): min=53, max=574, avg=114.50, stdev=31.41 00:37:40.978 clat percentiles (usec): 00:37:40.978 | 50.000th=[ 106], 99.000th=[ 169], 99.900th=[ 180], 99.990th=[ 306], 00:37:40.978 | 99.999th=[ 469] 00:37:40.978 bw ( KiB/s): min=165704, max=169104, per=100.00%, avg=167765.26, stdev=379.92, samples=38 00:37:40.978 iops : min=41426, max=42276, avg=41941.21, stdev=95.00, samples=38 00:37:40.978 lat (usec) : 50=4.01%, 100=36.77%, 250=48.06%, 500=11.15%, 750=0.01% 00:37:40.978 lat (usec) : 1000=0.01% 00:37:40.978 lat (msec) : 2=0.01% 00:37:40.978 cpu : usr=99.65%, sys=0.01%, ctx=28, majf=0, minf=252 00:37:40.978 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:37:40.978 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:40.978 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:40.978 issued rwts: total=0,419136,419137,0 short=0,0,0,0 dropped=0,0,0,0 00:37:40.978 latency : target=0, window=0, percentile=100.00%, depth=8 00:37:40.978 00:37:40.978 Run status group 0 (all jobs): 00:37:40.978 WRITE: bw=164MiB/s (172MB/s), 164MiB/s-164MiB/s (172MB/s-172MB/s), io=1637MiB (1717MB), run=10001-10001msec 00:37:40.978 TRIM: bw=164MiB/s (172MB/s), 164MiB/s-164MiB/s (172MB/s-172MB/s), io=1637MiB (1717MB), run=10001-10001msec 00:37:40.978 00:37:40.978 real 0m11.125s 00:37:40.978 user 0m32.310s 00:37:40.978 sys 0m0.404s 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:37:40.978 ************************************ 00:37:40.978 END TEST bdev_fio_trim 00:37:40.978 ************************************ 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:37:40.978 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:37:40.978 00:37:40.978 real 0m22.585s 00:37:40.978 user 1m4.715s 00:37:40.978 sys 0m0.968s 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:37:40.978 ************************************ 00:37:40.978 END TEST bdev_fio 00:37:40.978 ************************************ 00:37:40.978 06:53:54 blockdev_crypto_sw -- 
bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:37:40.978 06:53:54 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:37:40.978 06:53:54 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:37:40.978 06:53:54 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:40.978 06:53:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:40.978 ************************************ 00:37:40.978 START TEST bdev_verify 00:37:40.978 ************************************ 00:37:40.978 06:53:54 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:37:40.978 [2024-07-25 06:53:54.248580] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:37:40.978 [2024-07-25 06:53:54.248634] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1357061 ] 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.6 
cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.978 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:40.978 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.979 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:40.979 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.979 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:40.979 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.979 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:40.979 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.979 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:40.979 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.979 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:40.979 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:40.979 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:40.979 [2024-07-25 06:53:54.385337] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:40.979 [2024-07-25 06:53:54.430513] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:40.979 [2024-07-25 06:53:54.430518] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:41.237 [2024-07-25 06:53:54.585478] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:41.237 [2024-07-25 06:53:54.585542] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:41.237 [2024-07-25 06:53:54.585556] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:41.237 [2024-07-25 06:53:54.593499] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:41.237 [2024-07-25 06:53:54.593516] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:41.237 [2024-07-25 06:53:54.593527] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:41.237 [2024-07-25 
06:53:54.601524] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:41.237 [2024-07-25 06:53:54.601540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:37:41.237 [2024-07-25 06:53:54.601551] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:41.237 Running I/O for 5 seconds... 00:37:46.510 00:37:46.510 Latency(us) 00:37:46.510 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:46.510 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:37:46.510 Verification LBA range: start 0x0 length 0x800 00:37:46.510 crypto_ram : 5.02 5630.99 22.00 0.00 0.00 22629.09 1474.56 29360.13 00:37:46.510 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:37:46.510 Verification LBA range: start 0x800 length 0x800 00:37:46.510 crypto_ram : 5.02 5632.58 22.00 0.00 0.00 22623.16 1769.47 29360.13 00:37:46.510 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:37:46.510 Verification LBA range: start 0x0 length 0x800 00:37:46.510 crypto_ram3 : 5.03 2824.39 11.03 0.00 0.00 45053.96 1585.97 33764.15 00:37:46.510 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:37:46.510 Verification LBA range: start 0x800 length 0x800 00:37:46.510 crypto_ram3 : 5.03 2825.14 11.04 0.00 0.00 45039.93 1939.87 33764.15 00:37:46.510 =================================================================================================================== 00:37:46.510 Total : 16913.10 66.07 0.00 0.00 30122.24 1474.56 33764.15 00:37:46.510 00:37:46.510 real 0m5.691s 00:37:46.510 user 0m10.772s 00:37:46.510 sys 0m0.237s 00:37:46.510 06:53:59 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:46.510 06:53:59 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:37:46.510 ************************************ 00:37:46.510 END TEST bdev_verify 00:37:46.510 ************************************ 00:37:46.510 06:53:59 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:37:46.510 06:53:59 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:37:46.510 06:53:59 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:46.510 06:53:59 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:46.510 ************************************ 00:37:46.510 START TEST bdev_verify_big_io 00:37:46.510 ************************************ 00:37:46.510 06:53:59 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:37:46.510 [2024-07-25 06:54:00.025222] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:37:46.510 [2024-07-25 06:54:00.025279] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1357875 ] 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:46.770 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.770 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:46.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.771 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:46.771 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.771 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:46.771 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.771 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:46.771 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.771 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:46.771 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.771 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:46.771 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:46.771 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:46.771 [2024-07-25 06:54:00.163731] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:46.771 [2024-07-25 06:54:00.210616] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:46.771 [2024-07-25 06:54:00.210620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:47.030 [2024-07-25 06:54:00.371057] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:47.030 [2024-07-25 06:54:00.371113] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:47.030 [2024-07-25 06:54:00.371126] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:47.030 [2024-07-25 06:54:00.379078] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:47.030 [2024-07-25 06:54:00.379096] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:47.030 [2024-07-25 06:54:00.379107] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:47.030 [2024-07-25 06:54:00.387100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:47.030 [2024-07-25 06:54:00.387124] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:37:47.030 [2024-07-25 06:54:00.387135] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:47.030 Running I/O for 5 seconds... 
00:37:52.304 00:37:52.304 Latency(us) 00:37:52.304 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:52.304 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:37:52.304 Verification LBA range: start 0x0 length 0x80 00:37:52.304 crypto_ram : 5.37 453.13 28.32 0.00 0.00 276166.49 6422.53 379165.08 00:37:52.304 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:37:52.304 Verification LBA range: start 0x80 length 0x80 00:37:52.304 crypto_ram : 5.35 454.45 28.40 0.00 0.00 275479.94 5557.45 375809.64 00:37:52.304 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:37:52.304 Verification LBA range: start 0x0 length 0x80 00:37:52.304 crypto_ram3 : 5.38 237.95 14.87 0.00 0.00 506703.02 5111.81 379165.08 00:37:52.304 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:37:52.304 Verification LBA range: start 0x80 length 0x80 00:37:52.304 crypto_ram3 : 5.37 238.53 14.91 0.00 0.00 505669.71 6081.74 385875.97 00:37:52.304 =================================================================================================================== 00:37:52.304 Total : 1384.06 86.50 0.00 0.00 355258.79 5111.81 385875.97 00:37:52.601 00:37:52.601 real 0m6.058s 00:37:52.601 user 0m11.494s 00:37:52.601 sys 0m0.234s 00:37:52.601 06:54:06 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:52.601 06:54:06 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:37:52.601 ************************************ 00:37:52.601 END TEST bdev_verify_big_io 00:37:52.601 ************************************ 00:37:52.601 06:54:06 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:52.601 06:54:06 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:37:52.601 06:54:06 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:52.601 06:54:06 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:52.601 ************************************ 00:37:52.601 START TEST bdev_write_zeroes 00:37:52.601 ************************************ 00:37:52.601 06:54:06 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:52.861 [2024-07-25 06:54:06.157637] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:37:52.861 [2024-07-25 06:54:06.157690] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1359475 ] 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:52.861 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:52.861 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:52.861 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:52.861 [2024-07-25 06:54:06.293020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:52.861 [2024-07-25 06:54:06.337210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:53.121 [2024-07-25 06:54:06.490618] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:53.121 [2024-07-25 06:54:06.490677] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:37:53.121 [2024-07-25 06:54:06.490690] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:53.121 [2024-07-25 06:54:06.498636] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:37:53.121 [2024-07-25 06:54:06.498653] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:37:53.121 [2024-07-25 06:54:06.498664] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:53.121 [2024-07-25 06:54:06.506657] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:37:53.121 [2024-07-25 06:54:06.506673] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:37:53.121 [2024-07-25 06:54:06.506688] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:37:53.121 Running I/O for 1 seconds... 
00:37:54.053 00:37:54.053 Latency(us) 00:37:54.053 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:54.053 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:37:54.053 crypto_ram : 1.01 28525.50 111.43 0.00 0.00 4476.66 1913.65 6239.03 00:37:54.053 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:37:54.053 crypto_ram3 : 1.01 14292.63 55.83 0.00 0.00 8895.92 3093.30 9279.90 00:37:54.053 =================================================================================================================== 00:37:54.053 Total : 42818.13 167.26 0.00 0.00 5954.10 1913.65 9279.90 00:37:54.311 00:37:54.311 real 0m1.645s 00:37:54.311 user 0m1.399s 00:37:54.311 sys 0m0.229s 00:37:54.311 06:54:07 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:54.311 06:54:07 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:37:54.311 ************************************ 00:37:54.311 END TEST bdev_write_zeroes 00:37:54.311 ************************************ 00:37:54.311 06:54:07 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:54.311 06:54:07 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:37:54.311 06:54:07 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:54.311 06:54:07 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:54.311 ************************************ 00:37:54.311 START TEST bdev_json_nonenclosed 00:37:54.311 ************************************ 00:37:54.311 06:54:07 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:54.570 [2024-07-25 06:54:07.884916] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:37:54.570 [2024-07-25 06:54:07.884970] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1359764 ] 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:54.570 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:54.570 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.570 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:54.570 [2024-07-25 06:54:08.006641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:54.570 [2024-07-25 06:54:08.050213] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:54.570 [2024-07-25 06:54:08.050277] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:37:54.570 [2024-07-25 06:54:08.050293] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:37:54.570 [2024-07-25 06:54:08.050304] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:37:54.570 00:37:54.570 real 0m0.297s 00:37:54.570 user 0m0.152s 00:37:54.570 sys 0m0.143s 00:37:54.570 06:54:08 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:54.570 06:54:08 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:37:54.570 ************************************ 00:37:54.570 END TEST bdev_json_nonenclosed 00:37:54.570 ************************************ 00:37:54.829 06:54:08 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:54.829 06:54:08 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:37:54.829 06:54:08 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:54.829 06:54:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:54.829 ************************************ 00:37:54.829 START TEST bdev_json_nonarray 00:37:54.829 ************************************ 00:37:54.829 06:54:08 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:37:54.829 [2024-07-25 06:54:08.252399] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:37:54.829 [2024-07-25 06:54:08.252457] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1359790 ] 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.2 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:54.829 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:54.829 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:54.829 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:55.087 [2024-07-25 06:54:08.387382] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:55.087 [2024-07-25 06:54:08.431483] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:55.087 [2024-07-25 06:54:08.431554] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:37:55.087 [2024-07-25 06:54:08.431571] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:37:55.087 [2024-07-25 06:54:08.431582] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:37:55.087 00:37:55.087 real 0m0.309s 00:37:55.087 user 0m0.156s 00:37:55.087 sys 0m0.151s 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:37:55.087 ************************************ 00:37:55.087 END TEST bdev_json_nonarray 00:37:55.087 ************************************ 00:37:55.087 06:54:08 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:37:55.087 06:54:08 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:37:55.087 06:54:08 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:37:55.087 06:54:08 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:37:55.087 06:54:08 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:37:55.087 06:54:08 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:37:55.087 06:54:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:37:55.087 ************************************ 00:37:55.087 START TEST bdev_crypto_enomem 00:37:55.087 ************************************ 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # bdev_crypto_enomem 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 
-- # local err_dev=EE_base0 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:37:55.087 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=1359962 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 1359962 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # '[' -z 1359962 ']' 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # local max_retries=100 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:55.088 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@840 -- # xtrace_disable 00:37:55.088 06:54:08 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:37:55.088 [2024-07-25 06:54:08.625733] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:37:55.088 [2024-07-25 06:54:08.625787] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1359962 ] 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.0 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.1 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.2 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.3 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.4 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.5 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.6 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:01.7 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:02.0 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:02.1 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:02.2 cannot be used 
00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:02.3 cannot be used 00:37:55.345 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.345 EAL: Requested device 0000:3d:02.4 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3d:02.5 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3d:02.6 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3d:02.7 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.0 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.1 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.2 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.3 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.4 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.5 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.6 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:01.7 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.0 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.1 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.2 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.3 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.4 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.5 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.6 cannot be used 00:37:55.346 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:37:55.346 EAL: Requested device 0000:3f:02.7 cannot be used 00:37:55.346 [2024-07-25 06:54:08.750024] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:55.346 [2024-07-25 06:54:08.795422] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@864 -- # return 0 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:37:56.280 true 00:37:56.280 base0 00:37:56.280 true 00:37:56.280 [2024-07-25 06:54:09.557685] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:37:56.280 crypt0 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_name=crypt0 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # local i 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:37:56.280 [ 00:37:56.280 { 00:37:56.280 "name": "crypt0", 00:37:56.280 "aliases": [ 00:37:56.280 "843c2ba4-b96c-5004-8947-802b7d90510a" 00:37:56.280 ], 00:37:56.280 "product_name": "crypto", 00:37:56.280 "block_size": 512, 00:37:56.280 "num_blocks": 2097152, 00:37:56.280 "uuid": "843c2ba4-b96c-5004-8947-802b7d90510a", 00:37:56.280 "assigned_rate_limits": { 00:37:56.280 "rw_ios_per_sec": 0, 00:37:56.280 "rw_mbytes_per_sec": 0, 00:37:56.280 "r_mbytes_per_sec": 0, 00:37:56.280 "w_mbytes_per_sec": 0 00:37:56.280 }, 00:37:56.280 "claimed": false, 00:37:56.280 "zoned": false, 00:37:56.280 "supported_io_types": { 00:37:56.280 "read": true, 00:37:56.280 "write": true, 00:37:56.280 "unmap": false, 00:37:56.280 "flush": false, 00:37:56.280 "reset": true, 00:37:56.280 "nvme_admin": false, 00:37:56.280 "nvme_io": false, 00:37:56.280 "nvme_io_md": false, 00:37:56.280 "write_zeroes": true, 00:37:56.280 "zcopy": false, 00:37:56.280 "get_zone_info": false, 00:37:56.280 "zone_management": false, 00:37:56.280 "zone_append": false, 00:37:56.280 "compare": false, 00:37:56.280 "compare_and_write": false, 00:37:56.280 "abort": false, 00:37:56.280 "seek_hole": false, 00:37:56.280 "seek_data": false, 00:37:56.280 "copy": false, 00:37:56.280 "nvme_iov_md": false 00:37:56.280 }, 00:37:56.280 "memory_domains": [ 00:37:56.280 { 00:37:56.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:37:56.280 "dma_device_type": 2 00:37:56.280 } 00:37:56.280 ], 00:37:56.280 "driver_specific": { 00:37:56.280 "crypto": { 00:37:56.280 "base_bdev_name": "EE_base0", 00:37:56.280 "name": "crypt0", 00:37:56.280 
"key_name": "test_dek_sw" 00:37:56.280 } 00:37:56.280 } 00:37:56.280 } 00:37:56.280 ] 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@907 -- # return 0 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=1360074 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:37:56.280 06:54:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:37:56.280 Running I/O for 5 seconds... 00:37:57.215 06:54:10 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:37:57.215 06:54:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:37:57.215 06:54:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:37:57.215 06:54:10 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:37:57.215 06:54:10 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 1360074 00:38:01.400 00:38:01.400 Latency(us) 00:38:01.400 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:01.400 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:38:01.400 crypt0 : 5.00 38988.08 152.30 0.00 0.00 817.27 391.58 1081.34 00:38:01.400 =================================================================================================================== 00:38:01.400 Total : 38988.08 152.30 0.00 0.00 817.27 391.58 1081.34 00:38:01.400 0 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 1359962 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # '[' -z 1359962 ']' 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # kill -0 1359962 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # uname 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1359962 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1359962' 00:38:01.400 killing process with pid 1359962 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@969 -- # kill 1359962 00:38:01.400 Received shutdown signal, test time was 
about 5.000000 seconds 00:38:01.400 00:38:01.400 Latency(us) 00:38:01.400 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:01.400 =================================================================================================================== 00:38:01.400 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@974 -- # wait 1359962 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:38:01.400 00:38:01.400 real 0m6.364s 00:38:01.400 user 0m6.602s 00:38:01.400 sys 0m0.347s 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:01.400 06:54:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:38:01.400 ************************************ 00:38:01.400 END TEST bdev_crypto_enomem 00:38:01.400 ************************************ 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:38:01.659 06:54:14 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:38:01.659 00:38:01.659 real 0m53.195s 00:38:01.659 user 1m49.534s 00:38:01.659 sys 0m6.286s 00:38:01.659 06:54:14 blockdev_crypto_sw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:01.659 06:54:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:38:01.659 ************************************ 00:38:01.659 END TEST blockdev_crypto_sw 00:38:01.659 ************************************ 00:38:01.659 06:54:15 -- spdk/autotest.sh@363 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:38:01.659 06:54:15 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:38:01.659 06:54:15 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:01.659 06:54:15 -- common/autotest_common.sh@10 -- # set +x 00:38:01.659 ************************************ 00:38:01.659 START TEST blockdev_crypto_qat 00:38:01.659 ************************************ 00:38:01.659 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:38:01.659 * Looking for test storage... 
00:38:01.659 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # env_ctx= 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1361133 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:38:01.659 06:54:15 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1361133 00:38:01.659 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # '[' -z 1361133 ']' 00:38:01.659 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:01.659 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # local max_retries=100 00:38:01.659 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:38:01.659 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:01.660 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@840 -- # xtrace_disable 00:38:01.660 06:54:15 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:01.918 [2024-07-25 06:54:15.327176] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:38:01.918 [2024-07-25 06:54:15.327310] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1361133 ] 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:02.177 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.177 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:02.177 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.178 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:02.178 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.178 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:02.178 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:02.178 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:02.178 [2024-07-25 06:54:15.545582] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:02.178 [2024-07-25 06:54:15.588548] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:02.745 06:54:16 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:38:02.745 06:54:16 blockdev_crypto_qat -- common/autotest_common.sh@864 -- # return 0 00:38:02.745 06:54:16 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:38:02.745 06:54:16 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:38:02.745 06:54:16 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:38:02.745 06:54:16 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:02.745 06:54:16 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:02.745 [2024-07-25 06:54:16.170423] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:38:02.745 [2024-07-25 06:54:16.178456] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:38:02.745 [2024-07-25 06:54:16.186474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:38:02.745 [2024-07-25 06:54:16.256721] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:38:05.278 true 00:38:05.278 true 00:38:05.278 true 00:38:05.278 true 00:38:05.278 Malloc0 00:38:05.278 Malloc1 00:38:05.278 Malloc2 00:38:05.278 Malloc3 00:38:05.278 [2024-07-25 06:54:18.728042] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:38:05.278 crypto_ram 00:38:05.278 [2024-07-25 06:54:18.736063] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:38:05.278 crypto_ram1 
00:38:05.278 [2024-07-25 06:54:18.744086] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:38:05.278 crypto_ram2 00:38:05.278 [2024-07-25 06:54:18.752108] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:38:05.278 crypto_ram3 00:38:05.278 [ 00:38:05.278 { 00:38:05.278 "name": "Malloc1", 00:38:05.278 "aliases": [ 00:38:05.278 "05c82c06-db7b-4a89-9f6e-b22eb24ecf51" 00:38:05.278 ], 00:38:05.278 "product_name": "Malloc disk", 00:38:05.278 "block_size": 512, 00:38:05.278 "num_blocks": 65536, 00:38:05.278 "uuid": "05c82c06-db7b-4a89-9f6e-b22eb24ecf51", 00:38:05.278 "assigned_rate_limits": { 00:38:05.278 "rw_ios_per_sec": 0, 00:38:05.278 "rw_mbytes_per_sec": 0, 00:38:05.278 "r_mbytes_per_sec": 0, 00:38:05.278 "w_mbytes_per_sec": 0 00:38:05.278 }, 00:38:05.278 "claimed": true, 00:38:05.278 "claim_type": "exclusive_write", 00:38:05.278 "zoned": false, 00:38:05.278 "supported_io_types": { 00:38:05.278 "read": true, 00:38:05.278 "write": true, 00:38:05.278 "unmap": true, 00:38:05.278 "flush": true, 00:38:05.278 "reset": true, 00:38:05.278 "nvme_admin": false, 00:38:05.278 "nvme_io": false, 00:38:05.278 "nvme_io_md": false, 00:38:05.278 "write_zeroes": true, 00:38:05.278 "zcopy": true, 00:38:05.278 "get_zone_info": false, 00:38:05.278 "zone_management": false, 00:38:05.278 "zone_append": false, 00:38:05.278 "compare": false, 00:38:05.278 "compare_and_write": false, 00:38:05.278 "abort": true, 00:38:05.278 "seek_hole": false, 00:38:05.278 "seek_data": false, 00:38:05.278 "copy": true, 00:38:05.278 "nvme_iov_md": false 00:38:05.278 }, 00:38:05.278 "memory_domains": [ 00:38:05.278 { 00:38:05.278 "dma_device_id": "system", 00:38:05.278 "dma_device_type": 1 00:38:05.278 }, 00:38:05.278 { 00:38:05.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:38:05.278 "dma_device_type": 2 00:38:05.278 } 00:38:05.278 ], 00:38:05.278 "driver_specific": {} 00:38:05.278 } 00:38:05.278 ] 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:05.278 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:05.278 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:38:05.278 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:05.278 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:05.278 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:05.538 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:38:05.538 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:05.538 06:54:18 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:38:05.538 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:38:05.538 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:38:05.538 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:38:05.538 06:54:18 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:38:05.538 06:54:18 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0fb69492-45d0-5726-94f6-ac4a2f861355"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0fb69492-45d0-5726-94f6-ac4a2f861355",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "0640b008-aa8d-5e8f-9a51-89fa0e383c94"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0640b008-aa8d-5e8f-9a51-89fa0e383c94",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "dce68e69-bb75-5a52-adab-a0e1068f41fd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dce68e69-bb75-5a52-adab-a0e1068f41fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "9386ba65-9b5a-5697-8448-a104cb4e5629"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9386ba65-9b5a-5697-8448-a104cb4e5629",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:38:05.538 06:54:19 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:38:05.538 06:54:19 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:38:05.538 06:54:19 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:38:05.538 06:54:19 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 1361133 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # '[' -z 1361133 ']' 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # kill -0 1361133 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # uname 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1361133 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1361133' 00:38:05.538 killing process with pid 1361133 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@969 -- # kill 1361133 00:38:05.538 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@974 -- # wait 1361133 00:38:06.106 06:54:19 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:38:06.106 06:54:19 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test 
bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:38:06.106 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:38:06.107 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:06.107 06:54:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:06.107 ************************************ 00:38:06.107 START TEST bdev_hello_world 00:38:06.107 ************************************ 00:38:06.107 06:54:19 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:38:06.107 [2024-07-25 06:54:19.590106] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:38:06.107 [2024-07-25 06:54:19.590176] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1361734 ] 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3d:02.7 cannot be used 
00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:06.366 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:06.366 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:06.366 [2024-07-25 06:54:19.726926] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:06.366 [2024-07-25 06:54:19.769862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:06.366 [2024-07-25 06:54:19.791113] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:38:06.366 [2024-07-25 06:54:19.799146] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:38:06.366 [2024-07-25 06:54:19.807162] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:38:06.366 [2024-07-25 06:54:19.916113] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:38:08.903 [2024-07-25 06:54:22.245476] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:38:08.903 [2024-07-25 06:54:22.245540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:38:08.903 [2024-07-25 06:54:22.245555] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:08.903 [2024-07-25 06:54:22.253496] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_xts" 00:38:08.903 [2024-07-25 06:54:22.253513] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:38:08.903 [2024-07-25 06:54:22.253524] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:08.903 [2024-07-25 06:54:22.261515] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:38:08.903 [2024-07-25 06:54:22.261531] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:38:08.904 [2024-07-25 06:54:22.261541] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:08.904 [2024-07-25 06:54:22.269535] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:38:08.904 [2024-07-25 06:54:22.269551] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:38:08.904 [2024-07-25 06:54:22.269561] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:08.904 [2024-07-25 06:54:22.340760] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:38:08.904 [2024-07-25 06:54:22.340801] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:38:08.904 [2024-07-25 06:54:22.340817] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:38:08.904 [2024-07-25 06:54:22.342118] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:38:08.904 [2024-07-25 06:54:22.342192] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:38:08.904 [2024-07-25 06:54:22.342217] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:38:08.904 [2024-07-25 06:54:22.342256] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:38:08.904 00:38:08.904 [2024-07-25 06:54:22.342273] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:38:09.162 00:38:09.162 real 0m3.095s 00:38:09.162 user 0m2.571s 00:38:09.162 sys 0m0.487s 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:38:09.162 ************************************ 00:38:09.162 END TEST bdev_hello_world 00:38:09.162 ************************************ 00:38:09.162 06:54:22 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:38:09.162 06:54:22 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:38:09.162 06:54:22 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:09.162 06:54:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:09.162 ************************************ 00:38:09.162 START TEST bdev_bounds 00:38:09.162 ************************************ 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1362278 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1362278' 00:38:09.162 Process bdevio pid: 1362278 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1362278 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1362278 ']' 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:09.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:38:09.162 06:54:22 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:38:09.422 [2024-07-25 06:54:22.769516] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:38:09.422 [2024-07-25 06:54:22.769573] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1362278 ] 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.6 cannot be used 
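The bdevio run that is initializing here is driven in two steps: the bdevio app is started with -w so it idles until told to run, and tests.py then issues the perform_tests RPC, which executes the read/write/reset suites printed further below against all four crypto bdevs. A sketch of that sequence under the same paths (the harness additionally waits for the RPC socket before calling tests.py; the plain sleep here is only a stand-in):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # -w: do not start testing until the perform_tests RPC arrives; --json loads the bdev config.
  "$SPDK"/test/bdev/bdevio/bdevio -w -s 0 --json "$SPDK"/test/bdev/bdev.json &
  bdevio_pid=$!
  sleep 1   # stand-in for waiting until /var/tmp/spdk.sock is listening
  "$SPDK"/test/bdev/bdevio/tests.py perform_tests
  kill "$bdevio_pid"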
00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:09.422 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:09.422 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:09.422 [2024-07-25 06:54:22.903798] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:38:09.422 [2024-07-25 06:54:22.951318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:09.422 [2024-07-25 06:54:22.951412] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:38:09.422 [2024-07-25 06:54:22.951413] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:09.422 [2024-07-25 06:54:22.972714] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:38:09.682 [2024-07-25 06:54:22.980742] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:38:09.682 [2024-07-25 06:54:22.988764] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:38:09.682 [2024-07-25 06:54:23.092194] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:38:12.215 [2024-07-25 06:54:25.405137] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:38:12.215 [2024-07-25 06:54:25.405205] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:38:12.215 [2024-07-25 06:54:25.405220] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:12.215 [2024-07-25 06:54:25.413162] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:38:12.215 [2024-07-25 06:54:25.413179] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:38:12.215 [2024-07-25 06:54:25.413190] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:12.215 [2024-07-25 06:54:25.421184] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:38:12.215 [2024-07-25 06:54:25.421200] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:38:12.215 [2024-07-25 06:54:25.421211] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:12.215 [2024-07-25 06:54:25.429204] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts2" 00:38:12.215 [2024-07-25 06:54:25.429225] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:38:12.215 [2024-07-25 06:54:25.429236] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:12.215 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:38:12.215 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:38:12.215 06:54:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:38:12.215 I/O targets: 00:38:12.215 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:38:12.215 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:38:12.215 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:38:12.215 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:38:12.215 00:38:12.215 00:38:12.215 CUnit - A unit testing framework for C - Version 2.1-3 00:38:12.215 http://cunit.sourceforge.net/ 00:38:12.215 00:38:12.215 00:38:12.215 Suite: bdevio tests on: crypto_ram3 00:38:12.215 Test: blockdev write read block ...passed 00:38:12.215 Test: blockdev write zeroes read block ...passed 00:38:12.215 Test: blockdev write zeroes read no split ...passed 00:38:12.215 Test: blockdev write zeroes read split ...passed 00:38:12.215 Test: blockdev write zeroes read split partial ...passed 00:38:12.215 Test: blockdev reset ...passed 00:38:12.215 Test: blockdev write read 8 blocks ...passed 00:38:12.215 Test: blockdev write read size > 128k ...passed 00:38:12.215 Test: blockdev write read invalid size ...passed 00:38:12.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:38:12.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:38:12.215 Test: blockdev write read max offset ...passed 00:38:12.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:38:12.215 Test: blockdev writev readv 8 blocks ...passed 00:38:12.215 Test: blockdev writev readv 30 x 1block ...passed 00:38:12.215 Test: blockdev writev readv block ...passed 00:38:12.215 Test: blockdev writev readv size > 128k ...passed 00:38:12.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:38:12.215 Test: blockdev comparev and writev ...passed 00:38:12.215 Test: blockdev nvme passthru rw ...passed 00:38:12.215 Test: blockdev nvme passthru vendor specific ...passed 00:38:12.215 Test: blockdev nvme admin passthru ...passed 00:38:12.215 Test: blockdev copy ...passed 00:38:12.215 Suite: bdevio tests on: crypto_ram2 00:38:12.215 Test: blockdev write read block ...passed 00:38:12.215 Test: blockdev write zeroes read block ...passed 00:38:12.215 Test: blockdev write zeroes read no split ...passed 00:38:12.215 Test: blockdev write zeroes read split ...passed 00:38:12.215 Test: blockdev write zeroes read split partial ...passed 00:38:12.215 Test: blockdev reset ...passed 00:38:12.215 Test: blockdev write read 8 blocks ...passed 00:38:12.215 Test: blockdev write read size > 128k ...passed 00:38:12.215 Test: blockdev write read invalid size ...passed 00:38:12.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:38:12.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:38:12.215 Test: blockdev write read max offset ...passed 00:38:12.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:38:12.215 Test: blockdev 
writev readv 8 blocks ...passed 00:38:12.215 Test: blockdev writev readv 30 x 1block ...passed 00:38:12.215 Test: blockdev writev readv block ...passed 00:38:12.215 Test: blockdev writev readv size > 128k ...passed 00:38:12.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:38:12.215 Test: blockdev comparev and writev ...passed 00:38:12.215 Test: blockdev nvme passthru rw ...passed 00:38:12.215 Test: blockdev nvme passthru vendor specific ...passed 00:38:12.215 Test: blockdev nvme admin passthru ...passed 00:38:12.215 Test: blockdev copy ...passed 00:38:12.215 Suite: bdevio tests on: crypto_ram1 00:38:12.215 Test: blockdev write read block ...passed 00:38:12.215 Test: blockdev write zeroes read block ...passed 00:38:12.215 Test: blockdev write zeroes read no split ...passed 00:38:12.215 Test: blockdev write zeroes read split ...passed 00:38:12.215 Test: blockdev write zeroes read split partial ...passed 00:38:12.215 Test: blockdev reset ...passed 00:38:12.215 Test: blockdev write read 8 blocks ...passed 00:38:12.475 Test: blockdev write read size > 128k ...passed 00:38:12.475 Test: blockdev write read invalid size ...passed 00:38:12.475 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:38:12.475 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:38:12.475 Test: blockdev write read max offset ...passed 00:38:12.475 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:38:12.475 Test: blockdev writev readv 8 blocks ...passed 00:38:12.475 Test: blockdev writev readv 30 x 1block ...passed 00:38:12.475 Test: blockdev writev readv block ...passed 00:38:12.475 Test: blockdev writev readv size > 128k ...passed 00:38:12.475 Test: blockdev writev readv size > 128k in two iovs ...passed 00:38:12.475 Test: blockdev comparev and writev ...passed 00:38:12.475 Test: blockdev nvme passthru rw ...passed 00:38:12.475 Test: blockdev nvme passthru vendor specific ...passed 00:38:12.475 Test: blockdev nvme admin passthru ...passed 00:38:12.475 Test: blockdev copy ...passed 00:38:12.475 Suite: bdevio tests on: crypto_ram 00:38:12.475 Test: blockdev write read block ...passed 00:38:12.475 Test: blockdev write zeroes read block ...passed 00:38:12.475 Test: blockdev write zeroes read no split ...passed 00:38:12.475 Test: blockdev write zeroes read split ...passed 00:38:12.475 Test: blockdev write zeroes read split partial ...passed 00:38:12.475 Test: blockdev reset ...passed 00:38:12.475 Test: blockdev write read 8 blocks ...passed 00:38:12.475 Test: blockdev write read size > 128k ...passed 00:38:12.475 Test: blockdev write read invalid size ...passed 00:38:12.475 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:38:12.475 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:38:12.475 Test: blockdev write read max offset ...passed 00:38:12.475 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:38:12.475 Test: blockdev writev readv 8 blocks ...passed 00:38:12.475 Test: blockdev writev readv 30 x 1block ...passed 00:38:12.475 Test: blockdev writev readv block ...passed 00:38:12.475 Test: blockdev writev readv size > 128k ...passed 00:38:12.475 Test: blockdev writev readv size > 128k in two iovs ...passed 00:38:12.475 Test: blockdev comparev and writev ...passed 00:38:12.475 Test: blockdev nvme passthru rw ...passed 00:38:12.475 Test: blockdev nvme passthru vendor specific ...passed 00:38:12.475 Test: blockdev nvme admin passthru ...passed 00:38:12.475 Test: 
blockdev copy ...passed 00:38:12.475 00:38:12.475 Run Summary: Type Total Ran Passed Failed Inactive 00:38:12.475 suites 4 4 n/a 0 0 00:38:12.475 tests 92 92 92 0 0 00:38:12.475 asserts 520 520 520 0 n/a 00:38:12.475 00:38:12.475 Elapsed time = 0.469 seconds 00:38:12.475 0 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1362278 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1362278 ']' 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1362278 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1362278 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1362278' 00:38:12.475 killing process with pid 1362278 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1362278 00:38:12.475 06:54:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1362278 00:38:12.735 06:54:26 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:38:12.735 00:38:12.735 real 0m3.529s 00:38:12.735 user 0m9.883s 00:38:12.735 sys 0m0.662s 00:38:12.735 06:54:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:12.735 06:54:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:38:12.735 ************************************ 00:38:12.735 END TEST bdev_bounds 00:38:12.735 ************************************ 00:38:12.735 06:54:26 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:38:12.735 06:54:26 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:38:12.735 06:54:26 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:12.735 06:54:26 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:12.995 ************************************ 00:38:12.995 START TEST bdev_nbd 00:38:12.995 ************************************ 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 
'crypto_ram2' 'crypto_ram3') 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1362908 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1362908 /var/tmp/spdk-nbd.sock 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1362908 ']' 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:38:12.995 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:38:12.995 06:54:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:38:12.995 [2024-07-25 06:54:26.395504] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:38:12.995 [2024-07-25 06:54:26.395562] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:12.995 
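The bdev_svc instance starting above listens on a dedicated RPC socket, /var/tmp/spdk-nbd.sock; the nbd test then exports each crypto bdev as a kernel /dev/nbdX node, sanity-reads one block through it with dd, and detaches it again, as the rpc.py calls further below show. A sketch of one such iteration (the test lets SPDK pick the nbd device, here one is passed explicitly, and the dd output path is arbitrary):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  $RPC nbd_start_disk crypto_ram /dev/nbd0                        # attach the bdev to an NBD node
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct    # read one 4 KiB block, O_DIRECT
  $RPC nbd_get_disks                                              # list bdev <-> /dev/nbdX mappings
  $RPC nbd_stop_disk /dev/nbd0                                    # detach again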
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:12.995 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:12.995 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:12.995 [2024-07-25 06:54:26.544466] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:13.255 [2024-07-25 06:54:26.588792] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:13.255 [2024-07-25 06:54:26.610033] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:38:13.255 [2024-07-25 06:54:26.618061] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:38:13.255 [2024-07-25 06:54:26.626080] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:38:13.255 [2024-07-25 06:54:26.738400] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:38:15.793 [2024-07-25 06:54:29.064536] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:38:15.793 [2024-07-25 06:54:29.064602] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:38:15.793 [2024-07-25 06:54:29.064616] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:15.793 [2024-07-25 06:54:29.072556] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:38:15.793 [2024-07-25 06:54:29.072573] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:38:15.793 [2024-07-25 06:54:29.072584] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:15.793 [2024-07-25 06:54:29.080574] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:38:15.793 [2024-07-25 06:54:29.080595] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:38:15.793 [2024-07-25 06:54:29.080608] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:15.793 [2024-07-25 06:54:29.088595] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:38:15.793 [2024-07-25 06:54:29.088615] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:38:15.793 [2024-07-25 06:54:29.088626] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:38:15.793 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:16.053 1+0 records in 00:38:16.053 1+0 records out 00:38:16.053 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297463 s, 13.8 MB/s 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:38:16.053 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:16.312 1+0 records in 00:38:16.312 1+0 records out 00:38:16.312 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277602 s, 14.8 MB/s 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:38:16.312 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:16.570 1+0 records in 00:38:16.570 1+0 records out 00:38:16.570 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304236 s, 13.5 MB/s 00:38:16.570 06:54:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.570 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:16.570 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.570 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:16.570 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:16.571 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:38:16.571 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:38:16.571 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:38:16.829 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd 
if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:16.830 1+0 records in 00:38:16.830 1+0 records out 00:38:16.830 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333118 s, 12.3 MB/s 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:38:16.830 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd0", 00:38:17.088 "bdev_name": "crypto_ram" 00:38:17.088 }, 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd1", 00:38:17.088 "bdev_name": "crypto_ram1" 00:38:17.088 }, 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd2", 00:38:17.088 "bdev_name": "crypto_ram2" 00:38:17.088 }, 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd3", 00:38:17.088 "bdev_name": "crypto_ram3" 00:38:17.088 } 00:38:17.088 ]' 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd0", 00:38:17.088 "bdev_name": "crypto_ram" 00:38:17.088 }, 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd1", 00:38:17.088 "bdev_name": "crypto_ram1" 00:38:17.088 }, 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd2", 00:38:17.088 "bdev_name": "crypto_ram2" 00:38:17.088 }, 00:38:17.088 { 00:38:17.088 "nbd_device": "/dev/nbd3", 00:38:17.088 "bdev_name": "crypto_ram3" 00:38:17.088 } 00:38:17.088 ]' 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:17.088 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:38:17.346 06:54:30 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:17.346 06:54:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:17.605 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:17.864 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:38:18.123 06:54:31 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:18.123 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
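The start/stop pass traced above repeats one readiness pattern per exported device: nbd_start_disk over the /var/tmp/spdk-nbd.sock RPC socket, a wait until the kernel lists the device in /proc/partitions, then a single 4 KiB O_DIRECT read to prove the device answers I/O; after nbd_stop_disk the matching exit check waits for the entry to disappear again. Below is a minimal bash sketch of that pattern, inferred only from the commands visible in the trace; the real waitfornbd/waitfornbd_exit helpers in SPDK's autotest_common.sh and nbd_common.sh may differ in retry timing and error handling.

#!/usr/bin/env bash
# Minimal sketch of the per-device readiness check seen in the trace above.
# Inferred from the traced commands only; the actual SPDK helpers may differ.

waitfornbd_sketch() {
    local nbd_name=$1 size i
    # Wait for the kernel to list the device in /proc/partitions.
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done
    # Prove the device is readable: pull one 4 KiB block with O_DIRECT into a
    # scratch file and require a non-empty result.
    for ((i = 1; i <= 20; i++)); do
        dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || { sleep 0.1; continue; }
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ] && return 0
    done
    return 1
}

waitfornbd_exit_sketch() {
    local nbd_name=$1 i
    # After nbd_stop_disk, wait for the entry to drop out of /proc/partitions.
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions || break
        sleep 0.1
    done
    return 0
}

As in the trace, usage would be along the lines of: rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0, then waitfornbd_sketch nbd0; and after nbd_stop_disk /dev/nbd0, waitfornbd_exit_sketch nbd0.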
00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:38:18.382 06:54:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:38:18.642 /dev/nbd0 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:18.642 1+0 records in 00:38:18.642 1+0 records out 00:38:18.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322687 s, 12.7 MB/s 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:38:18.642 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:38:18.901 /dev/nbd1 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:18.901 06:54:32 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:18.901 1+0 records in 00:38:18.901 1+0 records out 00:38:18.901 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021853 s, 18.7 MB/s 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:38:18.901 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:38:19.160 /dev/nbd10 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:19.160 1+0 records in 00:38:19.160 1+0 records out 00:38:19.160 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319039 s, 12.8 MB/s 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:19.160 06:54:32 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:38:19.160 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:38:19.420 /dev/nbd11 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:38:19.420 1+0 records in 00:38:19.420 1+0 records out 00:38:19.420 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031655 s, 12.9 MB/s 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:19.420 06:54:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd0", 00:38:19.681 "bdev_name": 
"crypto_ram" 00:38:19.681 }, 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd1", 00:38:19.681 "bdev_name": "crypto_ram1" 00:38:19.681 }, 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd10", 00:38:19.681 "bdev_name": "crypto_ram2" 00:38:19.681 }, 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd11", 00:38:19.681 "bdev_name": "crypto_ram3" 00:38:19.681 } 00:38:19.681 ]' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd0", 00:38:19.681 "bdev_name": "crypto_ram" 00:38:19.681 }, 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd1", 00:38:19.681 "bdev_name": "crypto_ram1" 00:38:19.681 }, 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd10", 00:38:19.681 "bdev_name": "crypto_ram2" 00:38:19.681 }, 00:38:19.681 { 00:38:19.681 "nbd_device": "/dev/nbd11", 00:38:19.681 "bdev_name": "crypto_ram3" 00:38:19.681 } 00:38:19.681 ]' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:38:19.681 /dev/nbd1 00:38:19.681 /dev/nbd10 00:38:19.681 /dev/nbd11' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:38:19.681 /dev/nbd1 00:38:19.681 /dev/nbd10 00:38:19.681 /dev/nbd11' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:38:19.681 256+0 records in 00:38:19.681 256+0 records out 00:38:19.681 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105623 s, 99.3 MB/s 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:38:19.681 256+0 records in 00:38:19.681 256+0 records out 00:38:19.681 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0759545 s, 13.8 MB/s 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:38:19.681 06:54:33 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:38:19.941 256+0 records in 00:38:19.941 256+0 records out 00:38:19.941 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0424922 s, 24.7 MB/s 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:38:19.941 256+0 records in 00:38:19.941 256+0 records out 00:38:19.941 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0526692 s, 19.9 MB/s 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:38:19.941 256+0 records in 00:38:19.941 256+0 records out 00:38:19.941 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0508894 s, 20.6 MB/s 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:19.941 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:20.200 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:20.458 06:54:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:20.716 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:20.975 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:38:21.233 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b 
malloc_lvol_verify 16 512 00:38:21.491 malloc_lvol_verify 00:38:21.491 06:54:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:38:21.750 461c8f21-e1ee-4f2c-88e4-1f879e29e8c6 00:38:21.750 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:38:22.009 505182d8-c0a2-44d4-b796-817eff02a9fb 00:38:22.009 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:38:22.009 /dev/nbd0 00:38:22.009 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:38:22.009 mke2fs 1.46.5 (30-Dec-2021) 00:38:22.009 Discarding device blocks: 0/4096 done 00:38:22.009 Creating filesystem with 4096 1k blocks and 1024 inodes 00:38:22.009 00:38:22.009 Allocating group tables: 0/1 done 00:38:22.009 Writing inode tables: 0/1 done 00:38:22.270 Creating journal (1024 blocks): done 00:38:22.270 Writing superblocks and filesystem accounting information: 0/1 done 00:38:22.270 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1362908 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1362908 ']' 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1362908 00:38:22.270 
06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:38:22.270 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1362908 00:38:22.528 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:38:22.528 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:38:22.528 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1362908' 00:38:22.528 killing process with pid 1362908 00:38:22.528 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1362908 00:38:22.528 06:54:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1362908 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:38:22.787 00:38:22.787 real 0m9.843s 00:38:22.787 user 0m12.485s 00:38:22.787 sys 0m4.047s 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:38:22.787 ************************************ 00:38:22.787 END TEST bdev_nbd 00:38:22.787 ************************************ 00:38:22.787 06:54:36 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:38:22.787 06:54:36 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:38:22.787 06:54:36 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:38:22.787 06:54:36 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:38:22.787 06:54:36 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:38:22.787 06:54:36 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:22.787 06:54:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:22.787 ************************************ 00:38:22.787 START TEST bdev_fio 00:38:22.787 ************************************ 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:38:22.787 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:22.787 06:54:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:38:23.046 ************************************ 00:38:23.046 START TEST bdev_fio_rw_verify 00:38:23.046 ************************************ 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
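For reference, the fio invocation being assembled here reduces to the sketch below. Every command-line flag and path is taken verbatim from the fio_params and LD_PRELOAD lines in the trace; the contents of bdev.fio are generated by fio_config_gen and are only partially visible above, so the job file shown here reproduces just the echoed pieces (the per-bdev job sections and serialize_overlap=1), with the randwrite workload inferred from the job lines fio prints at startup. Treat it as a sketch of this run, not the exact generated file.

#!/usr/bin/env bash
# Sketch of the bdev_fio_rw_verify invocation, assembled from the flags that
# appear verbatim in the trace. Paths are the workspace paths from this run;
# fio_config_gen writes additional verify-workload options into bdev.fio that
# are not visible above and are therefore omitted here.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

cat > "$SPDK/test/bdev/bdev.fio" <<'EOF'
[global]
; only the lines echoed in the trace are reproduced here
serialize_overlap=1
; inferred from the "rw=randwrite" job lines fio prints at startup
rw=randwrite

[job_crypto_ram]
filename=crypto_ram
[job_crypto_ram1]
filename=crypto_ram1
[job_crypto_ram2]
filename=crypto_ram2
[job_crypto_ram3]
filename=crypto_ram3
EOF

# Run fio through the preloaded SPDK bdev ioengine plugin, against the bdevs
# described in bdev.json, exactly as the trace does.
LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    "$SPDK/test/bdev/bdev.fio" \
    --verify_state_save=0 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
    --spdk_mem=0 \
    --aux-path="$SPDK/../output"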
00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:38:23.046 06:54:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:23.305 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:23.305 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:23.305 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:23.305 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:23.305 fio-3.35 00:38:23.305 Starting 4 threads 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.4 cannot be used 
00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:23.564 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:23.564 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:38.448 00:38:38.448 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1365401: Thu Jul 25 06:54:49 2024 00:38:38.448 read: IOPS=24.4k, BW=95.3MiB/s (99.9MB/s)(953MiB/10001msec) 00:38:38.448 slat (usec): min=15, max=1160, avg=57.01, stdev=22.97 00:38:38.448 clat (usec): min=17, max=1686, avg=322.61, stdev=176.11 00:38:38.448 lat (usec): min=33, max=1750, avg=379.63, stdev=183.57 00:38:38.448 clat percentiles (usec): 00:38:38.448 | 50.000th=[ 285], 99.000th=[ 848], 99.900th=[ 971], 99.990th=[ 1045], 00:38:38.448 | 99.999th=[ 1188] 00:38:38.448 write: IOPS=26.7k, BW=104MiB/s (110MB/s)(1017MiB/9736msec); 0 zone resets 00:38:38.448 slat (usec): min=26, max=391, avg=67.82, stdev=21.64 00:38:38.448 clat (usec): min=32, max=1585, avg=358.84, stdev=179.27 00:38:38.448 lat (usec): min=78, max=1783, avg=426.66, stdev=185.81 00:38:38.448 clat percentiles (usec): 
00:38:38.448 | 50.000th=[ 334], 99.000th=[ 857], 99.900th=[ 963], 99.990th=[ 1045], 00:38:38.448 | 99.999th=[ 1418] 00:38:38.448 bw ( KiB/s): min=90496, max=114456, per=97.69%, avg=104519.05, stdev=1727.85, samples=76 00:38:38.448 iops : min=22624, max=28614, avg=26129.74, stdev=431.96, samples=76 00:38:38.448 lat (usec) : 20=0.01%, 50=0.01%, 100=3.03%, 250=33.75%, 500=44.80% 00:38:38.448 lat (usec) : 750=15.82%, 1000=2.55% 00:38:38.448 lat (msec) : 2=0.04% 00:38:38.448 cpu : usr=99.63%, sys=0.00%, ctx=99, majf=0, minf=255 00:38:38.448 IO depths : 1=1.3%, 2=28.2%, 4=56.4%, 8=14.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:38:38.448 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:38:38.448 complete : 0=0.0%, 4=87.6%, 8=12.4%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:38:38.448 issued rwts: total=243938,260419,0,0 short=0,0,0,0 dropped=0,0,0,0 00:38:38.448 latency : target=0, window=0, percentile=100.00%, depth=8 00:38:38.448 00:38:38.448 Run status group 0 (all jobs): 00:38:38.448 READ: bw=95.3MiB/s (99.9MB/s), 95.3MiB/s-95.3MiB/s (99.9MB/s-99.9MB/s), io=953MiB (999MB), run=10001-10001msec 00:38:38.448 WRITE: bw=104MiB/s (110MB/s), 104MiB/s-104MiB/s (110MB/s-110MB/s), io=1017MiB (1067MB), run=9736-9736msec 00:38:38.448 00:38:38.448 real 0m13.568s 00:38:38.448 user 0m54.346s 00:38:38.448 sys 0m0.654s 00:38:38.448 06:54:49 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:38:38.449 ************************************ 00:38:38.449 END TEST bdev_fio_rw_verify 00:38:38.449 ************************************ 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:38:38.449 06:54:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0fb69492-45d0-5726-94f6-ac4a2f861355"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0fb69492-45d0-5726-94f6-ac4a2f861355",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "0640b008-aa8d-5e8f-9a51-89fa0e383c94"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0640b008-aa8d-5e8f-9a51-89fa0e383c94",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "dce68e69-bb75-5a52-adab-a0e1068f41fd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dce68e69-bb75-5a52-adab-a0e1068f41fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "9386ba65-9b5a-5697-8448-a104cb4e5629"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9386ba65-9b5a-5697-8448-a104cb4e5629",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:38:38.449 crypto_ram1 00:38:38.449 crypto_ram2 00:38:38.449 crypto_ram3 ]] 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "0fb69492-45d0-5726-94f6-ac4a2f861355"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0fb69492-45d0-5726-94f6-ac4a2f861355",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "0640b008-aa8d-5e8f-9a51-89fa0e383c94"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0640b008-aa8d-5e8f-9a51-89fa0e383c94",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "dce68e69-bb75-5a52-adab-a0e1068f41fd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dce68e69-bb75-5a52-adab-a0e1068f41fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "9386ba65-9b5a-5697-8448-a104cb4e5629"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9386ba65-9b5a-5697-8448-a104cb4e5629",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:38:38.449 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:38:38.450 ************************************ 00:38:38.450 START TEST bdev_fio_trim 00:38:38.450 ************************************ 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:38:38.450 06:54:50 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:38:38.450 06:54:50 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:38:38.450 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:38.450 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:38.450 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:38.450 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:38:38.450 fio-3.35 00:38:38.450 Starting 4 threads 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT 
devices 00:38:38.450 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:38.450 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:38.450 EAL: 
Requested device 0000:3f:02.7 cannot be used 00:38:50.658 00:38:50.658 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1367792: Thu Jul 25 06:55:03 2024 00:38:50.658 write: IOPS=35.1k, BW=137MiB/s (144MB/s)(1373MiB/10001msec); 0 zone resets 00:38:50.658 slat (usec): min=15, max=464, avg=66.96, stdev=37.78 00:38:50.658 clat (usec): min=30, max=1773, avg=239.14, stdev=144.55 00:38:50.658 lat (usec): min=45, max=1863, avg=306.10, stdev=167.45 00:38:50.658 clat percentiles (usec): 00:38:50.658 | 50.000th=[ 208], 99.000th=[ 725], 99.900th=[ 816], 99.990th=[ 873], 00:38:50.658 | 99.999th=[ 914] 00:38:50.658 bw ( KiB/s): min=126240, max=217681, per=100.00%, avg=141099.00, stdev=5515.89, samples=76 00:38:50.658 iops : min=31560, max=54419, avg=35274.68, stdev=1378.90, samples=76 00:38:50.658 trim: IOPS=35.1k, BW=137MiB/s (144MB/s)(1373MiB/10001msec); 0 zone resets 00:38:50.658 slat (usec): min=5, max=1307, avg=18.41, stdev= 8.31 00:38:50.658 clat (usec): min=45, max=1863, avg=306.26, stdev=167.46 00:38:50.658 lat (usec): min=51, max=1876, avg=324.66, stdev=170.84 00:38:50.658 clat percentiles (usec): 00:38:50.658 | 50.000th=[ 269], 99.000th=[ 865], 99.900th=[ 971], 99.990th=[ 1037], 00:38:50.658 | 99.999th=[ 1074] 00:38:50.658 bw ( KiB/s): min=126240, max=217681, per=100.00%, avg=141099.00, stdev=5515.88, samples=76 00:38:50.658 iops : min=31560, max=54419, avg=35274.68, stdev=1378.90, samples=76 00:38:50.658 lat (usec) : 50=0.77%, 100=7.84%, 250=45.09%, 500=36.72%, 750=7.80% 00:38:50.658 lat (usec) : 1000=1.76% 00:38:50.658 lat (msec) : 2=0.02% 00:38:50.658 cpu : usr=99.63%, sys=0.00%, ctx=98, majf=0, minf=106 00:38:50.658 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:38:50.658 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:38:50.658 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:38:50.658 issued rwts: total=0,351365,351365,0 short=0,0,0,0 dropped=0,0,0,0 00:38:50.658 latency : target=0, window=0, percentile=100.00%, depth=8 00:38:50.658 00:38:50.658 Run status group 0 (all jobs): 00:38:50.658 WRITE: bw=137MiB/s (144MB/s), 137MiB/s-137MiB/s (144MB/s-144MB/s), io=1373MiB (1439MB), run=10001-10001msec 00:38:50.658 TRIM: bw=137MiB/s (144MB/s), 137MiB/s-137MiB/s (144MB/s-144MB/s), io=1373MiB (1439MB), run=10001-10001msec 00:38:50.658 00:38:50.658 real 0m13.597s 00:38:50.658 user 0m54.325s 00:38:50.658 sys 0m0.673s 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:38:50.658 ************************************ 00:38:50.658 END TEST bdev_fio_trim 00:38:50.658 ************************************ 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:38:50.658 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:38:50.658 00:38:50.658 real 0m27.529s 00:38:50.658 user 1m48.856s 00:38:50.658 sys 0m1.529s 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:38:50.658 ************************************ 00:38:50.658 END TEST bdev_fio 00:38:50.658 ************************************ 00:38:50.658 06:55:03 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:38:50.658 06:55:03 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:38:50.658 06:55:03 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:38:50.658 06:55:03 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:50.658 06:55:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:50.658 ************************************ 00:38:50.658 START TEST bdev_verify 00:38:50.658 ************************************ 00:38:50.658 06:55:03 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:38:50.658 [2024-07-25 06:55:03.915756] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:38:50.658 [2024-07-25 06:55:03.915808] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1369464 ] 00:38:50.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.658 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:50.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.658 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:50.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.658 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:50.658 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.658 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:50.659 qat_pci_device_allocate(): 
Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:50.659 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:50.659 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:50.659 [2024-07-25 06:55:04.048418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:38:50.659 [2024-07-25 06:55:04.093201] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:50.659 [2024-07-25 06:55:04.093207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:50.659 [2024-07-25 06:55:04.114538] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:38:50.659 [2024-07-25 06:55:04.122566] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:38:50.659 [2024-07-25 06:55:04.130588] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:38:50.918 [2024-07-25 06:55:04.233424] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:38:53.456 [2024-07-25 
06:55:06.545995] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:38:53.456 [2024-07-25 06:55:06.546056] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:38:53.456 [2024-07-25 06:55:06.546069] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:53.456 [2024-07-25 06:55:06.554002] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:38:53.456 [2024-07-25 06:55:06.554020] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:38:53.456 [2024-07-25 06:55:06.554031] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:53.456 [2024-07-25 06:55:06.562024] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:38:53.456 [2024-07-25 06:55:06.562041] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:38:53.456 [2024-07-25 06:55:06.562051] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:53.457 [2024-07-25 06:55:06.570048] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:38:53.457 [2024-07-25 06:55:06.570064] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:38:53.457 [2024-07-25 06:55:06.570074] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:38:53.457 Running I/O for 5 seconds... 00:38:58.770 00:38:58.770 Latency(us) 00:38:58.770 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:58.770 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x0 length 0x1000 00:38:58.770 crypto_ram : 5.06 540.36 2.11 0.00 0.00 235574.41 3224.37 169449.88 00:38:58.770 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x1000 length 0x1000 00:38:58.770 crypto_ram : 5.07 544.63 2.13 0.00 0.00 233795.01 3093.30 168611.02 00:38:58.770 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x0 length 0x1000 00:38:58.770 crypto_ram1 : 5.07 544.89 2.13 0.00 0.00 233291.57 2516.58 157705.83 00:38:58.770 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x1000 length 0x1000 00:38:58.770 crypto_ram1 : 5.07 549.16 2.15 0.00 0.00 231552.50 2988.44 156866.97 00:38:58.770 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x0 length 0x1000 00:38:58.770 crypto_ram2 : 5.05 4246.58 16.59 0.00 0.00 29847.40 2411.72 27053.26 00:38:58.770 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x1000 length 0x1000 00:38:58.770 crypto_ram2 : 5.05 4261.63 16.65 0.00 0.00 29753.63 6474.96 27262.98 00:38:58.770 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x0 length 0x1000 00:38:58.770 crypto_ram3 : 5.05 4255.44 16.62 0.00 0.00 29711.56 1644.95 27053.26 00:38:58.770 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:38:58.770 Verification LBA range: start 0x1000 length 0x1000 00:38:58.770 
crypto_ram3 : 5.06 4276.02 16.70 0.00 0.00 29563.97 3211.26 27053.26 00:38:58.770 =================================================================================================================== 00:38:58.770 Total : 19218.70 75.07 0.00 0.00 52888.47 1644.95 169449.88 00:38:58.770 00:38:58.770 real 0m8.182s 00:38:58.770 user 0m15.486s 00:38:58.770 sys 0m0.488s 00:38:58.770 06:55:12 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:38:58.770 06:55:12 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:38:58.770 ************************************ 00:38:58.770 END TEST bdev_verify 00:38:58.770 ************************************ 00:38:58.770 06:55:12 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:38:58.770 06:55:12 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:38:58.770 06:55:12 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:38:58.770 06:55:12 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:38:58.770 ************************************ 00:38:58.770 START TEST bdev_verify_big_io 00:38:58.770 ************************************ 00:38:58.770 06:55:12 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:38:58.770 [2024-07-25 06:55:12.184783] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
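(The verify run summarized above and the big-I/O variant starting below both exercise the same crypto bdevs with the bdevperf example app. A sketch of the equivalent manual invocation, with the flags copied from the log: 128-deep queue, 4 KiB I/O, 5-second run, core mask 0x3; the big-I/O pass only swaps -o 4096 for -o 65536.)

# Sketch: bdevperf verify pass against the generated crypto bdev config.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/examples/bdevperf --json $SPDK/test/bdev/bdev.json \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3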
00:38:58.770 [2024-07-25 06:55:12.184837] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1370818 ] 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.0 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.1 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.2 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.3 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.4 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.5 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.6 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:01.7 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.0 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.1 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.2 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.3 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.4 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.5 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.6 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3d:02.7 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.0 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.1 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.2 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.3 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.4 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.5 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.6 cannot be used 00:38:58.770 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:01.7 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.0 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.1 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.2 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.3 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.4 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.5 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.6 cannot be used 00:38:58.770 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:38:58.770 EAL: Requested device 0000:3f:02.7 cannot be used 00:38:58.770 [2024-07-25 06:55:12.318170] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:38:59.029 [2024-07-25 06:55:12.363739] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:59.029 [2024-07-25 06:55:12.363745] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:59.029 [2024-07-25 06:55:12.385205] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:38:59.030 [2024-07-25 06:55:12.393234] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:38:59.030 [2024-07-25 06:55:12.401256] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:38:59.030 [2024-07-25 06:55:12.504952] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:39:01.564 [2024-07-25 06:55:14.820548] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:39:01.564 [2024-07-25 06:55:14.820605] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:39:01.564 [2024-07-25 06:55:14.820619] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:01.564 [2024-07-25 06:55:14.828565] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:39:01.564 [2024-07-25 06:55:14.828583] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:39:01.564 [2024-07-25 06:55:14.828593] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:01.564 [2024-07-25 06:55:14.836588] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:39:01.564 [2024-07-25 06:55:14.836609] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:39:01.564 [2024-07-25 06:55:14.836620] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:01.564 [2024-07-25 06:55:14.844611] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:39:01.564 [2024-07-25 06:55:14.844626] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc3 00:39:01.564 [2024-07-25 06:55:14.844637] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:01.564 Running I/O for 5 seconds... 00:39:02.131 [2024-07-25 06:55:15.681445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.681832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.681904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.681948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.681987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.682025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.682435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.682455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.685636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.685680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.685723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.685761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.686260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.686301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.686339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.131 [2024-07-25 06:55:15.686377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.393 [2024-07-25 06:55:15.686774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.393 [2024-07-25 06:55:15.686790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.393 [2024-07-25 06:55:15.689805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.393 [2024-07-25 06:55:15.689847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.393 [2024-07-25 06:55:15.689884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.393 [2024-07-25 06:55:15.689921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.393 accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (same allocation error logged in repeated bursts through 06:55:15.728 while the big-I/O verify workload keeps 128 requests in flight on the four crypto bdevs)
00:39:02.394 [2024-07-25 06:55:15.731354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.731977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.732396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.732414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.735920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.736328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.736349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.394 [2024-07-25 06:55:15.739647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.739763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.740163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.740180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.743825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.744177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.744193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.747740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.394 [2024-07-25 06:55:15.748166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.748182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.394 [2024-07-25 06:55:15.751073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.751710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.752113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.752129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.754908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.754954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.754991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.755995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.758736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.758780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.395 [2024-07-25 06:55:15.758818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.758856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.759270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.759312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.759351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.759388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.759701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.759716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.762551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.762594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.762632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.762669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.763067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.763121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.763180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.763230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.763616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.763632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.766545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.766593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.766630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.766688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.767099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.767150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.395 [2024-07-25 06:55:15.767200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.767243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.767611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.767626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.770541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.770597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.770636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.770673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.771053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.771094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.771132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.771174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.771524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.771539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.774910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.775343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.775359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.395 [2024-07-25 06:55:15.777968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.778698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.779093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.395 [2024-07-25 06:55:15.779109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.781651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.781693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.781730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.781767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.782195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.782236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.782277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.782315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.782685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.782700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.396 [2024-07-25 06:55:15.785841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.785959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.786417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.786433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.789867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.790259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.790275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.791881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.791929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.791967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.792005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.792286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.792325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.792361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.792398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.396 [2024-07-25 06:55:15.792623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.792638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.794548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.794591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.794629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.794667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.795107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.795153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.795210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.795248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.795693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.795708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.797975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.798012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.798245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.798261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.799998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.800039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.396 [2024-07-25 06:55:15.800075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.800108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.800532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.800573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.800612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.800650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.801011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.801026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.804463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.805290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.806707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.808238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.809944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.810324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.810681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.811036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.811450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.811466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.814570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.815841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.817087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.818565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.819722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.820085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.396 [2024-07-25 06:55:15.820444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.820803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.821201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.396 [2024-07-25 06:55:15.821218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.823509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.824776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.826249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.827725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.828380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.828740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.829097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.829460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.829738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.829753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.832875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.834435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.835916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.837216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.837955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.838317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.838673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.839565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.839883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.839898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.397 [2024-07-25 06:55:15.842708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.844184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.845654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.846038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.846815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.847178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.847556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.848905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.849143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.849158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.852254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.853725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.854656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.855026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.855783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.856145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.857410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.858647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.858879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.858893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.862095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.863587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.863945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.864308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.397 [2024-07-25 06:55:15.865017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.865842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.867092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.868559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.868791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.868806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.871882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.872388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.872751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.873108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.873854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.875256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.876748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.878228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.878462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.878476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.881049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.881427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.881786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.882148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.883906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.885174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.886635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.888119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.397 [2024-07-25 06:55:15.888472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.888488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.890308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.890669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.891026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.891389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.892894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.894359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.895829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.896642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.896877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.896891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.898870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.899237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.899598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.900556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.902300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.397 [2024-07-25 06:55:15.903784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.398 [2024-07-25 06:55:15.904987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.398 [2024-07-25 06:55:15.906459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.398 [2024-07-25 06:55:15.906728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.398 [2024-07-25 06:55:15.906743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.398 [2024-07-25 06:55:15.908887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.398 [2024-07-25 06:55:15.909251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.398 [2024-07-25 06:55:15.909796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:02.398 ... [the line above repeats continuously with successive timestamps from 06:55:15.909796 through 06:55:16.194308; only the timestamps differ] ...
00:39:02.666 [2024-07-25 06:55:16.194308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:02.666 [2024-07-25 06:55:16.194345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.194574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.194589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.196381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.196422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.196465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.196503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.196924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.196967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.197011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.197049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.197088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.197482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.197498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.198857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.198897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.198933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.198978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.199318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.199367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.199405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.666 [2024-07-25 06:55:16.199442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.199479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.199748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.667 [2024-07-25 06:55:16.199762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.201994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.202427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.202443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.203989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.204029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.205505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.205558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.205870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.205917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.205954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.205991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.206027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.206298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.206313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.207874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.667 [2024-07-25 06:55:16.207914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.207953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.208315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.208707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.208752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.208791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.208829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.208866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.209284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.209299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.211381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.667 [2024-07-25 06:55:16.212676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.214153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.215634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.215866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.216238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.216608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.216966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.217328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.217560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.217575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.927 [2024-07-25 06:55:16.220388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.221723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.223196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.928 [2024-07-25 06:55:16.224756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.225122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.225495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.225851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.226209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.227433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.227707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.227722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.230375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.231843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.233316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.234022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.234496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.234859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.235219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.235931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.237183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.237414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.237429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.240326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.241806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.243114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.243477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.243881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.928 [2024-07-25 06:55:16.244242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.244599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.246166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.247621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.247857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.247872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.250793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.252278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.252644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.253002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.253384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.253744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.254853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.256109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.257582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.257814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.257828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.260784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.261529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.261886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.262246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.262672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.263345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.264584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.928 [2024-07-25 06:55:16.266035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.267506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.267743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.267758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.270430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.270792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.271151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.271511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.271920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.273520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.275011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.276614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.278082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.278409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.278424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.280092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.280459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.280815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.281176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.281407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.282672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.284133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.285610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.286387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.928 [2024-07-25 06:55:16.286620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.286635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.288393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.288752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.289108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.290232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.290501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.291996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.293468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.294465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.928 [2024-07-25 06:55:16.296064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.296303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.296318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.298287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.298646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.299328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.300587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.300819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.302409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.303837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.305020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.306264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.306498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.306514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.929 [2024-07-25 06:55:16.308681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.309042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.310506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.312055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.312294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.313771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.314551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.315807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.317284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.317517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.317532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.319850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.321192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.322449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.323925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.324160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.324972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.326361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.327848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.329325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.329558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.329573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.332739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.333999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.929 [2024-07-25 06:55:16.335472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.336953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.337313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.338853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.340290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.341839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.343400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.343749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.343765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.347100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.348575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.350039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.350972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.351210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.352465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.353931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.355405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.355873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.356291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.356307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.359786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.361274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.362606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.363928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.929 [2024-07-25 06:55:16.364223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.365719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.367204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.367977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.368340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.368757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.368773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.372227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.373713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.374491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.375752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.375986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.377484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.378816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.379177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.379537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.379959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.379974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.383079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.383847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.385221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.386698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.386932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.388410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.929 [2024-07-25 06:55:16.388774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.389132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.929 [2024-07-25 06:55:16.389491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.389909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.389924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.392669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.393959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.395207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.396675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.396910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.397789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.398159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.398517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.398880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.399208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.399224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.401310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.402555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.404032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.405513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.405804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.406178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.406534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.406913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.930 [2024-07-25 06:55:16.407483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.407714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.407729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.410611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.412145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.413621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.414900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.415323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.415688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.416045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.416405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.417981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.418219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.418235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.420890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.422371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.423883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.424252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.424670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.425032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.425398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.426651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.427897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.428130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.930 [2024-07-25 06:55:16.428148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.431010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.432489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.433132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.433494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.433883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.434249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.434980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.436233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.437712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.437944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.437959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.440902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.442032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.442393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.442751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.443228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.443591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.445046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.446600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.448079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.448314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.448329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:02.930 [2024-07-25 06:55:16.451360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:02.930 [2024-07-25 06:55:16.451729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:03.263 [2024-07-25 06:55:16.648118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:03.263 [2024-07-25 06:55:16.648134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.651080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.652346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.652709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.653070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.653484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.653851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.655277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.656592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.658073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.658310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.658325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.661319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.661686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.662044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.662404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.662811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.663861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.665114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.666577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.668044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.668374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.668390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.670617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.263 [2024-07-25 06:55:16.670988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.671353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.671712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.672079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.673458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.674930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.676407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.677648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.677926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.677941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.679732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.680094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.680470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.681007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.681243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.682618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.684125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.685740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.686780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.687061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.687076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.688878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.689245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.689605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.263 [2024-07-25 06:55:16.690971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.263 [2024-07-25 06:55:16.691245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.692732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.694205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.694972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.696316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.696549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.696563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.698753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.699114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.700284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.701519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.701753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.703242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.704245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.705835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.707404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.707639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.707653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.709993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.710606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.711849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.713332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.713565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.264 [2024-07-25 06:55:16.715109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.716222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.717469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.718958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.719196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.719212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.721774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.723073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.724527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.726007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.726243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.727134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.728375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.729853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.731334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.731625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.731640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.735171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.736421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.737910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.739406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.739822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.741161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.742634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.264 [2024-07-25 06:55:16.744116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.745304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.745721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.745736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.749025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.750486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.751954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.752413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.752648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.753898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.755361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.756835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.757204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.757601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.757621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.761096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.762578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.763852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.765218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.765491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.766991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.768471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.769212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.769574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.264 [2024-07-25 06:55:16.769978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.769992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.773406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.774934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.776050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.777298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.777532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.779035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.780035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.780412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.780776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.781219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.781235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.784258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.785009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.786317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.787780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.264 [2024-07-25 06:55:16.788014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.789534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.789900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.790267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.790623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.791041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.791057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.265 [2024-07-25 06:55:16.793528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.795112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.796592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.798173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.798407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.798926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.799290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.799649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.800008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.800316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.800331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.802500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.803755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.805239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.806705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.806999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.807373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.807730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.808086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.808633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.808866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.265 [2024-07-25 06:55:16.808881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.811638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.813121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.528 [2024-07-25 06:55:16.814585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.815754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.816102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.816475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.816835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.817197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.818748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.819023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.819038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.821841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.822589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.822948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.823309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.823739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.824353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.825592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.827099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.828578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.828812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.828827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.831675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.832044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.832410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.832768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.528 [2024-07-25 06:55:16.833172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.833538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.833902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.834266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.834623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.835041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.835061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.837599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.837964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.528 [2024-07-25 06:55:16.838336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.838700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.839108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.839476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.839833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.840195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.840563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.840877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.840893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.843877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.844249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.844609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.844966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.845379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.845746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.529 [2024-07-25 06:55:16.846109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.846475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.846837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.847188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.847204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.849610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.849969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.850330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.850696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.851089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.851463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.851821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.852188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.852548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.852942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.852957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.855554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.855923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.856291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.856651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.856985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.857358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.857718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.858080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.529 [2024-07-25 06:55:16.858443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.858836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.858852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.861271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.861637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.862000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.862367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.862713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.863079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.863444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.863800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.864164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.864549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.864565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.867089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.867460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.867822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.868199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.868598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.868959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.869325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.869688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.870064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.870500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.529 [2024-07-25 06:55:16.870517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.872954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.873322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.873679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.874036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.874383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.874751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.875113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.875478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.875837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.876244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.876261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.878740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.879106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.879156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.879521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.879883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.880256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.880619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.880980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.881346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.881710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.881726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.884304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.529 [2024-07-25 06:55:16.884670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.885032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.529 [2024-07-25 06:55:16.885077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.885478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.885842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.886212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.886571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.886953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.887391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.887407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.889603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.889659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.889709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.889758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.890713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.892769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.892811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.892849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.530 [2024-07-25 06:55:16.892887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.893870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.896845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.897195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.897211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.530 [2024-07-25 06:55:16.899844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.899971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.900406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.900423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.902556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.902600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.902639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.902677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.903628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.905659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.905706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.905747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.905786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.906205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.906251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.906291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.530 [2024-07-25 06:55:16.906330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.906369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.906774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.906790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.908961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.909988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.911704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.911746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.911790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.530 [2024-07-25 06:55:16.911828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.912229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.912276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.912316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.912359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.912410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.531 [2024-07-25 06:55:16.912839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.912861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.915715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.916122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.916144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.917555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.917596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.917632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.917669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.917949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.918000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.918038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.918080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.918117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.918355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.918370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.531 [2024-07-25 06:55:16.919860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.919903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.919940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.919978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.920918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.922638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.922678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.922718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.922755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.922985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.923035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.923072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.923109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.923152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.923471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.923486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.924909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.924950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.531 [2024-07-25 06:55:16.924987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.925945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.927817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.927857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.927894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.927931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.928570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.531 [2024-07-25 06:55:16.930441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.531 [2024-07-25 06:55:16.930952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.933928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.532 [2024-07-25 06:55:16.935872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.935959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.936194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.936209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.938984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.939020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.939059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.939332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.939348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.940850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.940897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.940943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.940981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.941217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.941259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.941306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.941344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.532 [2024-07-25 06:55:16.941381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.941609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.941624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.943565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.943606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.943647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.943689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.944523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.946819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.532 [2024-07-25 06:55:16.946833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.948572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.948614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.948653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.948690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.949632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.951741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.532 [2024-07-25 06:55:16.952008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.952023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.953533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.533 [2024-07-25 06:55:16.953575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.953616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.953655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.954652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.956805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.957192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.957207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.958556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.958605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.958642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.533 [2024-07-25 06:55:16.958679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.959617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.961985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.962047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.962280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.962295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.963848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.963908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.963952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.963989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.964226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.533 [2024-07-25 06:55:16.964276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.964315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.964352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.964390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.964759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.964774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.966913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.966953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.966993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.967689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.969176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.533 [2024-07-25 06:55:16.969226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.534 [2024-07-25 06:55:16.969652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.969958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.972129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.972180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.973779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.973822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.974474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.976014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.976054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.976090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.977541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.977811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.977862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.977901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.977939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.977989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.534 [2024-07-25 06:55:16.978393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.978410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.981653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.983118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.984587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.985402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.985670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.987255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.988723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.990043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.990410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.990821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.990842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.994101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.995581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.996384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.997772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.998006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:16.999509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.001005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.001375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.001734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.002055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.002071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.534 [2024-07-25 06:55:17.005288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.006550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.007943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.009100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.009339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.010838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.011699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.012064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.012428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.012874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.012891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.015893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.016659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.017915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.019385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.019619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.021091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.021456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.021817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.022185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.022592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.022610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.025061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.026671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.534 [2024-07-25 06:55:17.028170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.029770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.030004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.030565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.030926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.031290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.031649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.031953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.031968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.034327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.035589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.534 [2024-07-25 06:55:17.037057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.038534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.038874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.039250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.039612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.039976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.040613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.040846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.040862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.043578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.045060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.046547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.535 [2024-07-25 06:55:17.047577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.535 [2024-07-25 06:55:17.047937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:03.535 [... the same "Failed to get src_mbufs!" *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats continuously (several hundred occurrences) for timestamps 2024-07-25 06:55:17.048317 through 06:55:17.336685; identical lines omitted ...]
00:39:03.804 [2024-07-25 06:55:17.336731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:03.804 [2024-07-25 06:55:17.336768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.336812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.337467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.340821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.344715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.344761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.344798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.344839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:03.804 [2024-07-25 06:55:17.345243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.345288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.345326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.345364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.345413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.345828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.345844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.349994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:03.804 [2024-07-25 06:55:17.350008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.066 [2024-07-25 06:55:17.354607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.354925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.359747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.360175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.360191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.363997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.364038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.066 [2024-07-25 06:55:17.364075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.364311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.364326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.367977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.066 [2024-07-25 06:55:17.368022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.368498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.368545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.368583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.368855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.372595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.372640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.372677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.372720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.372983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.373021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.373061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.373108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.373340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.375957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.067 [2024-07-25 06:55:17.376419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.376683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.380576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.380622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.380670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.381967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.383400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.383441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.383478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.383524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.067 [2024-07-25 06:55:17.384706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.384938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.387110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.387479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.388981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.390367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.390600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.390616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.392106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.392769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.394093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.395559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.395791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.398049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.399283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.400530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.401979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.402218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.402234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.403145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.404611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.406173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.407638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.407872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.067 [2024-07-25 06:55:17.411084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.412323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.413800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.415270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.415617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.415632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.417227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.418783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.420330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.421752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.422130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.425476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.426960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.428424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.429190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.429422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.429437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.067 [2024-07-25 06:55:17.430940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.432549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.434007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.434369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.434762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.438098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.439581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.068 [2024-07-25 06:55:17.440354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.441589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.441825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.441840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.443371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.444961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.445324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.445681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.446091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.449242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.450098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.451565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.453111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.453348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.453366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.454840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.455210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.455567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.455923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.456324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.458937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.460415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.461790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.463290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.068 [2024-07-25 06:55:17.463524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.463539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.464151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.464510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.464876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.465235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.465514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.468101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.469364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.470838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.472319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.472668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.472683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.473048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.473408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.473764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.474858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.475154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.477865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.479340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.480814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.481456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.481887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.481902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.068 [2024-07-25 06:55:17.482268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.482626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.483401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.484651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.484883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.487833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.489313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.490331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.490699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.491100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.491115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.491482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.491901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.493218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.494694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.494929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.497924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.499186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.499547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.499903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.500317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.500334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.500695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.502201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.068 [2024-07-25 06:55:17.503791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.505272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.505504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.508497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.508862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.509222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.509578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.068 [2024-07-25 06:55:17.509986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.510001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.511446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.512786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.514255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.515843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.516196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.517918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.518284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.518641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.518997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.519234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.519249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.520498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.521972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.523432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.524198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.069 [2024-07-25 06:55:17.524432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.526319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.526680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.527036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.528400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.528677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.528691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.530174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.531657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.532417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.533750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.533982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.536146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.536507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.537588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.538823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.539057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.539072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.540332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.541734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.543036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.544385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.544790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.547183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.069 [2024-07-25 06:55:17.547547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.548994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.550554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.550786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.550806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.552273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.553077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.554321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.555797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.556031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.558399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.559749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.560990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.562501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.562733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.562748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.563472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.565071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.566568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.568172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.568404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.570897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.571269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.069 [2024-07-25 06:55:17.571628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.069 [2024-07-25 06:55:17.571986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.338 [... same error from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated several hundred times between 06:55:17.571986 and 06:55:17.797003 ...] 
00:39:04.338 [2024-07-25 06:55:17.797003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.338 [2024-07-25 06:55:17.797857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.799071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.799308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.799323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.801440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.801802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.803054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.804307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.804541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.806024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.806917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.808407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.810014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.810251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.810266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.812550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.813314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.814569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.816041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.816278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.817660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.818924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.820174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.821640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.338 [2024-07-25 06:55:17.821872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.821886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.824309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.825736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.827243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.828726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.338 [2024-07-25 06:55:17.828960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.829778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.831028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.832501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.833984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.834261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.834276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.837908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.839187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.840658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.842151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.842518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.843864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.845341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.846816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.848014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.848407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.848422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.339 [2024-07-25 06:55:17.851732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.853215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.854692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.855459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.855692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.857059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.858551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.860159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.860520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.860932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.860947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.864230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.865693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.866804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.868313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.868618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.870097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.871570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.872210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.872568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.872939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.872954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.876239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.877843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.339 [2024-07-25 06:55:17.878883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.880136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.880372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.881887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.882936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.883302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.883660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.884078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.339 [2024-07-25 06:55:17.884093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.602 [2024-07-25 06:55:17.886456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.602 [2024-07-25 06:55:17.887935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.889538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.891033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.891444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.891807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.892171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.892530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.893963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.894232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.894247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.896900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.898387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.899858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.900305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.603 [2024-07-25 06:55:17.900752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.901113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.901474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.902404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.903656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.903888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.903902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.906938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.908416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.909657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.910016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.910427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.910789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.911149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.911507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.911871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.912297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.912313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.914747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.915111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.915473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.915830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.916151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.916519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.603 [2024-07-25 06:55:17.916877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.917236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.917596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.918003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.918018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.920569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.920932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.921301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.921672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.922093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.922459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.922815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.923177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.923538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.923908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.923923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.926411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.926773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.927130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.927494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.927896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.928276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.928640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.928999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.603 [2024-07-25 06:55:17.929359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.929784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.929801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.932205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.932570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.932929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.933298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.933699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.934062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.934421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.934781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.935148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.935523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.935538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.938307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.938679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.939037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.939397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.939803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.940174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.940537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.940898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.603 [2024-07-25 06:55:17.941262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.941691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.604 [2024-07-25 06:55:17.941706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.944135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.944505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.944866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.945254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.945579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.945941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.946301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.946659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.947017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.947358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.947378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.949928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.950300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.950660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.951015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.951462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.951826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.952194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.952558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.952915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.953261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.953276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.955719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.604 [2024-07-25 06:55:17.956079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.956438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.956797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.957123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.957494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.957850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.958211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.958569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.958894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.958909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.961423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.961792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.962162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.962520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.962868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.963236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.963595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.963961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.964325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.964718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.964733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.967050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.967416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.967789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.604 [2024-07-25 06:55:17.968150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.968526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.968894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.969257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.969612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.969966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.970338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.970354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.972832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.973199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.973560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.973919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.974332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.974693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.975048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.975417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.975779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.976213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.976229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.978892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.980352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.980717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.981073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.981462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.604 [2024-07-25 06:55:17.981827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.982189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.982551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.982907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.983311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.983327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.985684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.986055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.986416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.986459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.986852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.987226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.987895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.988960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.989325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.989623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.604 [2024-07-25 06:55:17.989637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:17.992927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:17.994402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:17.995579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:17.997037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:17.997310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:17.998822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.000302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.605 [2024-07-25 06:55:18.000912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.001273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.001649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.001665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.004876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.006488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.006534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.007767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.008043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.009530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.011009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.011922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.013540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.013967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.013983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.016096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.017693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.017734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.019241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.019476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.020960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.021803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.021846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.023096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.605 [2024-07-25 06:55:18.023334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.023349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.025129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.025493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.025535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.025891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.026123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.026172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.027513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.027555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.029161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.029395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.029409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.030966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.032564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.032609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.033698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.034019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.034069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.034431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.034471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.035846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.036234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [2024-07-25 06:55:18.036250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.605 [2024-07-25 06:55:18.037918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.605 [... the same "*ERROR*: Failed to get src_mbufs!" message from accel_dpdk_cryptodev.c:468:accel_dpdk_cryptodev_task_alloc_resources repeats for every subsequent allocation attempt through 2024-07-25 06:55:18.285557 ...] 00:39:04.873 [2024-07-25 06:55:18.285557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:04.873 [2024-07-25 06:55:18.285795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.285809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.288201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.288571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.288942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.289307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.289669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.290029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.290393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.290756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.292131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.292533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.292548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.295387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.295751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.296108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.296471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.296850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.297396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.298562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.298919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.299975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.300297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.300312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.873 [2024-07-25 06:55:18.302728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.303090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.303450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.303809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.304156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.305474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.305888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.306247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.307669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.308092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.308107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.311218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.311601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.312519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.313343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.313759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.314613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.315497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.315853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.316214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.316532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.316546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.319157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.319527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.873 [2024-07-25 06:55:18.321105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.321464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.321853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.323469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.323830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.324193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.324560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.324960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.324975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.328543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.328906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.330174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.330645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.331053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.331422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.331785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.332158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.332512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.332948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.332964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.335132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.335603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.336864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.337224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.873 [2024-07-25 06:55:18.337623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.337991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.338358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.338715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.339070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.339475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.339491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.343190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.343554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.343914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.344279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.344718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.345081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.345444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.873 [2024-07-25 06:55:18.345804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.346171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.346550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.346564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.348687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.349054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.349424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.349786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.350185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.350547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.874 [2024-07-25 06:55:18.350907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.351269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.351635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.351874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.351888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.355567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.355932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.356293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.356649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.357053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.357425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.357840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.359165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.359521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.359880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.359894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.362191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.362552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.362908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.363267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.363574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.363940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.364981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.365679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.874 [2024-07-25 06:55:18.366034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.366275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.366290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.369809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.370177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.370536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.370893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.371256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.371677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.372996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.373357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.374233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.374470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.374484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.377816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.378183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.379186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.379230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.379578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.379945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.380309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.380670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.382285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.382726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.874 [2024-07-25 06:55:18.382741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.386455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.387938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.388787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.390035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.390272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.391764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.393044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.394267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.394784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.395195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.395211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.398198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.399673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.399724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.401201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.401557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.403007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.404536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.874 [2024-07-25 06:55:18.406030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.407294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.407589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.407604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.411312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:04.875 [2024-07-25 06:55:18.412558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.412600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.414067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.414302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.415192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.416645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.416699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.418178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.418410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.418425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.420144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.420773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.420814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.421769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.422175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.422223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.423260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.423301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:04.875 [2024-07-25 06:55:18.424546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.424779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.424798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.428936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.429613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.429656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.137 [2024-07-25 06:55:18.430887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.431299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.431347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.431967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.432007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.432968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.433381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.433397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.434863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.436293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.436334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.437713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.437981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.438031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.439509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.439550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.439588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.439819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.439833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.442683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.442729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.442769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.442808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.443136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.137 [2024-07-25 06:55:18.443186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.443224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.443262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.443303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.443567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.443581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.445823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.137 [2024-07-25 06:55:18.449976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.138 [2024-07-25 06:55:18.450015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.450054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.450327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.450342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.451703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.451744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.451784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.451822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.452476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.457721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.138 [2024-07-25 06:55:18.458129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.458148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.459550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.459591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.459631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.459669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.459971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.460020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.460063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.460104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.460146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.460379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.460393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.464828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.465182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.465197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.138 [2024-07-25 06:55:18.466850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.466891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.466928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.466966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.467716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.472947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.474724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.138 [2024-07-25 06:55:18.474766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.138 [2024-07-25 06:55:18.474806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:05.138 (message repeated for each subsequent failed allocation attempt, timestamps 06:55:18.474 through 06:55:18.836)
00:39:05.406 [2024-07-25 06:55:18.836136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:05.406 [2024-07-25 06:55:18.839598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.839965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.840972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.841691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.841927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.842818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.844372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.844739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.845204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.845436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.845451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.848520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.849963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.850327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.850690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.851057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.851708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.852793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.853154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.854258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.854562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.854577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.859128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.859802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.406 [2024-07-25 06:55:18.860166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.860218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.406 [2024-07-25 06:55:18.860538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.860905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.862474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.862832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.863485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.863719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.863736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.867865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.869111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.870594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.872071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.872328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.873375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.874056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.874418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.875949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.876363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.876378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.881173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.882739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.882785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.884256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.407 [2024-07-25 06:55:18.884491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.885323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.886220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.886578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.887897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.888275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.888291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.891622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.893068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.893115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.894593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.894827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.896378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.897225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.897268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.897996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.898408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.898425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.902632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.904115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.904163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.904912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.905151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.905204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.407 [2024-07-25 06:55:18.906802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.906848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.908327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.908565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.908581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.912563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.913820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.913862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.915156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.915391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.915442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.916918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.916960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.917713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.917951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.917967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.921869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.922779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.922822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.923494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.923892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.923937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.925247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.925289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.407 [2024-07-25 06:55:18.925333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.925577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.925592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.929859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.929912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.929955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.929993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.930656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.933722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.933767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.407 [2024-07-25 06:55:18.933815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.933855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.934090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.934136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.934180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.934225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.934268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.934499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.408 [2024-07-25 06:55:18.934513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.939587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.940019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.940034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.942998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.943251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.943266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.947608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.408 [2024-07-25 06:55:18.947654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.947704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.947743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.948674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.952680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.952728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.952770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.952808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.953504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.408 [2024-07-25 06:55:18.957405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.408 [2024-07-25 06:55:18.957863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.958222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.958239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.961553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.961603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.961641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.961679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.961923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.961974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.962017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.962054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.962092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.962329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.962344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.966419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.966464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.966502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.966540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.966948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.671 [2024-07-25 06:55:18.966998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.967039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.967079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.967118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.967524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.967544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.971925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.971970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.972696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.975594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.975640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.975679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.975718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.976080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.976129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.976172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.671 [2024-07-25 06:55:18.976210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.976248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.976519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.976533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.980603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.980649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.980687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.980724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.980958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.981007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.981045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.981083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.981124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.981523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.981537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.986763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.671 [2024-07-25 06:55:18.986994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.987008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.991905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.992172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.992188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.994501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.994547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.671 [2024-07-25 06:55:18.994585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.994623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.994854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.994907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.994952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.994990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.995028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.995264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.995278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.672 [2024-07-25 06:55:18.999198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:18.999897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.000338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.000354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.004849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.008913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.008958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.672 [2024-07-25 06:55:19.008999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.009732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.014852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.015260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.015277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.018459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.018505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.018543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.672 [2024-07-25 06:55:19.018581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.672 [2024-07-25 06:55:19.018946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same *ERROR* line from accel_dpdk_cryptodev.c:468 ("Failed to get src_mbufs!") repeats continuously between 06:55:19.018946 and 06:55:19.393648; only the timestamps differ ...]
00:39:05.940 [2024-07-25 06:55:19.393648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:39:05.940 [2024-07-25 06:55:19.393687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.940 [2024-07-25 06:55:19.393725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.940 [2024-07-25 06:55:19.394093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.940 [2024-07-25 06:55:19.394135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.940 [2024-07-25 06:55:19.394177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.394216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.394254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.394670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.394687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.396984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.397000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.398455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.398497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.398535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.398576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.941 [2024-07-25 06:55:19.398964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.399007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.399046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.399085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.399124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.399494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.399509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.401725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.402078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.402092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.403442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.403482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.403532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.403575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.403926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.403969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.941 [2024-07-25 06:55:19.404008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.404047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.404086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.404510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.404526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.406897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.407128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.407147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.408656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.408700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.408738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.408775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.409010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.409060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.409099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.409142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.941 [2024-07-25 06:55:19.409181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.409604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.409619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.411657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.411705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.411742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.411784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.412436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.413935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.413975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.414016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.414054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.414289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.414339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.941 [2024-07-25 06:55:19.414378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.414416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.414453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.414776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.942 [2024-07-25 06:55:19.414790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.417942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.418176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.418191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.419644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.419684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.419721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.419759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.419991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.420040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.420079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.420119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.420161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.420392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.420406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.422508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.942 [2024-07-25 06:55:19.422549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.422590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.422629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.422967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.423013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.423051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.423089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.423127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.423410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.423425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.424866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.424906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.424946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.424984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.425664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.427702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.427744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.427782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.942 [2024-07-25 06:55:19.427820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.428674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.430809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.432497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.432539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.432578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.432617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.942 [2024-07-25 06:55:19.432967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.942 [2024-07-25 06:55:19.433010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.433049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.433087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.433125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.433536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.433552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.434987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.435805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.943 [2024-07-25 06:55:19.437904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.437943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.438327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.438343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.440983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.442948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.943 [2024-07-25 06:55:19.443356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.443372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.445976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.943 [2024-07-25 06:55:19.447913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.447956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.448333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.448349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.944 [2024-07-25 06:55:19.450412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.450934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.451169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.451184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.452660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.452721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.452762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.452799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.453474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.455718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.455760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.944 [2024-07-25 06:55:19.455811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.457896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.459911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.460203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.460218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.462410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.462467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.463950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.944 [2024-07-25 06:55:19.464239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.464652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.466441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.466486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.466842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.466880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.467301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.467346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.467386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.467741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.467782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.468015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.468030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.471024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.471071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.472583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.472625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.472858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:05.944 [2024-07-25 06:55:19.474440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.944 [2024-07-25 06:55:19.474489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:05.944 [... identical "accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" entries repeated continuously from 06:55:19.474 to 06:55:19.785; duplicate log lines omitted ...] 
00:39:06.474 [2024-07-25 06:55:19.785518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.474 [2024-07-25 06:55:19.785821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.785866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.785910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.785948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.785986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.786274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.786290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.787851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.787892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.787931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.787970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.788386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.788430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.474 [2024-07-25 06:55:19.788480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.788519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.788557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.788977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.788993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.475 [2024-07-25 06:55:19.790952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.790991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.791030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.791276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.791292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.792652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.792695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.792734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.792773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.793803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.795728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.795768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.795806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.795856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.796086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.796126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.796176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.796220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.475 [2024-07-25 06:55:19.796258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.796488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.796502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.797946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.797986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.798953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.800969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.801761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.475 [2024-07-25 06:55:19.801775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.803777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.804100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.804115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.806962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.807000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.807038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.807274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.807293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.808774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.475 [2024-07-25 06:55:19.808818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.808855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.808893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.809128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.809183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.809223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.809261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.475 [2024-07-25 06:55:19.809300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.809735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.809750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.811774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.811814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.811863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.811914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.812565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.476 [2024-07-25 06:55:19.814166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.814881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.816972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.817760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.476 [2024-07-25 06:55:19.819632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.819989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.822945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.823183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.823198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.824694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.824738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.824775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.824813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.825045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.825098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.825137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.476 [2024-07-25 06:55:19.825180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.825217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.825447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.825461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.827433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.827478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.827836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.827888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.828547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.831369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.831423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.832901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.832948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.833186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.833237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.833278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.476 [2024-07-25 06:55:19.833639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.833678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.477 [2024-07-25 06:55:19.834095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.834110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.837437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.837486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.838946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.838988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.839343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.840768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.840821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.842328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.842381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.842614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.842629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.845040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.845089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.845456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.845503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.845820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.846195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.846239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.846594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.846645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.847062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.847078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.477 [2024-07-25 06:55:19.849477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.849540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.849903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.849945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.850321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.850688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.850747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.851113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.851476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.851850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.851865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.854324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.854690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.855050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.855415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.855766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.856132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.856495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.856861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.857225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.857564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.857579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.860098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.860474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.477 [2024-07-25 06:55:19.860848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.861212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.861581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.861939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.862304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.862667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.863029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.863445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.863462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.865898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.866265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.866634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.866993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.867356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.867724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.868084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.868448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.868807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.869204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.869219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.871670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.872033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.872402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.872765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.477 [2024-07-25 06:55:19.873156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.873522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.873880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.874243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.874605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.875002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.875017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.877554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.877923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.878290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.878662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.477 [2024-07-25 06:55:19.879071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.879455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.879824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.880184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.880539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.880955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.880971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.883362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.883728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.884087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.884452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.884866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.885232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.478 [2024-07-25 06:55:19.885603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.885960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.886339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.886693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.886708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.889385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.889754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.890115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.890475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.890871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.891240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.891603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.891967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.892331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.892677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.892692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.895086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.895452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.895814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.896180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.896545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.896915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.897284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.478 [2024-07-25 06:55:19.897642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.478 - 00:39:06.751 [2024-07-25 06:55:19.898 - 06:55:20.157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (this identical error line is repeated several hundred times in this interval; only the timestamps differ)
00:39:06.751 [2024-07-25 06:55:20.157434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.157479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.157519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.157818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.157833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.159942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.160332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.160348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.751 [2024-07-25 06:55:20.162714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.162959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.164981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.165019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.165072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.165490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.165506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.167540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.167581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.167623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.168922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.169167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.751 [2024-07-25 06:55:20.169220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.169259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.169297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.169342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.169573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.752 [2024-07-25 06:55:20.169588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.171717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.172053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.172067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.174410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.174467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.174833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.174877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.175825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.178289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.752 [2024-07-25 06:55:20.178339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.178700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.178751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.179190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.179235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.179274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.179633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.179688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.180104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.180119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.182691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.182750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.183115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.183166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.183490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.183856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.183898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.184261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.184305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.184691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.184707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.187156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.187206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.187562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.752 [2024-07-25 06:55:20.187602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.187949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.188327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.188375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.188741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.188784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.189182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.189205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.191820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.191870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.192233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.192276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.192685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.193049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.193110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.193480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.193841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.194264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.194280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.196721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.197084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.752 [2024-07-25 06:55:20.197449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.197807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.198166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.753 [2024-07-25 06:55:20.198534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.198896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.199260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.199630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.200020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.200037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.202414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.202777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.203137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.203503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.203902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.204275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.204635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.204992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.205363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.205699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.205714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.208605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.208973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.209348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.209708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.210167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.210532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.210894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.753 [2024-07-25 06:55:20.211276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.211635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.212005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.212020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.214523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.214889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.215254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.215615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.216002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.216380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.216740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.217099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.217463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.217818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.217833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.220307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.220672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.221034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.221405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.221811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.222190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.222552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.222913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.223280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.753 [2024-07-25 06:55:20.223722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.223738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.226281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.226649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.227011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.227376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.227699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.228065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.228430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.228790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.229181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.229601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.229618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.232163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.232527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.232889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.233255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.233650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.234016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.234380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.234751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.235115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.235551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.235569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.753 [2024-07-25 06:55:20.237936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.239436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.239797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.240165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.240540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.240907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.241282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.241646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.242005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.242358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.242374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.244889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.245259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.245623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.753 [2024-07-25 06:55:20.246000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.246354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.246723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.247081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.247442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.247802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.248170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.248186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.250854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.251231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.754 [2024-07-25 06:55:20.251590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.251953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.252335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.252703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.253068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.253432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.253795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.254195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.254212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.256340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.257592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.259066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.260532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.260766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.261134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.261498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.261863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.262643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.262902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.262917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.265648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.267128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.268606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.269526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.754 [2024-07-25 06:55:20.269911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.270283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.270647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.271622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.272716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.272958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.272973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.275911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.277386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.278218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.278580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.278944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.279317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.280341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.281588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.283063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.283308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.283323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.286289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.286902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.287268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.287626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.288017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.289279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:06.754 [2024-07-25 06:55:20.290530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.292000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.293487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.293837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.293851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.295691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.296057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.296422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:06.754 [2024-07-25 06:55:20.296891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.040 [2024-07-25 06:55:20.297124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.040 [2024-07-25 06:55:20.298439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.299907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.301509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.302503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.302773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.302787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.304700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.305064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.305760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.307005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.307245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.308825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.310262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.311460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:07.041 [2024-07-25 06:55:20.312697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.312932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.312946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.315219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.316088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.317329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.318781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.319016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.320332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.321666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.322907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.324342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.324575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.324589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.327721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.328966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.330431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.331898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.332236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.333764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.335183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.336721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.338292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.338625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:07.041 [2024-07-25 06:55:20.338641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.341913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.343384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.344847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.345596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.345852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.347464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.348954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.350314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.350674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.351076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.351092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.354411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:39:07.041 [2024-07-25 06:55:20.355878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:39:07.301 
00:39:07.301 Latency(us)
00:39:07.301 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:39:07.301 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x0 length 0x100
00:39:07.301 crypto_ram : 5.78 44.26 2.77 0.00 0.00 2800002.66 290245.84 2254857.83
00:39:07.301 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x100 length 0x100
00:39:07.301 crypto_ram : 5.86 43.71 2.73 0.00 0.00 2849007.21 255013.68 2402497.33
00:39:07.301 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x0 length 0x100
00:39:07.301 crypto_ram1 : 5.79 44.25 2.77 0.00 0.00 2702429.39 290245.84 2066953.01
00:39:07.301 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x100 length 0x100
00:39:07.301 crypto_ram1 : 5.86 43.70 2.73 0.00 0.00 2748606.05 255013.68 2201170.74
00:39:07.301 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x0 length 0x100
00:39:07.301 crypto_ram2 : 5.55 306.77 19.17 0.00 0.00 375557.49 23697.82 583847.12
00:39:07.301 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x100 length 0x100
00:39:07.301 crypto_ram2 : 5.60 290.24 18.14 0.00 0.00 396189.39 2018.51 593913.45
00:39:07.301 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x0 length 0x100
00:39:07.301 crypto_ram3 : 5.64 317.73 19.86 0.00 0.00 353250.09 55784.24 452984.83
00:39:07.301 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:39:07.301 Verification LBA range: start 0x100 length 0x100
00:39:07.301 crypto_ram3 : 5.67 298.70 18.67 0.00 0.00 373671.95 14260.63 347288.37
00:39:07.301 ===================================================================================================================
00:39:07.301 Total : 1389.36 86.84 0.00 0.00 687848.27 2018.51 2402497.33
00:39:07.560 
00:39:07.560 real 0m8.990s
00:39:07.560 user 0m17.016s
00:39:07.560 sys 0m0.554s
00:39:07.820 06:55:21 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:39:07.820 06:55:21 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:39:07.820 ************************************
00:39:07.820 END TEST bdev_verify_big_io
00:39:07.820 ************************************
00:39:07.820 06:55:21 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:39:07.820 06:55:21 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:39:07.820 06:55:21 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:39:07.820 06:55:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:39:07.820 ************************************
00:39:07.820 START TEST bdev_write_zeroes
00:39:07.820 ************************************
00:39:07.820 06:55:21 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:39:07.820 [2024-07-25 06:55:21.243974] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization...
00:39:07.820 [2024-07-25 06:55:21.244028] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1372330 ]
00:39:07.820 qat_pci_device_allocate(): Reached maximum number of QAT devices
00:39:07.820 EAL: Requested device 0000:3d:01.0 cannot be used
[... the same pair of messages repeats for the remaining QAT virtual functions 0000:3d:01.1 through 0000:3d:02.7 and 0000:3f:01.0 through 0000:3f:02.7 ...]
00:39:08.080 [2024-07-25 06:55:21.377984] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:39:08.080 [2024-07-25 06:55:21.421622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:39:08.080 [2024-07-25 06:55:21.442866] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:39:08.080 [2024-07-25 06:55:21.450894] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:39:08.080 [2024-07-25 06:55:21.458912] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:39:08.080 [2024-07-25 06:55:21.562028] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:39:10.619 [2024-07-25 06:55:23.884258] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:39:10.619 [2024-07-25 06:55:23.884321] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:39:10.619 [2024-07-25 06:55:23.884335] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:39:10.619 [2024-07-25 06:55:23.892277] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:39:10.619 [2024-07-25 06:55:23.892294] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:39:10.619 [2024-07-25 06:55:23.892305] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:39:10.619 [2024-07-25 06:55:23.900296] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:39:10.619 [2024-07-25 06:55:23.900313] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:39:10.619 [2024-07-25 06:55:23.900324] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:39:10.619 [2024-07-25 06:55:23.908317] vbdev_crypto_rpc.c:
115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:39:10.619 [2024-07-25 06:55:23.908333] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:39:10.619 [2024-07-25 06:55:23.908343] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:10.619 Running I/O for 1 seconds... 00:39:11.555 00:39:11.555 Latency(us) 00:39:11.555 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:11.555 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:39:11.555 crypto_ram : 1.02 2214.02 8.65 0.00 0.00 57444.01 5033.16 69625.45 00:39:11.555 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:39:11.555 crypto_ram1 : 1.02 2219.58 8.67 0.00 0.00 57006.09 5006.95 64592.28 00:39:11.555 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:39:11.555 crypto_ram2 : 1.02 17102.77 66.81 0.00 0.00 7386.77 2215.12 9699.33 00:39:11.555 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:39:11.555 crypto_ram3 : 1.02 17135.41 66.94 0.00 0.00 7351.30 2202.01 7759.46 00:39:11.555 =================================================================================================================== 00:39:11.555 Total : 38671.78 151.06 0.00 0.00 13106.40 2202.01 69625.45 00:39:11.813 00:39:11.813 real 0m4.126s 00:39:11.813 user 0m3.625s 00:39:11.813 sys 0m0.459s 00:39:11.813 06:55:25 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:39:11.813 06:55:25 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:39:11.813 ************************************ 00:39:11.813 END TEST bdev_write_zeroes 00:39:11.813 ************************************ 00:39:11.814 06:55:25 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:39:11.814 06:55:25 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:39:11.814 06:55:25 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:39:11.814 06:55:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:39:12.073 ************************************ 00:39:12.073 START TEST bdev_json_nonenclosed 00:39:12.073 ************************************ 00:39:12.073 06:55:25 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:39:12.073 [2024-07-25 06:55:25.446180] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:39:12.073 [2024-07-25 06:55:25.446235] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1373066 ] 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:12.073 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:12.073 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.073 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:12.073 [2024-07-25 06:55:25.580067] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:12.073 [2024-07-25 06:55:25.624067] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:12.073 [2024-07-25 06:55:25.624131] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:39:12.073 [2024-07-25 06:55:25.624152] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:39:12.073 [2024-07-25 06:55:25.624163] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:39:12.332 00:39:12.332 real 0m0.307s 00:39:12.332 user 0m0.153s 00:39:12.332 sys 0m0.152s 00:39:12.332 06:55:25 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:39:12.332 06:55:25 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:39:12.332 ************************************ 00:39:12.332 END TEST bdev_json_nonenclosed 00:39:12.332 ************************************ 00:39:12.332 06:55:25 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:39:12.332 06:55:25 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:39:12.332 06:55:25 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:39:12.332 06:55:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:39:12.332 ************************************ 00:39:12.332 START TEST bdev_json_nonarray 00:39:12.332 ************************************ 00:39:12.332 06:55:25 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:39:12.332 [2024-07-25 06:55:25.834493] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
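Note: bdev_json_nonenclosed above and bdev_json_nonarray starting here are negative tests. bdevperf is fed a config whose top level is not a JSON object, or whose "subsystems" member is not an array, and the expected result is exactly the "Invalid JSON configuration" error followed by a non-zero spdk_app_stop. A rough illustration of the two malformed inputs is sketched below; the contents are reconstructed for this note and may differ from the real nonenclosed.json and nonarray.json under spdk/test/bdev/.

# Reconstructed examples only; shapes chosen to trigger the two errors seen in this log.
cat > /tmp/nonenclosed.json <<'EOF'
"subsystems": [
]
EOF
cat > /tmp/nonarray.json <<'EOF'
{
  "subsystems": "not-an-array"
}
EOF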
00:39:12.333 [2024-07-25 06:55:25.834545] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1373149 ] 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.592 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:12.592 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:12.593 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:12.593 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:12.593 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:12.593 [2024-07-25 06:55:25.969243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:12.593 [2024-07-25 06:55:26.012732] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:12.593 [2024-07-25 06:55:26.012802] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:39:12.593 [2024-07-25 06:55:26.012818] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:39:12.593 [2024-07-25 06:55:26.012830] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:39:12.593 00:39:12.593 real 0m0.310s 00:39:12.593 user 0m0.167s 00:39:12.593 sys 0m0.141s 00:39:12.593 06:55:26 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:39:12.593 06:55:26 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:39:12.593 ************************************ 00:39:12.593 END TEST bdev_json_nonarray 00:39:12.593 ************************************ 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]] 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:39:12.593 06:55:26 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:39:12.593 00:39:12.593 real 1m11.068s 00:39:12.593 user 2m54.730s 00:39:12.593 sys 0m9.972s 00:39:12.593 
06:55:26 blockdev_crypto_qat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:39:12.593 06:55:26 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:39:12.593 ************************************ 00:39:12.593 END TEST blockdev_crypto_qat 00:39:12.593 ************************************ 00:39:12.853 06:55:26 -- spdk/autotest.sh@364 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:39:12.853 06:55:26 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:39:12.853 06:55:26 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:39:12.853 06:55:26 -- common/autotest_common.sh@10 -- # set +x 00:39:12.853 ************************************ 00:39:12.853 START TEST chaining 00:39:12.853 ************************************ 00:39:12.853 06:55:26 chaining -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:39:12.853 * Looking for test storage... 00:39:12.853 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@7 -- # uname -s 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00bef996-69be-e711-906e-00163566263e 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=00bef996-69be-e711-906e-00163566263e 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:39:12.853 06:55:26 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:39:12.853 06:55:26 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:39:12.853 06:55:26 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:39:12.853 06:55:26 chaining -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:39:12.853 06:55:26 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:39:12.853 06:55:26 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:39:12.853 06:55:26 chaining -- paths/export.sh@5 -- # export PATH 00:39:12.853 06:55:26 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@47 -- # : 0 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:39:12.853 06:55:26 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:39:12.853 06:55:26 chaining -- 
nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:39:12.853 06:55:26 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:39:12.853 06:55:26 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:39:12.853 06:55:26 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:39:12.853 06:55:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@296 -- # e810=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@297 -- # x722=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@298 -- # mlx=() 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.0 (0x8086 - 0x159b)' 00:39:22.838 Found 0000:20:00.0 (0x8086 - 0x159b) 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@342 
-- # [[ ice == unknown ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:39:22.838 06:55:34 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.1 (0x8086 - 0x159b)' 00:39:22.839 Found 0000:20:00.1 (0x8086 - 0x159b) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.0: cvl_0_0' 00:39:22.839 Found net devices under 0000:20:00.0: cvl_0_0 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.1: cvl_0_1' 00:39:22.839 Found net devices under 0000:20:00.1: cvl_0_1 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:39:22.839 06:55:34 chaining -- 
nvmf/common.sh@234 -- # (( 2 > 1 )) 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:39:22.839 06:55:34 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:39:22.839 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:39:22.839 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.149 ms 00:39:22.839 00:39:22.839 --- 10.0.0.2 ping statistics --- 00:39:22.839 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:39:22.839 rtt min/avg/max/mdev = 0.149/0.149/0.149/0.000 ms 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:39:22.839 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:39:22.839 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.142 ms 00:39:22.839 00:39:22.839 --- 10.0.0.1 ping statistics --- 00:39:22.839 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:39:22.839 rtt min/avg/max/mdev = 0.142/0.142/0.142/0.000 ms 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@422 -- # return 0 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:39:22.839 06:55:35 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@481 -- # nvmfpid=1377326 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:39:22.839 06:55:35 chaining -- nvmf/common.sh@482 -- # waitforlisten 1377326 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@831 -- # '[' -z 1377326 ']' 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:39:22.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:39:22.839 06:55:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.839 [2024-07-25 06:55:35.224454] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
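Note: the nvmftestinit trace above builds the point-to-point TCP test bed for the chaining test: the target-side port cvl_0_0 is moved into the cvl_0_0_ns_spdk namespace and given 10.0.0.2, the initiator-side port cvl_0_1 stays in the root namespace with 10.0.0.1, TCP port 4420 is opened in iptables, and both directions are ping-verified before the target is started inside the namespace. Condensed to its essentials (interface and namespace names as they appear in this log, error handling omitted):

TGT_IF=cvl_0_0; INI_IF=cvl_0_1; NS=cvl_0_0_ns_spdk
ip netns add "$NS"
ip link set "$TGT_IF" netns "$NS"            # target port lives in the namespace
ip addr add 10.0.0.1/24 dev "$INI_IF"        # initiator side, root namespace
ip netns exec "$NS" ip addr add 10.0.0.2/24 dev "$TGT_IF"
ip link set "$INI_IF" up
ip netns exec "$NS" ip link set "$TGT_IF" up
ip netns exec "$NS" ip link set lo up
iptables -I INPUT 1 -i "$INI_IF" -p tcp --dport 4420 -j ACCEPT
ping -c 1 10.0.0.2                           # initiator to target
ip netns exec "$NS" ping -c 1 10.0.0.1       # target to initiator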
00:39:22.839 [2024-07-25 06:55:35.224516] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:22.839 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:22.839 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.839 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:22.840 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:22.840 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:22.840 [2024-07-25 06:55:35.355893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:22.840 [2024-07-25 06:55:35.399009] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:39:22.840 [2024-07-25 06:55:35.399050] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:39:22.840 [2024-07-25 06:55:35.399064] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:39:22.840 [2024-07-25 06:55:35.399075] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:39:22.840 [2024-07-25 06:55:35.399085] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
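Note: with networking in place, nvmfappstart launches nvmf_tgt inside the namespace (NVMF_APP was prefixed with the netns exec command earlier in the trace) and waitforlisten blocks until the new process, pid 1377326 here, answers on /var/tmp/spdk.sock. Stripped of the autotest plumbing, the pattern is roughly the sketch below.

# Rough equivalent of 'nvmfappstart -m 0x2' as traced above; waitforlisten is
# an autotest helper that polls the RPC socket until the target responds.
ip netns exec cvl_0_0_ns_spdk ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 &
nvmfpid=$!
waitforlisten "$nvmfpid"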
00:39:22.840 [2024-07-25 06:55:35.399109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@864 -- # return 0 00:39:22.840 06:55:36 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.840 06:55:36 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@69 -- # mktemp 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.i5wESmRvOY 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@69 -- # mktemp 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.4JzP16HmaI 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.840 malloc0 00:39:22.840 true 00:39:22.840 true 00:39:22.840 [2024-07-25 06:55:36.212093] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:39:22.840 crypto0 00:39:22.840 [2024-07-25 06:55:36.220117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:39:22.840 crypto1 00:39:22.840 [2024-07-25 06:55:36.228238] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:39:22.840 [2024-07-25 06:55:36.244441] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@85 -- # update_stats 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:22.840 06:55:36 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:22.840 06:55:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:22.840 06:55:36 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:23.098 06:55:36 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:23.098 06:55:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:23.098 06:55:36 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.i5wESmRvOY bs=1K count=64 00:39:23.098 64+0 records in 00:39:23.098 64+0 records out 00:39:23.098 65536 bytes (66 kB, 64 KiB) copied, 0.00105837 s, 61.9 MB/s 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.i5wESmRvOY --ob Nvme0n1 --bs 65536 --count 1 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@25 -- # local config 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:39:23.098 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@31 -- # config='{ 00:39:23.098 "subsystems": [ 00:39:23.098 { 00:39:23.098 "subsystem": "bdev", 00:39:23.098 "config": [ 00:39:23.098 { 00:39:23.098 "method": "bdev_nvme_attach_controller", 00:39:23.098 "params": { 00:39:23.098 "trtype": "tcp", 
00:39:23.098 "adrfam": "IPv4", 00:39:23.098 "name": "Nvme0", 00:39:23.098 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:23.098 "traddr": "10.0.0.2", 00:39:23.098 "trsvcid": "4420" 00:39:23.098 } 00:39:23.098 }, 00:39:23.098 { 00:39:23.098 "method": "bdev_set_options", 00:39:23.098 "params": { 00:39:23.098 "bdev_auto_examine": false 00:39:23.098 } 00:39:23.098 } 00:39:23.098 ] 00:39:23.098 } 00:39:23.098 ] 00:39:23.098 }' 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.i5wESmRvOY --ob Nvme0n1 --bs 65536 --count 1 00:39:23.098 06:55:36 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:39:23.098 "subsystems": [ 00:39:23.098 { 00:39:23.098 "subsystem": "bdev", 00:39:23.098 "config": [ 00:39:23.098 { 00:39:23.098 "method": "bdev_nvme_attach_controller", 00:39:23.098 "params": { 00:39:23.098 "trtype": "tcp", 00:39:23.098 "adrfam": "IPv4", 00:39:23.098 "name": "Nvme0", 00:39:23.098 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:23.098 "traddr": "10.0.0.2", 00:39:23.098 "trsvcid": "4420" 00:39:23.098 } 00:39:23.098 }, 00:39:23.098 { 00:39:23.098 "method": "bdev_set_options", 00:39:23.098 "params": { 00:39:23.098 "bdev_auto_examine": false 00:39:23.098 } 00:39:23.098 } 00:39:23.098 ] 00:39:23.098 } 00:39:23.098 ] 00:39:23.098 }' 00:39:23.098 [2024-07-25 06:55:36.564285] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:39:23.098 [2024-07-25 06:55:36.564345] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1377521 ] 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: 
Requested device 0000:3d:02.4 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:23.098 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:23.098 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:23.356 [2024-07-25 06:55:36.699532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:23.356 [2024-07-25 06:55:36.743741] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:23.874  Copying: 64/64 [kB] (average 62 MBps) 00:39:23.874 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:23.874 
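The xtrace lines around this point step through chaining.sh's get_stat helper: it takes an event name and an optional opcode, calls accel_get_stats over RPC, and filters the JSON with jq (either the top-level .sequence_executed field or the per-opcode executed counter). A minimal reconstruction of that helper, under the assumption that it has exactly this shape; only the command names and jq filters are taken from the trace:

    # Sketch of get_stat as implied by the trace above; the real bdev/chaining.sh
    # body may differ. rpc_cmd is assumed to be the harness wrapper for scripts/rpc.py.
    get_stat() {
        local event=$1 opcode=$2 rpc=rpc_cmd
        if [[ -z $opcode ]]; then
            # top-level counter, e.g. get_stat sequence_executed
            $rpc accel_get_stats | jq -r ".$event"
        else
            # per-opcode counter, e.g. get_stat executed encrypt
            $rpc accel_get_stats | jq -r \
                ".operations[] | select(.opcode == \"$opcode\").$event"
        fi
    }

update_stats, traced a few lines earlier, simply stores these values into the stats associative array (stats["encrypt_executed"]=..., stats["decrypt_executed"]=..., and so on) so that later checks can compare against the recorded baseline.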
06:55:37 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:23.874 06:55:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:23.874 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:24.133 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@96 -- # update_stats 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@51 -- # get_stat 
sequence_executed 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:24.133 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:24.133 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:24.133 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:39:24.133 06:55:37 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:24.134 06:55:37 
chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:24.134 06:55:37 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.4JzP16HmaI --ib Nvme0n1 --bs 65536 --count 1 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@25 -- # local config 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:39:24.134 06:55:37 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:39:24.134 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:39:24.393 06:55:37 chaining -- bdev/chaining.sh@31 -- # config='{ 00:39:24.393 "subsystems": [ 00:39:24.393 { 00:39:24.393 "subsystem": "bdev", 00:39:24.393 "config": [ 00:39:24.393 { 00:39:24.393 "method": "bdev_nvme_attach_controller", 00:39:24.393 "params": { 00:39:24.393 "trtype": "tcp", 00:39:24.393 "adrfam": "IPv4", 00:39:24.393 "name": "Nvme0", 00:39:24.393 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:24.393 "traddr": "10.0.0.2", 00:39:24.393 "trsvcid": "4420" 00:39:24.393 } 00:39:24.393 }, 00:39:24.393 { 00:39:24.393 "method": "bdev_set_options", 00:39:24.393 "params": { 00:39:24.393 "bdev_auto_examine": false 00:39:24.393 } 00:39:24.393 } 00:39:24.393 ] 00:39:24.393 } 00:39:24.393 ] 00:39:24.393 }' 00:39:24.393 06:55:37 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.4JzP16HmaI --ib Nvme0n1 --bs 65536 --count 1 00:39:24.393 06:55:37 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:39:24.393 "subsystems": [ 00:39:24.393 { 00:39:24.393 "subsystem": "bdev", 00:39:24.393 "config": [ 00:39:24.393 { 00:39:24.393 "method": "bdev_nvme_attach_controller", 00:39:24.393 "params": { 00:39:24.393 "trtype": "tcp", 00:39:24.393 "adrfam": "IPv4", 00:39:24.393 "name": "Nvme0", 00:39:24.393 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:24.393 "traddr": "10.0.0.2", 00:39:24.393 "trsvcid": "4420" 00:39:24.393 } 00:39:24.393 }, 00:39:24.393 { 00:39:24.393 "method": "bdev_set_options", 00:39:24.393 "params": { 00:39:24.393 "bdev_auto_examine": false 00:39:24.393 } 00:39:24.393 } 00:39:24.393 ] 00:39:24.393 } 00:39:24.393 ] 00:39:24.393 }' 00:39:24.393 [2024-07-25 06:55:37.770808] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
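Every spdk_dd run in this test is set up the same way, as the trace just above shows: gen_nvme.sh emits a bdev subsystem config for the remote NVMe-oF target (tcp 10.0.0.2:4420, nqn.2016-06.io.spdk:cnode0), jq appends a bdev_set_options entry that turns off bdev_auto_examine, and the resulting JSON is handed to spdk_dd on a /dev/fd descriptor. A hedged sketch of that wrapper; the helper name spdk_dd_wrapped and the function shape are assumptions, while the commands and the jq filter are copied from the trace:

    # Not the literal chaining.sh source; a reconstruction of the traced setup.
    ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk

    spdk_dd_wrapped() {
        local config
        config=$("$ROOT/scripts/gen_nvme.sh" --mode=remote --json-with-subsystems \
            --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0)
        # append bdev_set_options with bdev_auto_examine=false to the bdev subsystem
        config=$(jq '.subsystems[0].config[.subsystems[0].config | length] |=
            {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' <<< "$config")
        # the process substitution shows up as /dev/fd/62 in the traced command line
        "$ROOT/build/bin/spdk_dd" -c <(echo "$config") "$@"
    }

    # e.g. spdk_dd_wrapped --of /tmp/tmp.4JzP16HmaI --ib Nvme0n1 --bs 65536 --count 1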
00:39:24.393 [2024-07-25 06:55:37.770876] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1377801 ] 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:24.393 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.393 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:24.394 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:24.394 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:24.394 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:24.394 [2024-07-25 06:55:37.906650] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:24.653 [2024-07-25 06:55:37.950232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:24.911  Copying: 64/64 [kB] (average 62 MBps) 00:39:24.911 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:24.911 06:55:38 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:24.911 06:55:38 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:24.911 06:55:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:24.911 06:55:38 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:39:25.171 06:55:38 chaining -- 
bdev/chaining.sh@102 -- # get_stat executed decrypt 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:25.171 06:55:38 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.i5wESmRvOY /tmp/tmp.4JzP16HmaI 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@25 -- # local config 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:39:25.171 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@31 -- # config='{ 00:39:25.171 "subsystems": [ 00:39:25.171 { 00:39:25.171 "subsystem": "bdev", 00:39:25.171 "config": [ 00:39:25.171 { 00:39:25.171 "method": "bdev_nvme_attach_controller", 00:39:25.171 "params": { 00:39:25.171 "trtype": "tcp", 00:39:25.171 "adrfam": "IPv4", 00:39:25.171 "name": "Nvme0", 00:39:25.171 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:25.171 "traddr": "10.0.0.2", 00:39:25.171 "trsvcid": "4420" 00:39:25.171 } 00:39:25.171 }, 00:39:25.171 { 00:39:25.171 "method": "bdev_set_options", 00:39:25.171 "params": { 00:39:25.171 "bdev_auto_examine": false 00:39:25.171 } 00:39:25.171 } 00:39:25.171 ] 00:39:25.171 } 00:39:25.171 ] 00:39:25.171 }' 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero 
--ob Nvme0n1 --bs 65536 --count 1 00:39:25.171 06:55:38 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:39:25.171 "subsystems": [ 00:39:25.171 { 00:39:25.171 "subsystem": "bdev", 00:39:25.171 "config": [ 00:39:25.171 { 00:39:25.171 "method": "bdev_nvme_attach_controller", 00:39:25.171 "params": { 00:39:25.171 "trtype": "tcp", 00:39:25.171 "adrfam": "IPv4", 00:39:25.171 "name": "Nvme0", 00:39:25.171 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:25.171 "traddr": "10.0.0.2", 00:39:25.171 "trsvcid": "4420" 00:39:25.171 } 00:39:25.171 }, 00:39:25.171 { 00:39:25.171 "method": "bdev_set_options", 00:39:25.171 "params": { 00:39:25.171 "bdev_auto_examine": false 00:39:25.171 } 00:39:25.171 } 00:39:25.171 ] 00:39:25.171 } 00:39:25.171 ] 00:39:25.171 }' 00:39:25.431 [2024-07-25 06:55:38.747892] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:39:25.431 [2024-07-25 06:55:38.747951] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1378080 ] 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.0 
cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:25.431 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:25.431 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:25.431 [2024-07-25 06:55:38.884341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:25.431 [2024-07-25 06:55:38.927854] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:25.999  Copying: 64/64 [kB] (average 31 MBps) 00:39:25.999 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@106 -- # update_stats 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:25.999 06:55:39 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:25.999 06:55:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:25.999 06:55:39 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:39:25.999 06:55:39 chaining -- 
bdev/chaining.sh@37 -- # local event opcode rpc 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:25.999 06:55:39 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:25.999 06:55:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:25.999 06:55:39 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:25.999 06:55:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:26.258 06:55:39 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:26.258 06:55:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:26.258 06:55:39 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:26.258 06:55:39 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:26.258 06:55:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:26.258 06:55:39 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.i5wESmRvOY --ob Nvme0n1 --bs 4096 --count 16 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@25 -- # local config 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:39:26.258 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@31 -- # config='{ 00:39:26.258 
"subsystems": [ 00:39:26.258 { 00:39:26.258 "subsystem": "bdev", 00:39:26.258 "config": [ 00:39:26.258 { 00:39:26.258 "method": "bdev_nvme_attach_controller", 00:39:26.258 "params": { 00:39:26.258 "trtype": "tcp", 00:39:26.258 "adrfam": "IPv4", 00:39:26.258 "name": "Nvme0", 00:39:26.258 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:26.258 "traddr": "10.0.0.2", 00:39:26.258 "trsvcid": "4420" 00:39:26.258 } 00:39:26.258 }, 00:39:26.258 { 00:39:26.258 "method": "bdev_set_options", 00:39:26.258 "params": { 00:39:26.258 "bdev_auto_examine": false 00:39:26.258 } 00:39:26.258 } 00:39:26.258 ] 00:39:26.258 } 00:39:26.258 ] 00:39:26.258 }' 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.i5wESmRvOY --ob Nvme0n1 --bs 4096 --count 16 00:39:26.258 06:55:39 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:39:26.258 "subsystems": [ 00:39:26.258 { 00:39:26.258 "subsystem": "bdev", 00:39:26.258 "config": [ 00:39:26.258 { 00:39:26.258 "method": "bdev_nvme_attach_controller", 00:39:26.258 "params": { 00:39:26.258 "trtype": "tcp", 00:39:26.258 "adrfam": "IPv4", 00:39:26.258 "name": "Nvme0", 00:39:26.258 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:26.258 "traddr": "10.0.0.2", 00:39:26.258 "trsvcid": "4420" 00:39:26.258 } 00:39:26.258 }, 00:39:26.258 { 00:39:26.258 "method": "bdev_set_options", 00:39:26.258 "params": { 00:39:26.258 "bdev_auto_examine": false 00:39:26.258 } 00:39:26.258 } 00:39:26.259 ] 00:39:26.259 } 00:39:26.259 ] 00:39:26.259 }' 00:39:26.259 [2024-07-25 06:55:39.734953] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:39:26.259 [2024-07-25 06:55:39.735015] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1378122 ] 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.259 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:26.259 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:26.517 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:26.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.517 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:26.518 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:26.518 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:26.518 [2024-07-25 06:55:39.875304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:26.518 [2024-07-25 06:55:39.918185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:27.036  Copying: 64/64 [kB] (average 10 MBps) 00:39:27.036 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:27.036 
06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:27.036 06:55:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.036 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 
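The @110 and @111 checks traced here compare counters read after the 16 x 4 KiB write against the baseline that update_stats stored earlier: in this run sequence_executed grew from 15 to 31 (one accel sequence per 4 KiB block) and the encrypt executed count grew from 4 to 36. A tiny self-contained illustration of that before/after pattern; the variable names and numbers mirror this log, the surrounding scaffolding is made up:

    # Illustration of the delta assertions used by the test; only the
    # (( observed == baseline + expected_delta )) shape comes from chaining.sh.
    declare -A stats
    stats[sequence_executed]=15   # baseline recorded by update_stats before the I/O
    stats[encrypt_executed]=4

    seq_now=31                    # counters read back after 16 x 4 KiB writes
    enc_now=36

    (( seq_now == stats[sequence_executed] + 16 )) || echo "sequence_executed mismatch"
    (( enc_now == stats[encrypt_executed] + 32 )) || echo "encrypt_executed mismatch"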
00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@114 -- # update_stats 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.295 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@40 -- # [[ 
-z copy ]] 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:27.295 06:55:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:27.296 06:55:40 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:27.296 06:55:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:27.296 06:55:40 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@117 -- # : 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.4JzP16HmaI --ib Nvme0n1 --bs 4096 --count 16 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@25 -- # local config 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:39:27.296 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@31 -- # config='{ 00:39:27.296 "subsystems": [ 00:39:27.296 { 00:39:27.296 "subsystem": "bdev", 00:39:27.296 "config": [ 00:39:27.296 { 00:39:27.296 "method": "bdev_nvme_attach_controller", 00:39:27.296 "params": { 00:39:27.296 "trtype": "tcp", 00:39:27.296 "adrfam": "IPv4", 00:39:27.296 "name": "Nvme0", 00:39:27.296 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:27.296 "traddr": "10.0.0.2", 00:39:27.296 "trsvcid": "4420" 00:39:27.296 } 00:39:27.296 }, 00:39:27.296 { 00:39:27.296 "method": "bdev_set_options", 00:39:27.296 "params": { 00:39:27.296 "bdev_auto_examine": false 00:39:27.296 } 00:39:27.296 } 00:39:27.296 ] 00:39:27.296 } 00:39:27.296 ] 00:39:27.296 }' 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.4JzP16HmaI --ib Nvme0n1 --bs 4096 --count 16 00:39:27.296 06:55:40 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:39:27.296 "subsystems": [ 00:39:27.296 { 00:39:27.296 "subsystem": "bdev", 00:39:27.296 "config": [ 00:39:27.296 { 00:39:27.296 "method": "bdev_nvme_attach_controller", 00:39:27.296 "params": { 00:39:27.296 "trtype": "tcp", 00:39:27.296 "adrfam": "IPv4", 00:39:27.296 "name": "Nvme0", 00:39:27.296 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:39:27.296 "traddr": "10.0.0.2", 00:39:27.296 "trsvcid": "4420" 00:39:27.296 } 00:39:27.296 }, 00:39:27.296 { 00:39:27.296 "method": "bdev_set_options", 00:39:27.296 "params": { 00:39:27.296 "bdev_auto_examine": false 00:39:27.296 } 00:39:27.296 } 00:39:27.296 ] 00:39:27.296 } 00:39:27.296 ] 00:39:27.296 }' 00:39:27.584 [2024-07-25 06:55:40.900212] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
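Taken together, steps @88 through @123 form a round-trip data-integrity check: a 64 KiB random file is pushed through Nvme0n1 (with the accel chain doing the encrypt/decrypt work, per the counters above), read back into a second temp file, and the two files are compared with cmp, first as a single 64 KiB transfer and then as 16 separate 4 KiB transfers. A condensed sketch of that flow, reusing the hypothetical spdk_dd_wrapped helper from the earlier sketch; file names are illustrative:

    # Condensed sketch of the write/read/verify round trip performed above.
    in_file=$(mktemp) out_file=$(mktemp)

    dd if=/dev/urandom of="$in_file" bs=1K count=64                      # 64 KiB test pattern

    spdk_dd_wrapped --if "$in_file"  --ob Nvme0n1 --bs 4096 --count 16   # write through the bdev
    spdk_dd_wrapped --of "$out_file" --ib Nvme0n1 --bs 4096 --count 16   # read the same range back

    cmp "$in_file" "$out_file" && echo "data matches after the encrypt/decrypt chain"
    rm -f "$in_file" "$out_file"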
00:39:27.584 [2024-07-25 06:55:40.900275] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1378415 ] 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:27.584 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:27.584 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:27.584 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:27.584 [2024-07-25 06:55:41.037273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:27.584 [2024-07-25 06:55:41.080498] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:28.101  Copying: 64/64 [kB] (average 719 kBps) 00:39:28.101 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:39:28.101 06:55:41 chaining -- 
bdev/chaining.sh@121 -- # get_stat executed decrypt 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:28.101 06:55:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:28.101 06:55:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.i5wESmRvOY /tmp/tmp.4JzP16HmaI 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.i5wESmRvOY /tmp/tmp.4JzP16HmaI 00:39:28.360 06:55:41 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@117 -- # sync 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@120 -- # set +e 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:39:28.360 rmmod nvme_tcp 00:39:28.360 rmmod nvme_fabrics 00:39:28.360 rmmod nvme_keyring 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@124 -- # set -e 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@125 -- # return 0 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@489 -- # '[' -n 1377326 ']' 00:39:28.360 06:55:41 chaining -- nvmf/common.sh@490 -- # killprocess 1377326 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@950 -- # '[' -z 1377326 ']' 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@954 -- # kill -0 1377326 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@955 -- # uname 
00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1377326 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1377326' 00:39:28.360 killing process with pid 1377326 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@969 -- # kill 1377326 00:39:28.360 06:55:41 chaining -- common/autotest_common.sh@974 -- # wait 1377326 00:39:28.619 06:55:42 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:39:28.619 06:55:42 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:39:28.619 06:55:42 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:39:28.619 06:55:42 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:39:28.619 06:55:42 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:39:28.619 06:55:42 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:39:28.619 06:55:42 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:39:28.619 06:55:42 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:39:31.156 06:55:44 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:39:31.156 06:55:44 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:39:31.156 06:55:44 chaining -- bdev/chaining.sh@132 -- # bperfpid=1379001 00:39:31.156 06:55:44 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1379001 00:39:31.156 06:55:44 chaining -- common/autotest_common.sh@831 -- # '[' -z 1379001 ']' 00:39:31.156 06:55:44 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:39:31.156 06:55:44 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:39:31.156 06:55:44 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:39:31.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:39:31.156 06:55:44 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:39:31.156 06:55:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:31.156 06:55:44 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:39:31.156 [2024-07-25 06:55:44.213327] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:39:31.156 [2024-07-25 06:55:44.213387] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1379001 ] 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:31.156 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:31.156 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:31.156 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:31.156 [2024-07-25 06:55:44.345967] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:31.156 [2024-07-25 06:55:44.390210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:31.724 06:55:45 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:39:31.724 06:55:45 chaining -- common/autotest_common.sh@864 -- # return 0 00:39:31.724 06:55:45 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:39:31.724 06:55:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:31.724 06:55:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:31.724 malloc0 00:39:31.724 true 00:39:31.724 true 00:39:31.724 [2024-07-25 06:55:45.212368] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:39:31.724 crypto0 00:39:31.724 [2024-07-25 06:55:45.220390] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:39:31.724 crypto1 00:39:31.724 06:55:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:31.724 06:55:45 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:39:31.983 Running I/O for 5 seconds... 
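The run above follows the pattern chaining.sh uses for every bperf pass: bdevperf is started idle with --wait-for-rpc and -z, the crypto chain is configured over its RPC socket, and bdevperf.py perform_tests then triggers the timed verify workload whose results follow. A condensed sketch of that driver loop, assuming the CI workspace path for SPDK_DIR and leaving the configuration RPCs as a placeholder:

    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk      # assumption: CI checkout path

    # Start bdevperf idle: -z keeps the app alive, --wait-for-rpc defers
    # subsystem init until the configuration RPCs below have been issued.
    "$SPDK_DIR/build/examples/bdevperf" -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    bperfpid=$!

    # ... issue the key / malloc / crypto-bdev configuration RPCs here,
    #     as chaining.sh does through its rpc_cmd helper ...

    # Kick off the timed verify workload and wait for the summary table.
    "$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" perform_tests

    kill "$bperfpid" && wait "$bperfpid"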
00:39:37.257 00:39:37.257 Latency(us) 00:39:37.257 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:37.257 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:39:37.257 Verification LBA range: start 0x0 length 0x2000 00:39:37.257 crypto1 : 5.01 12433.75 48.57 0.00 0.00 20528.25 2202.01 14260.63 00:39:37.257 =================================================================================================================== 00:39:37.257 Total : 12433.75 48.57 0.00 0.00 20528.25 2202.01 14260.63 00:39:37.257 0 00:39:37.257 06:55:50 chaining -- bdev/chaining.sh@146 -- # killprocess 1379001 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@950 -- # '[' -z 1379001 ']' 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@954 -- # kill -0 1379001 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@955 -- # uname 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1379001 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1379001' 00:39:37.257 killing process with pid 1379001 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@969 -- # kill 1379001 00:39:37.257 Received shutdown signal, test time was about 5.000000 seconds 00:39:37.257 00:39:37.257 Latency(us) 00:39:37.257 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:37.257 =================================================================================================================== 00:39:37.257 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@974 -- # wait 1379001 00:39:37.257 06:55:50 chaining -- bdev/chaining.sh@152 -- # bperfpid=1380058 00:39:37.257 06:55:50 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:39:37.257 06:55:50 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1380058 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@831 -- # '[' -z 1380058 ']' 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:39:37.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:39:37.257 06:55:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:37.257 [2024-07-25 06:55:50.783984] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:39:37.257 [2024-07-25 06:55:50.784046] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1380058 ] 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:37.517 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:37.517 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:37.517 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:37.517 [2024-07-25 06:55:50.920172] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:37.517 [2024-07-25 06:55:50.963281] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:38.455 06:55:51 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:39:38.455 06:55:51 chaining -- common/autotest_common.sh@864 -- # return 0 00:39:38.455 06:55:51 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:39:38.455 06:55:51 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:38.455 06:55:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:38.455 malloc0 00:39:38.455 true 00:39:38.456 true 00:39:38.456 [2024-07-25 06:55:51.808919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:39:38.456 [2024-07-25 06:55:51.808967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:39:38.456 [2024-07-25 06:55:51.808987] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x932c20 00:39:38.456 [2024-07-25 06:55:51.808998] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:39:38.456 [2024-07-25 06:55:51.809986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:39:38.456 [2024-07-25 06:55:51.810009] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:39:38.456 pt0 00:39:38.456 [2024-07-25 06:55:51.816947] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:39:38.456 crypto0 00:39:38.456 [2024-07-25 06:55:51.824966] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:39:38.456 crypto1 00:39:38.456 06:55:51 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:38.456 06:55:51 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:39:38.456 Running I/O for 5 seconds... 
00:39:43.729 00:39:43.729 Latency(us) 00:39:43.729 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:43.729 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:39:43.729 Verification LBA range: start 0x0 length 0x2000 00:39:43.729 crypto1 : 5.01 9699.48 37.89 0.00 0.00 26325.18 5898.24 15938.36 00:39:43.729 =================================================================================================================== 00:39:43.729 Total : 9699.48 37.89 0.00 0.00 26325.18 5898.24 15938.36 00:39:43.729 0 00:39:43.729 06:55:56 chaining -- bdev/chaining.sh@167 -- # killprocess 1380058 00:39:43.729 06:55:56 chaining -- common/autotest_common.sh@950 -- # '[' -z 1380058 ']' 00:39:43.729 06:55:56 chaining -- common/autotest_common.sh@954 -- # kill -0 1380058 00:39:43.729 06:55:56 chaining -- common/autotest_common.sh@955 -- # uname 00:39:43.729 06:55:56 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:39:43.729 06:55:56 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1380058 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1380058' 00:39:43.729 killing process with pid 1380058 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@969 -- # kill 1380058 00:39:43.729 Received shutdown signal, test time was about 5.000000 seconds 00:39:43.729 00:39:43.729 Latency(us) 00:39:43.729 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:43.729 =================================================================================================================== 00:39:43.729 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@974 -- # wait 1380058 00:39:43.729 06:55:57 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:39:43.729 06:55:57 chaining -- bdev/chaining.sh@170 -- # killprocess 1380058 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@950 -- # '[' -z 1380058 ']' 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@954 -- # kill -0 1380058 00:39:43.729 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1380058) - No such process 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@977 -- # echo 'Process with pid 1380058 is not found' 00:39:43.729 Process with pid 1380058 is not found 00:39:43.729 06:55:57 chaining -- bdev/chaining.sh@171 -- # wait 1380058 00:39:43.729 06:55:57 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:39:43.729 06:55:57 chaining 
-- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:39:43.729 06:55:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@296 -- # e810=() 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@297 -- # x722=() 00:39:43.729 06:55:57 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@298 -- # mlx=() 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.0 (0x8086 - 0x159b)' 00:39:43.730 Found 0000:20:00.0 (0x8086 - 0x159b) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:39:43.730 
06:55:57 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:20:00.1 (0x8086 - 0x159b)' 00:39:43.730 Found 0000:20:00.1 (0x8086 - 0x159b) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.0: cvl_0_0' 00:39:43.730 Found net devices under 0000:20:00.0: cvl_0_0 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:20:00.1: cvl_0_1' 00:39:43.730 Found net devices under 0000:20:00.1: cvl_0_1 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:39:43.730 06:55:57 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:39:43.990 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:39:43.990 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.152 ms 00:39:43.990 00:39:43.990 --- 10.0.0.2 ping statistics --- 00:39:43.990 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:39:43.990 rtt min/avg/max/mdev = 0.152/0.152/0.152/0.000 ms 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:39:43.990 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:39:43.990 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.179 ms 00:39:43.990 00:39:43.990 --- 10.0.0.1 ping statistics --- 00:39:43.990 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:39:43.990 rtt min/avg/max/mdev = 0.179/0.179/0.179/0.000 ms 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@422 -- # return 0 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:39:43.990 06:55:57 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@481 -- # nvmfpid=1381156 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@482 -- # waitforlisten 1381156 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@831 -- # '[' -z 1381156 ']' 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@838 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:39:43.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:39:43.990 06:55:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:43.990 06:55:57 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:39:44.250 [2024-07-25 06:55:57.597184] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:39:44.250 [2024-07-25 06:55:57.597247] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of 
QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.250 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:44.250 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:44.251 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:44.251 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:44.251 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:44.251 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:44.251 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:44.251 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:44.251 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:44.251 [2024-07-25 06:55:57.729964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:44.251 [2024-07-25 06:55:57.773299] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:39:44.251 [2024-07-25 06:55:57.773341] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:39:44.251 [2024-07-25 06:55:57.773355] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:39:44.251 [2024-07-25 06:55:57.773366] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:39:44.251 [2024-07-25 06:55:57.773376] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
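Behind the nvmftestinit trace a few lines up, the network setup is a handful of ip/iptables commands: the target-side interface (cvl_0_0) is moved into a private namespace and given 10.0.0.2/24, the initiator side (cvl_0_1) keeps 10.0.0.1/24 in the root namespace, TCP port 4420 is opened for NVMe/TCP, and connectivity is verified with a ping in each direction before nvmf_tgt is launched inside the namespace. A condensed sketch, assuming the same cvl_0_0/cvl_0_1 interface names and the SPDK_DIR path from the earlier sketch:

    NS=cvl_0_0_ns_spdk
    ip netns add "$NS"
    ip link set cvl_0_0 netns "$NS"                 # target NIC lives in the namespace
    ip addr add 10.0.0.1/24 dev cvl_0_1             # initiator side stays in the root ns
    ip netns exec "$NS" ip addr add 10.0.0.2/24 dev cvl_0_0
    ip link set cvl_0_1 up
    ip netns exec "$NS" ip link set cvl_0_0 up
    ip netns exec "$NS" ip link set lo up
    iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT   # allow NVMe/TCP in
    ping -c 1 10.0.0.2                              # root ns -> target side
    ip netns exec "$NS" ping -c 1 10.0.0.1          # target ns -> initiator side

    # The target itself then runs inside the namespace:
    ip netns exec "$NS" "$SPDK_DIR/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x2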
00:39:44.251 [2024-07-25 06:55:57.773400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@864 -- # return 0 00:39:45.630 06:55:58 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:45.630 06:55:58 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:39:45.630 06:55:58 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:45.630 malloc0 00:39:45.630 [2024-07-25 06:55:58.830759] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:39:45.630 [2024-07-25 06:55:58.846936] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:39:45.630 06:55:58 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:39:45.630 06:55:58 chaining -- bdev/chaining.sh@189 -- # bperfpid=1381429 00:39:45.630 06:55:58 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1381429 /var/tmp/bperf.sock 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@831 -- # '[' -z 1381429 ']' 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:39:45.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:39:45.630 06:55:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:45.630 06:55:58 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:39:45.630 [2024-07-25 06:55:58.916886] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 
00:39:45.630 [2024-07-25 06:55:58.916944] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1381429 ] 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:45.630 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.630 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:45.631 
qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:45.631 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:45.631 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:45.631 [2024-07-25 06:55:59.054397] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:45.631 [2024-07-25 06:55:59.098806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:46.611 06:55:59 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:39:46.611 06:55:59 chaining -- common/autotest_common.sh@864 -- # return 0 00:39:46.612 06:55:59 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:39:46.612 06:55:59 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:39:46.870 [2024-07-25 06:56:00.201328] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:39:46.870 nvme0n1 00:39:46.870 true 00:39:46.870 crypto0 00:39:46.870 06:56:00 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:39:46.870 Running I/O for 5 seconds... 
00:39:52.136 00:39:52.136 Latency(us) 00:39:52.136 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:52.136 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:39:52.136 Verification LBA range: start 0x0 length 0x2000 00:39:52.136 crypto0 : 5.02 9456.27 36.94 0.00 0.00 26990.79 3237.48 22544.38 00:39:52.136 =================================================================================================================== 00:39:52.136 Total : 9456.27 36.94 0.00 0.00 26990.79 3237.48 22544.38 00:39:52.136 0 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@205 -- # sequence=94898 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:52.136 06:56:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@206 -- # encrypt=47449 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@207 -- # decrypt=47449 
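Once the verify pass completes, the accel counters are pulled back from the bdevperf instance through its dedicated RPC socket and cross-checked. A condensed sketch of those checks, assuming the same /var/tmp/bperf.sock path and querying accel_get_stats once for brevity (the get_stat helper traced above re-queries per field):

    RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/bperf.sock"

    stats=$($RPC accel_get_stats)
    sequence=$(jq -r '.sequence_executed' <<< "$stats")
    encrypt=$(jq  -r '.operations[] | select(.opcode == "encrypt").executed' <<< "$stats")
    decrypt=$(jq  -r '.operations[] | select(.opcode == "decrypt").executed' <<< "$stats")
    crc32c=$(jq   -r '.operations[] | select(.opcode == "crc32c").executed'  <<< "$stats")

    # The workload is expected to drive one encrypt or decrypt per executed
    # accel sequence, with a matching crc32c count:
    (( sequence > 0 ))
    (( encrypt + decrypt == sequence ))
    (( encrypt + decrypt == crc32c ))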
00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:39:52.395 06:56:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:52.655 06:56:06 chaining -- bdev/chaining.sh@208 -- # crc32c=94898 00:39:52.655 06:56:06 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:39:52.655 06:56:06 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:39:52.655 06:56:06 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:39:52.655 06:56:06 chaining -- bdev/chaining.sh@214 -- # killprocess 1381429 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@950 -- # '[' -z 1381429 ']' 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@954 -- # kill -0 1381429 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@955 -- # uname 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1381429 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1381429' 00:39:52.655 killing process with pid 1381429 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@969 -- # kill 1381429 00:39:52.655 Received shutdown signal, test time was about 5.000000 seconds 00:39:52.655 00:39:52.655 Latency(us) 00:39:52.655 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:52.655 =================================================================================================================== 00:39:52.655 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:39:52.655 06:56:06 chaining -- common/autotest_common.sh@974 -- # wait 1381429 00:39:52.914 06:56:06 chaining -- bdev/chaining.sh@219 -- # bperfpid=1382756 00:39:52.914 06:56:06 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1382756 /var/tmp/bperf.sock 00:39:52.914 06:56:06 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:39:52.914 06:56:06 chaining -- common/autotest_common.sh@831 -- # '[' -z 1382756 ']' 00:39:52.914 06:56:06 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:39:52.914 06:56:06 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:39:52.914 06:56:06 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:39:52.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:39:52.914 06:56:06 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:39:52.914 06:56:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:39:52.914 [2024-07-25 06:56:06.390665] Starting SPDK v24.09-pre git sha1 d005e023b / DPDK 23.11.0 initialization... 00:39:52.914 [2024-07-25 06:56:06.390726] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1382756 ] 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.0 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.1 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.2 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.3 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.4 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.5 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.6 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:01.7 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.0 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.1 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.2 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.3 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.4 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.5 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.6 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3d:02.7 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3f:01.0 cannot be used 00:39:52.914 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:52.914 EAL: Requested device 0000:3f:01.1 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:01.2 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:01.3 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum 
number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:01.4 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:01.5 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:01.6 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:01.7 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.0 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.1 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.2 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.3 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.4 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.5 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.6 cannot be used 00:39:53.173 qat_pci_device_allocate(): Reached maximum number of QAT devices 00:39:53.173 EAL: Requested device 0000:3f:02.7 cannot be used 00:39:53.173 [2024-07-25 06:56:06.527480] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:53.173 [2024-07-25 06:56:06.571797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:53.431 06:56:06 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:39:53.431 06:56:06 chaining -- common/autotest_common.sh@864 -- # return 0 00:39:53.431 06:56:06 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:39:53.431 06:56:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:39:53.690 [2024-07-25 06:56:07.237469] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:39:53.690 nvme0n1 00:39:53.690 true 00:39:53.690 crypto0 00:39:53.949 06:56:07 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:39:53.949 Running I/O for 5 seconds... 
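This second pass drives the same crypto chain with larger I/Os, and bdevperf is started idle so it only runs once the perform_tests RPC arrives. A condensed restatement of the two commands already visible in this trace, with the Jenkins workspace paths shortened and the flag readings taken from the result-table header above rather than from bdevperf's help text:

    #   -r /var/tmp/bperf.sock : RPC socket the test talks to
    #   -t 5                   : run for 5 seconds
    #   -w verify -o 65536 -q 32 : verify workload, 64 KiB I/Os, queue depth 32
    #   --wait-for-rpc -z      : start idle; configure over RPC, then trigger the run
    ./build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z &

    # ... crypto bdev and accel configuration is sent to /var/tmp/bperf.sock here ...

    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests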
00:39:59.217 00:39:59.217 Latency(us) 00:39:59.217 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:59.217 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:39:59.217 Verification LBA range: start 0x0 length 0x200 00:39:59.217 crypto0 : 5.01 1870.92 116.93 0.00 0.00 16757.47 1874.33 19818.09 00:39:59.217 =================================================================================================================== 00:39:59.217 Total : 1870.92 116.93 0.00 0.00 16757.47 1874.33 19818.09 00:39:59.217 0 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@39 -- # opcode= 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@233 -- # sequence=18740 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:39:59.217 06:56:12 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@234 -- # encrypt=9370 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:39:59.475 06:56:12 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@235 -- # decrypt=9370 
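With the crc32c counter read next, this run is judged by the same three arithmetic checks applied after the first run (chaining.sh@210-212 in the trace). A worked restatement with the first run's counters:

    sequence=94898 encrypt=47449 decrypt=47449 crc32c=94898
    (( sequence > 0 ))                    # 94898 > 0
    (( encrypt + decrypt == sequence ))   # 47449 + 47449 = 94898
    (( encrypt + decrypt == crc32c ))     # 47449 + 47449 = 94898
    # The second run repeats the pattern: 9370 + 9370 = 18740 sequences.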
00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:39:59.734 06:56:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:39:59.993 06:56:13 chaining -- bdev/chaining.sh@236 -- # crc32c=18740 00:39:59.993 06:56:13 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:39:59.993 06:56:13 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:39:59.993 06:56:13 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:39:59.993 06:56:13 chaining -- bdev/chaining.sh@242 -- # killprocess 1382756 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@950 -- # '[' -z 1382756 ']' 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@954 -- # kill -0 1382756 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@955 -- # uname 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1382756 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1382756' 00:39:59.993 killing process with pid 1382756 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@969 -- # kill 1382756 00:39:59.993 Received shutdown signal, test time was about 5.000000 seconds 00:39:59.993 00:39:59.993 Latency(us) 00:39:59.993 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:39:59.993 =================================================================================================================== 00:39:59.993 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:39:59.993 06:56:13 chaining -- common/autotest_common.sh@974 -- # wait 1382756 00:40:00.251 06:56:13 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@117 -- # sync 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@120 -- # set +e 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:40:00.251 rmmod nvme_tcp 00:40:00.251 rmmod nvme_fabrics 00:40:00.251 rmmod nvme_keyring 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@124 -- # set -e 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@125 -- # return 0 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@489 -- # 
'[' -n 1381156 ']' 00:40:00.251 06:56:13 chaining -- nvmf/common.sh@490 -- # killprocess 1381156 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@950 -- # '[' -z 1381156 ']' 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@954 -- # kill -0 1381156 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@955 -- # uname 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1381156 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1381156' 00:40:00.251 killing process with pid 1381156 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@969 -- # kill 1381156 00:40:00.251 06:56:13 chaining -- common/autotest_common.sh@974 -- # wait 1381156 00:40:00.509 06:56:13 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:40:00.509 06:56:13 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:40:00.509 06:56:13 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:40:00.509 06:56:13 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:40:00.509 06:56:13 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:40:00.509 06:56:13 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:40:00.509 06:56:13 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:40:00.509 06:56:13 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:40:02.413 06:56:15 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:40:02.413 06:56:15 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:40:02.413 00:40:02.413 real 0m49.739s 00:40:02.413 user 0m59.078s 00:40:02.413 sys 0m13.288s 00:40:02.413 06:56:15 chaining -- common/autotest_common.sh@1126 -- # xtrace_disable 00:40:02.413 06:56:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:40:02.413 ************************************ 00:40:02.413 END TEST chaining 00:40:02.413 ************************************ 00:40:02.672 06:56:15 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:40:02.672 06:56:15 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:40:02.672 06:56:15 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:40:02.673 06:56:15 -- spdk/autotest.sh@379 -- # [[ 0 -eq 1 ]] 00:40:02.673 06:56:15 -- spdk/autotest.sh@384 -- # trap - SIGINT SIGTERM EXIT 00:40:02.673 06:56:15 -- spdk/autotest.sh@386 -- # timing_enter post_cleanup 00:40:02.673 06:56:15 -- common/autotest_common.sh@724 -- # xtrace_disable 00:40:02.673 06:56:15 -- common/autotest_common.sh@10 -- # set +x 00:40:02.673 06:56:16 -- spdk/autotest.sh@387 -- # autotest_cleanup 00:40:02.673 06:56:16 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:40:02.673 06:56:16 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:40:02.673 06:56:16 -- common/autotest_common.sh@10 -- # set +x 00:40:09.245 INFO: APP EXITING 00:40:09.245 INFO: killing all VMs 00:40:09.246 INFO: killing vhost app 00:40:09.246 INFO: EXIT DONE 00:40:13.438 Waiting for block devices as requested 00:40:13.438 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:40:13.438 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:40:13.438 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:40:13.438 0000:00:04.4 (8086 2021): vfio-pci -> 
ioatdma 00:40:13.438 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:40:13.438 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:40:13.438 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:40:13.697 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:40:13.697 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:40:13.697 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:40:13.956 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:40:13.956 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:40:13.956 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:40:14.215 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:40:14.215 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:40:14.215 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:40:14.473 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:40:19.746 Cleaning 00:40:19.746 Removing: /var/run/dpdk/spdk0/config 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:40:19.746 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:40:19.746 Removing: /var/run/dpdk/spdk0/hugepage_info 00:40:19.746 Removing: /dev/shm/nvmf_trace.0 00:40:19.746 Removing: /dev/shm/spdk_tgt_trace.pid1042183 00:40:19.746 Removing: /var/run/dpdk/spdk0 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1037206 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1040833 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1042183 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1042886 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1043880 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1044128 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1045088 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1045351 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1045640 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1049079 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1051145 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1051496 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1051935 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1052269 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1052596 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1052877 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1053160 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1053470 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1054322 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1057629 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1057821 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1058097 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1058517 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1058654 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1058723 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1059000 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1059288 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1059621 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1059975 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1060258 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1060872 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1061238 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1061527 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1061804 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1062093 00:40:19.746 Removing: 
/var/run/dpdk/spdk_pid1062370 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1062655 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1062936 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1063215 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1063502 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1063781 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1064068 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1064353 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1064636 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1064915 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1065200 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1065492 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1065804 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1066315 00:40:19.746 Removing: /var/run/dpdk/spdk_pid1066607 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1066898 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1067391 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1067718 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1067824 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1068284 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1068775 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1069169 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1069349 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1074230 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1076490 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1078558 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1079701 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1081165 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1081455 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1081512 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1081744 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1086602 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1087169 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1088486 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1088777 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1098156 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1100197 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1101203 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1106166 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1107992 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1109148 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1114191 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1116910 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1118056 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1129951 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1132609 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1133776 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1145191 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1147819 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1148981 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1160875 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1165501 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1166702 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1179510 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1182216 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1183629 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1196360 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1199895 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1201123 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1214020 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1218646 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1219833 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1221258 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1224931 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1231251 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1234690 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1240317 00:40:19.747 Removing: 
/var/run/dpdk/spdk_pid1244194 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1250587 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1253892 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1261461 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1264700 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1272085 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1274781 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1282084 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1285044 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1290289 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1290802 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1291279 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1291611 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1292215 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1293081 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1294014 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1294436 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1296787 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1299318 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1301326 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1302942 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1311190 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1316553 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1318690 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1320825 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1323010 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1324823 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1333613 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1338978 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1339786 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1340328 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1342730 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1345212 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1347461 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1348793 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1350380 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1350938 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1351127 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1351272 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1351563 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1351678 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1353088 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1355080 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1357061 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1357875 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1359475 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1359764 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1359790 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1359962 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1361133 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1361734 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1362278 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1364844 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1367151 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1369464 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1370818 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1372330 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1373066 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1373149 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1377521 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1377801 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1378080 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1378122 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1378415 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1379001 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1380058 00:40:19.747 Removing: /var/run/dpdk/spdk_pid1381429 00:40:19.747 Removing: 
/var/run/dpdk/spdk_pid1382756 00:40:19.747 Clean 00:40:19.747 06:56:33 -- common/autotest_common.sh@1451 -- # return 0 00:40:19.747 06:56:33 -- spdk/autotest.sh@388 -- # timing_exit post_cleanup 00:40:19.747 06:56:33 -- common/autotest_common.sh@730 -- # xtrace_disable 00:40:19.747 06:56:33 -- common/autotest_common.sh@10 -- # set +x 00:40:19.747 06:56:33 -- spdk/autotest.sh@390 -- # timing_exit autotest 00:40:19.747 06:56:33 -- common/autotest_common.sh@730 -- # xtrace_disable 00:40:19.747 06:56:33 -- common/autotest_common.sh@10 -- # set +x 00:40:20.005 06:56:33 -- spdk/autotest.sh@391 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:40:20.005 06:56:33 -- spdk/autotest.sh@393 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:40:20.005 06:56:33 -- spdk/autotest.sh@393 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:40:20.005 06:56:33 -- spdk/autotest.sh@395 -- # hash lcov 00:40:20.005 06:56:33 -- spdk/autotest.sh@395 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:40:20.005 06:56:33 -- spdk/autotest.sh@397 -- # hostname 00:40:20.005 06:56:33 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-19 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:40:20.005 geninfo: WARNING: invalid characters removed from testname! 00:40:46.552 06:56:59 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:40:49.878 06:57:02 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:40:52.411 06:57:05 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:40:54.316 06:57:07 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:40:56.848 06:57:10 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:40:59.385 06:57:12 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:41:01.922 06:57:15 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:41:01.922 06:57:15 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:41:01.922 06:57:15 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:41:01.922 06:57:15 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:41:01.922 06:57:15 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:41:01.922 06:57:15 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:41:01.922 06:57:15 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:41:01.922 06:57:15 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:41:01.922 06:57:15 -- paths/export.sh@5 -- $ export PATH 00:41:01.922 06:57:15 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:41:01.922 06:57:15 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:41:01.922 06:57:15 -- common/autobuild_common.sh@447 -- $ date +%s 00:41:01.922 06:57:15 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721883435.XXXXXX 00:41:01.922 06:57:15 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721883435.BpFj7z 00:41:01.922 06:57:15 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:41:01.922 06:57:15 -- common/autobuild_common.sh@453 -- $ '[' -n v23.11 ']' 00:41:01.922 06:57:15 -- common/autobuild_common.sh@454 -- $ dirname /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:41:01.922 06:57:15 -- 
common/autobuild_common.sh@454 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk' 00:41:01.922 06:57:15 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:41:01.922 06:57:15 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:41:01.922 06:57:15 -- common/autobuild_common.sh@463 -- $ get_config_params 00:41:01.922 06:57:15 -- common/autotest_common.sh@398 -- $ xtrace_disable 00:41:01.922 06:57:15 -- common/autotest_common.sh@10 -- $ set +x 00:41:01.922 06:57:15 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build' 00:41:01.922 06:57:15 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:41:01.922 06:57:15 -- pm/common@17 -- $ local monitor 00:41:01.922 06:57:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:01.922 06:57:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:01.922 06:57:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:01.922 06:57:15 -- pm/common@21 -- $ date +%s 00:41:01.922 06:57:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:01.923 06:57:15 -- pm/common@21 -- $ date +%s 00:41:01.923 06:57:15 -- pm/common@21 -- $ date +%s 00:41:01.923 06:57:15 -- pm/common@25 -- $ sleep 1 00:41:01.923 06:57:15 -- pm/common@21 -- $ date +%s 00:41:01.923 06:57:15 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721883435 00:41:01.923 06:57:15 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721883435 00:41:01.923 06:57:15 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721883435 00:41:01.923 06:57:15 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721883435 00:41:01.923 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721883435_collect-vmstat.pm.log 00:41:01.923 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721883435_collect-cpu-temp.pm.log 00:41:01.923 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721883435_collect-cpu-load.pm.log 00:41:01.923 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721883435_collect-bmc-pm.bmc.pm.log 00:41:02.860 06:57:16 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:41:02.860 
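The coverage steps earlier in this epilogue follow a standard lcov pipeline: capture this run's data, merge it with the baseline, then strip sources that are not part of SPDK itself. A condensed sketch of those commands, with the long --rc lcov_*/genhtml_* option block elided and $SPDK / $OUT used as placeholder paths for the repo and output directories:

    lcov --no-external -q -c -d "$SPDK" -t spdk-wfp-19 -o "$OUT/cov_test.info"        # capture this run
    lcov -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"  # merge with baseline
    for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
        lcov -q -r "$OUT/cov_total.info" "$pat" -o "$OUT/cov_total.info"              # drop non-SPDK sources
    done
    rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR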
06:57:16 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:41:02.860 06:57:16 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:41:02.860 06:57:16 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:41:02.860 06:57:16 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:41:02.860 06:57:16 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:41:02.860 06:57:16 -- spdk/autopackage.sh@19 -- $ timing_finish 00:41:02.860 06:57:16 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:41:02.860 06:57:16 -- common/autotest_common.sh@737 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:41:02.860 06:57:16 -- common/autotest_common.sh@739 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:41:02.860 06:57:16 -- spdk/autopackage.sh@20 -- $ exit 0 00:41:02.860 06:57:16 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:41:02.860 06:57:16 -- pm/common@29 -- $ signal_monitor_resources TERM 00:41:02.860 06:57:16 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:41:02.860 06:57:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:02.860 06:57:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:41:02.860 06:57:16 -- pm/common@44 -- $ pid=1396497 00:41:02.860 06:57:16 -- pm/common@50 -- $ kill -TERM 1396497 00:41:02.860 06:57:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:02.860 06:57:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:41:02.860 06:57:16 -- pm/common@44 -- $ pid=1396499 00:41:02.860 06:57:16 -- pm/common@50 -- $ kill -TERM 1396499 00:41:02.860 06:57:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:02.860 06:57:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:41:02.860 06:57:16 -- pm/common@44 -- $ pid=1396501 00:41:02.860 06:57:16 -- pm/common@50 -- $ kill -TERM 1396501 00:41:02.860 06:57:16 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:41:02.860 06:57:16 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:41:02.860 06:57:16 -- pm/common@44 -- $ pid=1396523 00:41:02.860 06:57:16 -- pm/common@50 -- $ sudo -E kill -TERM 1396523 00:41:02.860 + [[ -n 865125 ]] 00:41:02.860 + sudo kill 865125 00:41:02.869 [Pipeline] } 00:41:02.886 [Pipeline] // stage 00:41:02.891 [Pipeline] } 00:41:02.906 [Pipeline] // timeout 00:41:02.911 [Pipeline] } 00:41:02.922 [Pipeline] // catchError 00:41:02.927 [Pipeline] } 00:41:02.939 [Pipeline] // wrap 00:41:02.944 [Pipeline] } 00:41:02.954 [Pipeline] // catchError 00:41:02.961 [Pipeline] stage 00:41:02.962 [Pipeline] { (Epilogue) 00:41:02.971 [Pipeline] catchError 00:41:02.973 [Pipeline] { 00:41:02.982 [Pipeline] echo 00:41:02.983 Cleanup processes 00:41:02.986 [Pipeline] sh 00:41:03.264 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:41:03.264 1396605 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:41:03.264 1396945 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:41:03.276 [Pipeline] sh 00:41:03.588 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:41:03.588 ++ grep -v 'sudo pgrep' 00:41:03.588 ++ awk '{print $1}' 
00:41:03.588 + sudo kill -9 1396605 00:41:03.598 [Pipeline] sh 00:41:03.879 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:41:03.880 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:41:11.997 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:41:17.275 [Pipeline] sh 00:41:17.554 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:41:17.554 Artifacts sizes are good 00:41:17.568 [Pipeline] archiveArtifacts 00:41:17.574 Archiving artifacts 00:41:17.735 [Pipeline] sh 00:41:18.018 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:41:18.032 [Pipeline] cleanWs 00:41:18.041 [WS-CLEANUP] Deleting project workspace... 00:41:18.041 [WS-CLEANUP] Deferred wipeout is used... 00:41:18.047 [WS-CLEANUP] done 00:41:18.049 [Pipeline] } 00:41:18.069 [Pipeline] // catchError 00:41:18.080 [Pipeline] sh 00:41:18.361 + logger -p user.info -t JENKINS-CI 00:41:18.369 [Pipeline] } 00:41:18.386 [Pipeline] // stage 00:41:18.391 [Pipeline] } 00:41:18.408 [Pipeline] // node 00:41:18.414 [Pipeline] End of Pipeline 00:41:18.451 Finished: SUCCESS